WO2022188497A1 - Handling Robot, Handling System and Method for Generating Prompt Information - Google Patents

Handling Robot, Handling System and Method for Generating Prompt Information

Info

Publication number
WO2022188497A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
target object
prompt
side wall
visual
Prior art date
Application number
PCT/CN2021/138720
Other languages
English (en)
French (fr)
Inventor
张硕
Original Assignee
灵动科技(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 灵动科技(北京)有限公司
Publication of WO2022188497A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075Constructional features or details
    • B66F9/07504Accessories, e.g. for towing, charging, locking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F17/00Safety devices, e.g. for limiting or indicating lifting force
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F17/00Safety devices, e.g. for limiting or indicating lifting force
    • B66F17/003Safety devices, e.g. for limiting or indicating lifting force for fork-lift trucks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B66HOISTING; LIFTING; HAULING
    • B66FHOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075Constructional features or details
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Definitions

  • the present application relates to the field of robotics, and in particular, to a handling robot, a handling system and a method for generating prompt information.
  • Handling robots are industrial robots that can perform automated handling operations.
  • Handling operation refers to using a device to hold or support an object and move it from one location to another.
  • For example, in a warehouse handling scenario in which a handling robot cooperates with a forklift, the driver controls the forklift to lift the pallet supporting the object and load it onto the handling robot, after which the handling robot transports the object to the designated location.
  • However, while the driver is controlling the forklift to carry an object, the driver's line of sight is easily blocked by the forked object, and it is difficult to see the relatively low handling robot (typically about 20-40 cm in height).
  • the present application provides a handling robot, a handling system and a method for generating prompt information that solve the above problems or at least partially solve the above problems. specifically,
  • a handling robot includes:
  • a sensing device arranged on the body, for sensing environmental information around the body
  • a controller electrically connected to the sensing device, for acquiring the environmental information through the sensing device when the handling robot is in a waiting-for-loading state, and for outputting a prompt instruction when it is determined, based on the environmental information, that a target object exists within a set range around the body;
  • a prompting device electrically connected to the controller, for outputting, according to the prompt instruction, prompt information perceivable by a person;
  • the prompt information reflects the position of the body, so as to provide a reference for the target object to perform the loading action.
  • a handling system includes:
  • the handling robot described above, used to travel autonomously to carry objects, and a manually driven transport vehicle, used to load objects onto the handling robot;
  • wherein, when the handling robot is in the waiting-for-loading state and senses the manually driven transport vehicle, it outputs prompt information perceivable by a person, so as to guide the driver to drive the manually driven transport vehicle to complete the object loading action.
  • a method for generating prompt information is provided, which is applied to a handling robot.
  • the method includes:
  • when the handling robot is in the waiting-for-loading state, acquiring environmental information around the body of the handling robot; determining, according to the environmental information, whether there is a target object within a set range around the body; and, when it is determined that the target object exists, controlling the prompting device on the body to output prompt information perceivable by a person;
  • the prompt information reflects the position where the transport robot is located, so as to provide a reference for the target object to perform the loading action.
  • the sensing device on the body of the handling robot enables the handling robot to have the ability to sense information about the surrounding environment of the body;
  • a controller electrically connected to the sensing device can obtain the environmental information through the sensing device, and output a prompt instruction when it is determined based on the environmental information that a target object exists within a set range around the body;
  • according to the prompt instruction, the prompting device electrically connected to the controller can output prompt information that a person can perceive and that reflects the position of the body, so as to provide a reference for the target object to perform the loading action, reducing the incidence of the handling robot being collided with and damaged, with a simple structure and low cost.
  • FIG. 1 is a schematic structural diagram of a handling robot according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a handling system provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of two light rays projected on the ground output by a visual signal output unit in a handling robot according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of two columns of arrow patterns projected on the ground by a visual signal output unit in a handling robot according to an embodiment of the present application;
  • FIG. 5 is a schematic flowchart of a method for generating prompt information according to an embodiment of the present application.
  • FIG. 1 shows a schematic structural diagram of a handling robot provided by an embodiment of the present application.
  • As shown in FIG. 1, the handling robot includes: a body 10, a sensing device 20, a controller (not shown in the figure) and a prompting device 30; wherein,
  • the sensing device 20 is arranged on the body 10 and is used for sensing environmental information around the body 10 .
  • a controller, electrically connected to the sensing device 20, for acquiring the environmental information through the sensing device 20 when the handling robot is in the waiting-for-loading state, and for outputting a prompt instruction when it is determined, based on the environmental information, that a target object exists within the set range around the body.
  • a prompting device 30 electrically connected to the controller, for outputting prompting information according to the prompting instruction;
  • the prompt information reflects the position of the body, so as to provide a reference for the target object to perform the loading action.
  • various positions around the body 10 can be divided into: front side, rear side, left side, and right side.
  • a corresponding space above the top of the body may also be included around the body 10 .
  • the position of the sensing device 20 on the body 10 can be designed according to actual operation requirements.
  • the sensing device 20 may be disposed on the side wall and the top of the body 10 .
  • the side wall of the body 10 may include: a front side wall, a rear side wall, a left side wall and a right side wall.
  • For example, when the handling robot is in the waiting-for-loading state, it stops at a fixed position and remains stationary in a fixed posture; in this case, the sensing device 20 only needs to be provided on one side wall of the handling robot, such as the front side wall.
  • the sensing device 20 can also be installed on both the front side wall and the rear side wall of the body 10 .
  • For another example, when the handling robot is in the waiting-for-loading state, it travels towards the target object (such as the fork tines) to a position convenient for the target object to perform loading;
  • while travelling, the handling robot needs to acquire the surrounding environmental information; in this case, sensing devices may be provided on the front side wall, the rear side wall, the left side wall and the right side wall of the body 10.
  • A sensing device 20 is provided on the top of the body 10 for use when the target object loads the object it has lifted onto the handling robot; assuming the sensing device is a distance sensor, it can sense the distance of the object from the top of the body while the loading action is performed.
  • The embodiment of FIG. 1 shows the case where the sensing device 20 is provided on the front side wall, the rear side wall, the right side wall and the left side wall of the body 10.
  • One or more sensing devices 20 may be disposed on the top of the body.
  • the sensing device 20 is provided at the central position of the top of the body 10 ; and/or the sensing device 20 is provided at the central position of at least one side of the top of the body 10 .
  • the sensing device 20 is provided at the central position O of the top of the body 10 ; the sensing devices 20 are respectively provided at the central positions B1 and B2 corresponding to the two opposite sides of the top of the body 10 .
  • the sensing device 20 is also provided at the other two opposite sides of the top of the body 10 corresponding to the central positions C1 and C2. It should be noted here that the number of the sensing devices provided on the top and/or the side wall of the body 10 shown in FIG. 1 is only schematic, and does not represent the actual number.
  • In some embodiments, the sensing device 20 is a distance sensor; when used to sense environmental information around the body 10, the distance sensor is specifically used to sense the distance of a target object around the body 10 relative to the body 10.
  • the distance sensor may measure the distance of the target object relative to the body 10 by transmitting signals and receiving reflected signals.
  • the distance sensor can be any sensor that can realize the distance measurement of objects, such as but not limited to: acoustic wave sensor, infrared sensor, and lidar.
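  • As a concrete illustration of the transmit-and-receive measurement mentioned above, the following is a minimal Python sketch, not part of the patent, that converts a time-of-flight reading into a distance; the function name and the assumption of an acoustic (ultrasonic) sensor are illustrative only.

      SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at about 20 degrees C

      def distance_from_time_of_flight(round_trip_seconds: float) -> float:
          """Return the one-way distance in metres for an acoustic echo.

          The pulse travels to the target and back, so the round-trip time is
          halved before multiplying by the propagation speed.
          """
          return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0

      # Example: an echo after 5.8 ms corresponds to roughly 1 m to the target.
      print(distance_from_time_of_flight(0.0058))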
  • the sensing device 20 can also be a visual sensor.
  • the visual sensor is used to collect image information, and then the distance and/or orientation of the target object relative to the body 10 is obtained through image analysis and calculation.
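  • The patent does not spell out how that image analysis is done; one common way to recover a rough distance and bearing from a single camera is the pinhole-camera relation shown in the Python sketch below. It assumes the target (for example a forklift marker of known physical width) has already been detected as a bounding box; the calibration numbers and parameter names are assumptions made for the example.

      import math

      def estimate_distance_and_bearing(
          box_center_x_px: float,
          box_width_px: float,
          image_width_px: float = 1280.0,
          focal_length_px: float = 900.0,   # assumed camera focal length in pixels
          target_width_m: float = 1.2,      # assumed real width of the detected target
      ) -> tuple[float, float]:
          """Estimate target distance (m) and bearing (degrees) from a detected box.

          Pinhole model: pixel_width = focal_length * real_width / distance, so
          distance = focal_length * real_width / pixel_width; the bearing follows
          from the horizontal offset of the box centre from the image centre.
          """
          distance_m = focal_length_px * target_width_m / box_width_px
          offset_px = box_center_x_px - image_width_px / 2.0
          bearing_rad = math.atan2(offset_px, focal_length_px)
          return distance_m, math.degrees(bearing_rad)

      # Example: a detection 200 px wide, centred 100 px right of the image centre.
      print(estimate_distance_and_bearing(740.0, 200.0))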
  • the controller in this embodiment may be a central processing unit (CPU), a single-chip microcomputer, or the like with data processing and computing capabilities.
  • the controller can be set at any position of the body 10 according to the actual situation, which is not limited here.
  • The controller is electrically connected to the sensing device 20; when the handling robot is in the waiting-for-loading state, it can acquire the environmental information around the body 10 through the sensing device 20, and output a corresponding prompt instruction when it determines, based on the environmental information, that a target object exists within the set range around the body 10.
  • In some embodiments, the above target object may be a manually driven transport vehicle, such as a forklift, that loads objects onto the handling robot; the handling robot being in the waiting-for-loading state may mean that the handling robot is moving towards the target object that is waiting to load objects onto it, or that the handling robot is waiting for that target object to move towards it, which is not limited here.
  • The prompting device 30 is electrically connected to the controller; after the controller outputs the prompt instruction, the prompting device 30 can output, according to the prompt instruction, prompt information perceivable by a person, such as a prompt sound or visual information.
  • the visual information may include, but is not limited to, at least one of the following: visible light rays, visible light curtains, visible projection patterns, and the like.
  • the above-mentioned prompt information can reflect the position where the body 10 is located, so as to provide a reference for the target object to perform the loading action.
  • The loading action may include: the target object moving closer to the body 10 so as to reach a suitable loading position, and the target object loading the object onto the body 10. It should be noted that the above prompt information can reflect not only the position of the body 10 but also the posture of the body 10.
  • In the handling robot provided by this embodiment, a sensing device is provided on the body, so that the handling robot has the ability to sense the environmental information around the body;
  • when the handling robot is in the waiting-for-loading state, the controller electrically connected to the sensing device can acquire the environmental information through the sensing device, and output a prompt instruction when it determines, based on the environmental information, that a target object exists within a set range around the body;
  • according to the prompt instruction, the prompting device electrically connected to the controller can output prompt information that a person can perceive and that reflects the position of the body, so as to provide a reference for the target object to perform the loading action. This gives the handling robot a position-prompting function, reduces the incidence of the handling robot being collided with and damaged, and keeps the structure simple and the cost low.
  • the prompting device 30 in this embodiment includes a speaker (not shown in the figure) and/or a visual signal output unit (not shown in the figure).
  • the speaker is used to output the prompt sound
  • the number of speakers may be one or more, and their positions on the body 10 can be set flexibly according to the actual situation; neither the number of speakers nor their positions on the body are limited here.
  • the visual signal output unit is used to output visual information, which may include but is not limited to at least one of the following: visible light rays (the light ray 301 indicated by the thick black line in FIG. 1, the light ray 301 directed above the body in FIG. 2, and the light ray 301 projected on the ground in FIG. 3), a visible light curtain (not shown in the drawings), and a visible projection pattern (the arrow pattern 302 shown in FIG. 4).
  • the visual signal output unit 410 may be, but not limited to, one or a combination of a laser light, a projection device, and a visible light curtain emitter.
  • If the visual signal output unit 410 is a laser lamp, the visual information it outputs is a visible laser ray; the laser ray may, for example, be presented as a red line, or take other forms, which is not limited here.
  • If the visual signal output unit 410 is a projection device, such as a projector, the visual information it outputs is a visible projection pattern, which may be, but is not limited to, a pattern such as an arrow; if the visual signal output unit 410 is a visible light curtain emitter, the visual information it outputs is a visible light curtain, that is, a light curtain composed of multiple parallel visible rays (such as visible infrared rays).
  • the visual signal output unit may be disposed on the top and/or the side wall of the body 10 .
  • at least one visual signal output unit is provided on the top of the body 10 to output first visual information directed upward; the first visual information can guide the target object to load the object onto the body 10, and can also guide the target object to move closer to the body 10 so as to reach a suitable loading position; and/or, at least one visual signal output unit is provided on the side wall or at the top edge of the body 10 to output second visual information projected on the ground, and the second visual information can guide the target object to move closer to the body 10 so as to reach a suitable loading position.
  • The handling robot may be provided with a visual signal output unit that outputs the light ray 301 shown in FIG. 2, directed upward from the top of the body, and a visual signal output unit that outputs the light ray 301 projected on the ground shown in FIG. 3 or the arrow pattern 302 shown in FIG. 4.
  • The upward light ray 301 can prompt the target object to perform the action of loading an object onto the handling robot; the light rays or patterns projected on the ground can indicate to the target object the direction in which to travel towards the handling robot.
  • In a specific implementation, the visual signal output unit is provided at the central position of the top of the body 10; for example, a visual signal output unit (not shown in the figure) may be provided at the central position O of the top of the body 10 shown in FIG. 1.
  • And/or, the visual signal output unit is provided at the central position of at least one side of the top of the body 10; for example, a visual signal output unit may be provided at each of the central positions B1 and B2 corresponding to the two opposite sides of the top of the body 10 shown in FIG. 1, and a visual signal output unit may also be provided at each of the central positions C1 and C2 corresponding to the other two opposite sides of the top.
  • And/or, the visual signal output unit is provided at at least one corner of the top of the body 10; for example, a visual signal output unit is provided at each of the positions corresponding to the four corners of the top of the body 10 shown in FIG. 1, namely position D1, position D2, position D3 and position D4.
  • In the examples shown in FIG. 3 and FIG. 4, the second visual information projected on the ground on the two corresponding sides of the body 10 by the visual signal output units disposed at the central positions of two opposite sides of the top of the body 10 (such as the two light rays 301 in FIG. 3 and the two sequentially arranged groups of arrow patterns 302 in FIG. 4) is axisymmetric about the body 10, and the width D between the two projected pieces of visual information is greater than the width d of the target object.
  • The controller in this embodiment is configured to determine the distance and/or orientation of the target object relative to the body 10 according to the environmental information, and to output the corresponding prompt instruction according to that distance and/or orientation, so as to control the prompting device 30 to output, as the distance and/or orientation changes, at least one of the following kinds of prompt information: warning sounds of different frequencies and/or volumes, prompt voices with different content, visible light rays of different brightness and/or color, visible light curtains of different brightness and/or color, and visible projection patterns of different brightness and/or pattern, so as to provide a reference for the target object to perform the loading action.
  • For example, if the visual signal output unit in the prompting device 30 is a laser lamp: when it is determined that the target object is relatively far from the body 10, the controller can control the speaker in the prompting device 30 to periodically output a low-volume warning sound, and/or control the visual signal output unit to output low-brightness, light-colored visible light;
  • when it is determined that the target object is relatively close to the body 10, the controller can control the speaker to output a warning sound whose frequency and/or volume increases in steps (for example, similar to the warning sound of a reversing radar), and/or control the visual signal output unit to output high-brightness, dark-colored visible light;
  • when it is determined that the distance of the target object from the body 10 has reached a set distance suitable for the target object to load the object onto the body 10, the controller can control the speaker to output a warning sound such as a long continuous beep, or control the speaker to output a corresponding voice announcement, and/or control the visual signal unit on the top of the body to output high-brightness, dark-colored visible light, and so on.
  • For example, assume the target object is a forklift; as shown in FIG. 2 and FIG. 3, after the forklift lifts an object, the driver drives the forklift towards the position of the handling robot so as to move to a suitable loading position.
  • While the forklift moves closer to the handling robot, the sensing devices 20 on the side walls of the body 10 of the handling robot acquire the environmental information around the body 10 and send it to the controller; the controller can determine the distance and/or orientation of the forklift relative to the body 10 from the environmental information and, when it determines from the distance and/or orientation that the forklift is not yet in the loading position, control the speaker to output corresponding prompt sounds according to the distance and/or orientation, and/or control the visual signal output units on the top and/or the side walls of the body 10 to output corresponding visual information, so as to guide the forklift to the loading position.
  • For example, taking the case where the speaker is controlled to output corresponding prompt sounds according to the distance, so as to guide the forklift to the loading position: if it is determined that the distance of the forklift from the body 10 is greater than a first distance (e.g. 10 m or 5 m), the controller can control the speaker to output a low-frequency, low-volume first warning sound; based on the first warning sound heard, the driver can judge that the forklift is still relatively far from the body 10 and may drive it forward at a relatively fast speed.
  • If the distance of the forklift from the body 10 is smaller than the first distance but greater than a second distance (e.g. 2 m or 1 m), the controller can control the speaker to output a second warning sound whose frequency and/or volume increases in steps; based on the second warning sound heard, the driver can judge that the forklift is relatively close to the body 10 and, to avoid colliding with the handling robot, drive it forward at a relatively slow speed.
  • When the distance of the forklift from the body 10 reaches a third distance (e.g. 0.5 m), the controller can control the speaker to output a louder, long-beeping third warning sound; based on the third warning sound, the driver can judge that the forklift has reached a loading position suitable for loading the object onto the body 10, stop the forklift, and begin loading the object onto the body 10.
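  • The three-tier behaviour described above maps naturally onto a small piece of threshold logic. The following Python sketch illustrates one possible reading of it; the 5 m / 1 m / 0.5 m thresholds are only the example values mentioned in the text, and the warning-sound descriptions are placeholders rather than anything prescribed by the patent.

      FIRST_DISTANCE_M = 5.0    # example first distance from the text (10 m is also mentioned)
      SECOND_DISTANCE_M = 1.0   # example second distance (2 m is also mentioned)
      THIRD_DISTANCE_M = 0.5    # example third (loading) distance

      def warning_sound_for(distance_m: float) -> str:
          """Describe the warning sound for a sensed forklift distance."""
          if distance_m > FIRST_DISTANCE_M:
              return "first warning sound: low frequency, low volume (forklift still far away)"
          if distance_m > SECOND_DISTANCE_M:
              return "second warning sound: frequency and volume increasing in steps"
          if distance_m > THIRD_DISTANCE_M:
              return "second warning sound: fastest steps, close to the loading position"
          return "third warning sound: loud continuous beep (loading position reached)"

      for d in (8.0, 3.0, 0.8, 0.4):
          print(f"{d:>4} m -> {warning_sound_for(d)}")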
  • While the driver controls the forklift to load the object onto the body 10, the controller can acquire the environmental information above the top of the body 10 through the sensing device on the top of the body 10, determine the distance of the object relative to the body 10 based on that environmental information, and then, based on that distance, output the corresponding prompt instruction to control the speaker to output the corresponding prompt sound, so as to guide the forklift driver to load the object onto the body 10.
  • For the specific guiding process, reference may be made to the process described above of guiding the forklift to a suitable loading position, which will not be repeated here.
  • In the case where the controller controls the speaker to output corresponding prompt sounds according to differences in distance and orientation, the output may be prompt voices with different content, for example voice prompts such as "there is still a distance of 1 m" or "shift a little to the left".
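  • As an illustration of how such voice prompts could be composed from the sensed distance and orientation, here is a small Python sketch. It is not taken from the patent; the thresholds, the left/right convention and the wording of the messages are assumptions made for the example.

      def voice_prompt(distance_m: float, bearing_deg: float) -> str:
          """Compose a spoken prompt from the forklift's distance and lateral bearing.

          bearing_deg is the forklift's angle off the loading axis of the body:
          negative means it is offset to the left, positive to the right.
          """
          parts = []
          if distance_m > 0.5:
              parts.append(f"there is still a distance of {distance_m:.1f} m")
          else:
              parts.append("loading position reached, please stop")
          if bearing_deg > 5.0:
              parts.append("shift a little to the left")
          elif bearing_deg < -5.0:
              parts.append("shift a little to the right")
          return ", ".join(parts)

      print(voice_prompt(1.0, 8.0))   # there is still a distance of 1.0 m, shift a little to the left
      print(voice_prompt(0.3, 0.0))   # loading position reached, please stop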
  • Similarly, the visual signal output unit can also be controlled to output corresponding visual information according to differences in distance and/or orientation, so as to guide the driver, through the visual information, to control the forklift to move to the loading position and/or to load the object onto the body; for the specific guiding principle, reference may be made to the above process of guiding based on prompt sounds, which will not be described in detail here.
  • In summary, the handling robot provided in this embodiment has a position-prompting function realized through auditory and/or visual means, which can effectively reduce the incidence of the handling robot being collided with and damaged, and has a simple structure and low cost.
  • FIG. 5 shows a schematic flowchart of a method for generating prompt information provided by an embodiment of the present application.
  • the prompt information generation method is applied to a handling robot.
  • the prompt information generation method includes:
  • when the handling robot is in the waiting-for-loading state, acquiring environmental information around the body of the handling robot; determining, according to the environmental information, whether there is a target object within a set range around the body; and, when it is determined that the target object exists, controlling the prompting device on the body to output prompt information perceivable by a person;
  • wherein the prompt information reflects the position of the handling robot, so as to provide a reference for the target object to perform the loading action.
  • The target object may be a manually driven transport vehicle 40 (as shown in FIG. 2, FIG. 3 and FIG. 4) that loads objects onto the handling robot, such as a forklift.
  • The above-mentioned handling robot being in the waiting-for-loading state may refer to a state in which the handling robot is moving towards the target object that is waiting to load objects onto it, or to a state in which the handling robot is waiting for that target object to move towards it, which is not limited here.
  • the setting range around the body is related to the sensing range of the sensing device provided on the body.
  • step 103 "controlling the prompting device on the body to output prompting information perceivable by the human body” may specifically include the following steps:
  • control the prompting device to output corresponding prompt information that can be sensed by the human body.
  • In one implementable technical solution, the above prompting device includes a visual signal output unit; accordingly, step 1032, "controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person", may be implemented through the following steps:
  • A11. According to the distance and/or orientation, determine whether the target object is already in the loading position;
  • A12. When the target object is not yet in the loading position, control the visual signal output unit on the top and/or the side wall of the body to output visual information, so as to guide the target object to move to the loading position;
  • A13. When the target object is in the loading position, control the visual signal output unit on the top of the body to output visual information, so as to guide the target object to load the object onto the body.
  • the visual signal output unit may be, but not limited to, a laser lamp, a projection device, and a visible light curtain emitter.
  • the visual information may be, but is not limited to, at least one of the following: visible light rays, a visible light curtain, and a visible projection pattern; wherein the visible light curtain is a light curtain composed of multiple parallel visible rays (such as visible infrared rays).
  • When the target object is not yet in the loading position, the visual signal output unit on the top and/or the side wall of the body can be controlled, as the distance and/or orientation varies, to output at least one of the following kinds of visual information: visible light rays of different brightness and/or color, visible light curtains of different brightness and/or color, and visible projection patterns of different brightness and/or pattern, so as to guide the target object to move to the loading position.
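  • As an illustration of the kind of mapping this paragraph describes, the short Python sketch below varies the brightness and colour of the output light with distance; the specific ranges and colour names are assumptions for the example, not values given in the application.

      def light_for_distance(distance_m: float, far_m: float = 5.0) -> tuple[float, str]:
          """Return (brightness 0..1, colour name) for the visual signal output unit.

          A far target gets dim, light-coloured output; a near target gets bright,
          dark-coloured output, in the spirit of the embodiment described above.
          """
          if distance_m >= far_m:
              return 0.2, "light green"
          brightness = 0.2 + 0.8 * (1.0 - distance_m / far_m)  # rises as the target approaches
          colour = "deep red" if distance_m <= 1.0 else "amber"
          return min(brightness, 1.0), colour

      for d in (6.0, 3.0, 0.8):
          print(d, light_for_distance(d))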
  • Likewise, when the target object is in the loading position, the visual signal output unit on the top of the body can also be controlled in the above manner to output visual information, so as to guide the target object to load the object onto the body.
  • In another implementable technical solution, the above prompting device further includes a speaker; in this case, for the specific implementation of step 1032, "controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person", reference may be made to the implementation steps described above for the case where the prompting device includes a visual signal output unit, which will not be repeated here.
  • The only difference is that what the speaker outputs is a prompt sound: when the target object is not yet in the loading position, the speaker can be controlled, as the distance and/or orientation varies, to output at least one of the following prompt sounds: warning sounds of different frequencies and/or volumes, and prompt voices with different content.
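  • Steps A11 to A13 above, together with the speaker variant just described, amount to a small state decision followed by driving the appropriate output unit. The Python sketch below is one possible way to organise that decision, offered only as an illustration; the interfaces (PromptOutputs, tolerances, messages) are hypothetical and not defined by the patent.

      from dataclasses import dataclass

      @dataclass
      class PromptOutputs:
          """Hypothetical handles to the prompting device on the robot."""
          def ground_guidance(self, distance_m: float, bearing_deg: float) -> None:
              print(f"side/top-edge units: project guide lines on the ground "
                    f"({distance_m:.2f} m, {bearing_deg:.1f} deg)")

          def top_loading_guidance(self) -> None:
              print("top unit: upward light to guide placing the object on the body")

          def speak(self, message: str) -> None:
              print(f"speaker: {message}")

      def in_loading_position(distance_m: float, bearing_deg: float,
                              max_distance_m: float = 0.5,
                              max_bearing_deg: float = 5.0) -> bool:
          # A11: decide from distance and orientation whether the target is in position
          return distance_m <= max_distance_m and abs(bearing_deg) <= max_bearing_deg

      def update_prompts(outputs: PromptOutputs, distance_m: float, bearing_deg: float) -> None:
          if in_loading_position(distance_m, bearing_deg):
              outputs.top_loading_guidance()                      # A13
              outputs.speak("loading position reached")
          else:
              outputs.ground_guidance(distance_m, bearing_deg)    # A12
              outputs.speak(f"there is still a distance of {distance_m:.1f} m")

      update_prompts(PromptOutputs(), 2.4, 12.0)
      update_prompts(PromptOutputs(), 0.4, 1.0)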
  • In the technical solution provided by this embodiment, when the handling robot is in the waiting-for-loading state and it is determined, based on the acquired environmental information around the body of the handling robot, that there is a target object within a set range around the body, the prompting device on the body is controlled to output prompt information that a person can perceive and that reflects the position of the handling robot, so as to provide a reference for the target object to perform the loading action.
  • the technical solution can effectively avoid collision of the target object and damage to the transport robot, and the solution is simple and low in cost.
  • the execution subject of each step in the method provided in this embodiment may be a controller on a handling robot.
  • For content not described in detail in each step, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here.
  • the method provided in this embodiment may also include some or all of the other steps in the above embodiments. For details, please refer to the corresponding contents of the above embodiments, which will not be repeated here.
  • the handling system includes:
  • a handling robot, specifically as the body 10 shown in FIG. 2, used to travel autonomously to carry objects;
  • a manually driven transport vehicle 40, used to load objects onto the handling robot;
  • wherein, when the handling robot is in the waiting-for-loading state and senses the manually driven transport vehicle 40, it outputs prompt information perceivable by a person, so as to guide the driver to drive the manually driven transport vehicle 40 to complete the object loading action.
  • The manually driven transport vehicle 40 may be a forklift, or of course another type of transport vehicle, which is not limited here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Transportation (AREA)
  • Civil Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An embodiment of the present application provides a handling robot, a handling system and a method for generating prompt information. The handling robot includes: a body; a sensing device, arranged on the body and used to sense environmental information around the body; a controller, electrically connected to the sensing device and used to acquire the environmental information through the sensing device when the handling robot is in a waiting-for-loading state, and to output a prompt instruction when it is determined, based on the environmental information, that a target object exists within a set range around the body; and a prompting device, electrically connected to the controller and used to output, according to the prompt instruction, prompt information perceivable by a person; wherein the prompt information reflects the position of the body, so as to provide a reference for the target object to perform a loading action. The technical solutions provided by the embodiments of the present application can reduce the incidence of the handling robot being collided with and damaged, and have a simple structure and low cost.

Description

Handling Robot, Handling System and Method for Generating Prompt Information
Cross-Reference
This application refers to Chinese Patent Application No. 202110272328.9, entitled "Handling Robot, Handling System and Method for Generating Prompt Information", filed on March 12, 2021, which is incorporated into this application by reference in its entirety.
Technical Field
The present application relates to the field of robotics, and in particular to a handling robot, a handling system and a method for generating prompt information.
Background
Handling robots are industrial robots capable of performing automated handling operations. A handling operation refers to using a device to hold or support an object and move it from one location to another; for example, in a warehouse handling scenario in which a handling robot cooperates with a forklift, the driver controls the forklift to lift the pallet supporting the object and load it onto the handling robot, after which the handling robot transports the object to a designated location.
However, while controlling the forklift to carry an object, the driver's line of sight is easily blocked by the forked object, and it is difficult to see the relatively low handling robot (typically about 20-40 cm high). In actual operation there is a risk of the forklift colliding with and damaging the handling robot, and the driver also cannot judge whether the forklift's current position is suitable for loading the object onto the handling robot.
Summary of the Invention
The present application provides a handling robot, a handling system and a method for generating prompt information that solve the above problems or at least partially solve the above problems. Specifically,
In one embodiment of the present application, a handling robot is provided. The handling robot includes:
a body;
a sensing device, arranged on the body and used to sense environmental information around the body;
a controller, electrically connected to the sensing device and used to acquire the environmental information through the sensing device when the handling robot is in a waiting-for-loading state, and to output a prompt instruction when it is determined, based on the environmental information, that a target object exists within a set range around the body;
a prompting device, electrically connected to the controller and used to output, according to the prompt instruction, prompt information perceivable by a person;
wherein the prompt information reflects the position of the body, so as to provide a reference for the target object to perform a loading action.
In one embodiment of the present application, a handling system is provided. The system includes:
the handling robot described above, used to travel autonomously to carry objects;
a manually driven transport vehicle, used to load objects onto the handling robot;
wherein, when the handling robot is in the waiting-for-loading state and senses the manually driven transport vehicle, it outputs prompt information perceivable by a person, so as to guide the driver to drive the manually driven transport vehicle to complete the object loading action.
In one embodiment of the present application, a method for generating prompt information is provided, applied to a handling robot. The method includes:
when the handling robot is in the waiting-for-loading state, acquiring environmental information around the body of the handling robot;
determining, according to the environmental information, whether there is a target object within a set range around the body;
when it is determined that the target object exists, controlling a prompting device on the body to output prompt information perceivable by a person;
wherein the prompt information reflects the position of the handling robot, so as to provide a reference for the target object to perform the loading action.
In the technical solutions provided by the embodiments of the present application, the sensing device on the body of the handling robot gives the handling robot the ability to sense environmental information around the body; when the handling robot is in the waiting-for-loading state, the controller electrically connected to the sensing device can acquire the environmental information through the sensing device and output a prompt instruction when it determines, based on the environmental information, that a target object exists within a set range around the body; according to the prompt instruction, the prompting device electrically connected to the controller can output prompt information that a person can perceive and that reflects the position of the body, providing a reference for the target object to perform the loading action, reducing the incidence of the handling robot being collided with and damaged, with a simple structure and low cost.
Brief Description of the Drawings
In order to describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a handling robot according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a handling system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of two light rays projected on the ground, output by a visual signal output unit of a handling robot according to an embodiment of the present application;
FIG. 4 is a schematic diagram of two columns of arrow patterns projected on the ground, output by a visual signal output unit of a handling robot according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of a method for generating prompt information according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application.
Some of the flows described in the specification, claims and drawings of the present application include multiple operations appearing in a specific order, but these operations may be performed out of the order in which they appear herein or in parallel. Operation numbers such as 101 and 102 are used only to distinguish different operations, and the numbers themselves do not represent any order of execution. In addition, these flows may include more or fewer operations, and the operations may be performed in sequence or in parallel. It should be noted that the term "or/and" in this application merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A or/and B" may mean that A exists alone, that A and B exist at the same time, or that B exists alone. The character "/" in this application generally indicates an "or" relationship between the associated objects. Furthermore, the embodiments described below are only some, not all, of the embodiments of the present application; all other embodiments obtained by those skilled in the art based on the embodiments in the present application without creative effort fall within the protection scope of the present application.
FIG. 1 shows a schematic structural diagram of a handling robot provided by an embodiment of the present application. As shown in FIG. 1, the handling robot includes: a body 10, a sensing device 20, a controller (not shown in the figure) and a prompting device 30; wherein,
the sensing device 20 is arranged on the body 10 and is used to sense environmental information around the body 10;
the controller is electrically connected to the sensing device 20 and is used to acquire the environmental information through the sensing device 20 when the handling robot is in the waiting-for-loading state, and to output a prompt instruction when it is determined, based on the environmental information, that a target object exists within a set range around the body;
the prompting device 30 is electrically connected to the controller and is used to output prompt information according to the prompt instruction;
wherein the prompt information reflects the position of the body, so as to provide a reference for the target object to perform the loading action.
Along the travel direction of the handling robot, the surroundings of the body 10 can be divided into: the front side, the rear side, the left side and the right side. The surroundings of the body 10 may also include the space above the top of the body. In specific implementations, the position of the sensing device 20 on the body 10 can be designed according to actual operation requirements; for example, sensing devices 20 may be provided both on the side walls and on the top of the body 10.
For ease of description, according to the above orientations, the side walls of the body 10 may include: a front side wall, a rear side wall, a left side wall and a right side wall. For example, when the handling robot is in the waiting-for-loading state, it stops at a fixed position and remains stationary in a fixed posture; in this case the sensing device 20 only needs to be provided on one side wall of the handling robot, such as the front side wall. Of course, sensing devices 20 may also be installed on both the front side wall and the rear side wall of the body 10. For another example, when the handling robot is in the waiting-for-loading state, it travels towards the target object (such as the fork tines) to a position convenient for the target object to perform loading; while travelling, the handling robot needs to acquire the surrounding environmental information, so sensing devices may be provided on the front side wall, the rear side wall, the left side wall and the right side wall of the body 10.
A sensing device 20 is provided on the top of the body 10 for use when the target object loads the object it has lifted onto the handling robot. Assuming that the sensing device is a distance sensor, it can sense the distance of the target object from the top of the body while the loading action is performed.
The embodiment shown in FIG. 1 shows the case where the sensing device 20 is provided on the front side wall, the rear side wall, the right side wall and the left side wall of the body 10.
One or more sensing devices 20 may be provided on the top of the body. A sensing device 20 is provided at the central position of the top of the body 10; and/or a sensing device 20 is provided at the central position of at least one side of the top of the body 10. For example, as shown in FIG. 1, a sensing device 20 is provided at the central position O of the top of the body 10; sensing devices 20 are provided at the central positions B1 and B2 corresponding to two opposite sides of the top of the body 10, and sensing devices 20 are also provided at the central positions C1 and C2 corresponding to the other two opposite sides of the top of the body 10. It should be noted here that the number of sensing devices provided on the top and/or the side walls of the body 10 shown in FIG. 1 is only schematic and does not represent the actual number.
In some embodiments of the present application, the sensing device 20 is a distance sensor; when used to sense environmental information around the body 10, the distance sensor is specifically used to sense the distance of a target object around the body 10 relative to the body 10. In some embodiments, the distance sensor may measure the distance of the target object relative to the body 10 by transmitting signals and receiving reflected signals. The distance sensor may be any sensor capable of measuring the distance of objects, such as, but not limited to: an acoustic wave sensor, an infrared sensor, a lidar, etc.
Besides a distance sensor, the sensing device 20 may also be a visual sensor. The visual sensor is used to collect image information, and the distance and/or orientation of the target object relative to the body 10 is then obtained through image analysis and computation.
The controller in this embodiment may be a central processing unit (CPU), a single-chip microcomputer, or the like, with data processing and computing capabilities. The controller can be arranged at any position of the body 10 according to the actual situation, which is not limited here. The controller is electrically connected to the sensing device 20; when the handling robot is in the waiting-for-loading state, it can acquire the environmental information around the body 10 through the sensing device 20 and output a corresponding prompt instruction when it determines, based on the environmental information, that a target object exists within the set range around the body 10. In some embodiments, the above target object may be a manually driven transport vehicle, such as a forklift, that loads objects onto the handling robot; the handling robot being in the waiting-for-loading state may mean that the handling robot is moving towards the target object that is waiting to load objects onto it, or that the handling robot is waiting for that target object to move towards it, which is not limited here.
After the controller outputs the prompt instruction, the prompting device 30 electrically connected to the controller can output, according to the prompt instruction, prompt information perceivable by a person, such as a prompt sound or visual information. The visual information may include, but is not limited to, at least one of the following: visible light rays, a visible light curtain, a visible projection pattern, and the like.
The above prompt information can reflect the position of the body 10, so as to provide a reference for the target object to perform the loading action. The loading action may include: the target object moving closer to the body 10 so as to reach a suitable loading position, and the target object loading the object onto the body 10. It should be noted that, in addition to reflecting the position of the body 10, the above prompt information can also reflect the posture of the body 10.
In the handling robot provided by this embodiment, a sensing device is provided on the body, giving the handling robot the ability to sense environmental information around the body; when the handling robot is in the waiting-for-loading state, the controller electrically connected to the sensing device can acquire the environmental information through the sensing device and output a prompt instruction when it determines, based on the environmental information, that a target object exists within a set range around the body; according to the prompt instruction, the prompting device electrically connected to the controller can output prompt information that a person can perceive and that reflects the position of the body, so as to provide a reference for the target object to perform the loading action. This gives the handling robot a position-prompting function, reduces the incidence of the handling robot being collided with and damaged, and keeps the structure simple and the cost low.
Further, still referring to FIG. 1, the prompting device 30 in this embodiment includes a speaker (not shown in the figure) and/or a visual signal output unit (not clearly shown in the figure). The speaker is used to output prompt sounds; the number of speakers may be one or more, and their positions on the body 10 can be set flexibly according to the actual situation; neither the number of speakers nor their positions on the body are limited here. The visual signal output unit is used to output visual information, which may include, but is not limited to, at least one of the following: visible light rays (the light ray 301 indicated by the thick black line in FIG. 1, the light ray 301 directed above the body in FIG. 2, and the light ray 301 projected on the ground in FIG. 3), a visible light curtain (not shown in the drawings), and a visible projection pattern (the arrow pattern 302 shown in FIG. 4).
Accordingly, in some embodiments, the visual signal output unit 410 may be, but is not limited to, one or a combination of a laser lamp, a projection device and a visible light curtain emitter. In specific implementations, if the visual signal output unit 410 is a laser lamp, the visual information it outputs is a visible laser ray; the laser ray may, for example, be presented as a red line, or take other forms, which is not limited here. If the visual signal output unit 410 is a projection device, such as a projector, the visual information it outputs is a visible projection pattern, which may be, but is not limited to, a pattern such as an arrow. If the visual signal output unit 410 is a visible light curtain emitter, the visual information it outputs is a visible light curtain, that is, a light curtain composed of multiple parallel visible rays (such as visible infrared rays).
In this embodiment, the visual signal output unit may be arranged on the top and/or the side wall of the body 10. Specifically, in one implementable technical solution, at least one visual signal output unit is provided on the top of the body 10 to output first visual information directed upward; the first visual information can guide the target object to load the object onto the body 10, and can also guide the target object to move closer to the body 10 so as to reach a suitable loading position. And/or, at least one visual signal output unit is provided on the side wall or at the top edge of the body 10 to output second visual information projected on the ground; the second visual information can guide the target object to move closer to the body 10 so as to reach a suitable loading position.
The handling robot may be provided with a visual signal output unit that outputs the light ray 301 shown in FIG. 2, directed upward from the top of the body, as well as a visual signal output unit that outputs the light ray 301 projected on the ground shown in FIG. 3 or the arrow pattern 302 shown in FIG. 4. The upward light ray 301 can prompt the target object to perform the action of loading an object onto the handling robot; the light rays or patterns projected on the ground can indicate to the target object the direction in which to travel towards the handling robot.
In specific implementations, the visual signal output unit is provided at the central position of the top of the body 10; for example, a visual signal output unit (not shown in the figure) may be provided at the central position O of the top of the body 10 shown in FIG. 1. And/or, the visual signal output unit is provided at the central position of at least one side of the top of the body 10; for example, a visual signal output unit may be provided at each of the central positions B1 and B2 corresponding to the two opposite sides of the top of the body 10 shown in FIG. 1, and a visual signal output unit may also be provided at each of the central positions C1 and C2 corresponding to the other two opposite sides of the top. And/or, the visual signal output unit is provided at the position of at least one corner of the top of the body 10; for example, a visual signal output unit is provided at each of the positions corresponding to the four corners of the top of the body 10 shown in FIG. 1, namely positions D1, D2, D3 and D4.
It should be noted here that the number of visual signal output units arranged on the body 10 shown in FIG. 1 is only schematic and does not represent the actual number.
In the examples shown in FIG. 3 and FIG. 4, the second visual information projected on the ground on the two corresponding sides of the body 10 by the visual signal output units arranged at the central positions of two opposite sides of the top of the body 10 (such as the two light rays 301 in FIG. 3 and the two sequentially arranged groups of arrow patterns 302 in FIG. 4) is axisymmetric about the body 10, and the width D between the two projected pieces of visual information is greater than the width d of the target object.
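As a side note on how such a projection might be set up, the lateral offset of a beam projected on the ground follows directly from the mounting height and tilt of the emitter. The short Python sketch below, which is not part of the patent, checks under assumed numbers that the spacing D between the two projected guide lines exceeds the target width d.

      import math

      def ground_offset_m(mount_height_m: float, tilt_from_vertical_deg: float) -> float:
          """Horizontal distance from the emitter to the point where its beam hits the ground."""
          return mount_height_m * math.tan(math.radians(tilt_from_vertical_deg))

      # Assumed example: emitters at the two side edges of a 0.9 m wide, 0.3 m tall body,
      # each tilted 60 degrees outwards from the vertical.
      body_width_m = 0.9
      emitter_height_m = 0.3
      tilt_deg = 60.0

      spacing_D_m = body_width_m + 2.0 * ground_offset_m(emitter_height_m, tilt_deg)
      target_width_d_m = 1.2  # assumed width d of the target object (e.g. a forklift)

      print(f"projected spacing D = {spacing_D_m:.2f} m, target width d = {target_width_d_m:.2f} m")
      print("D > d:", spacing_D_m > target_width_d_m)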
Based on the above, in one implementable technical solution, the controller in this embodiment is used to determine the distance and/or orientation of the target object relative to the body 10 according to the environmental information, and to output the corresponding prompt instruction according to the distance and/or orientation, so as to control the prompting device 30 to output, as the distance and/or orientation changes, at least one of the following kinds of prompt information: warning sounds of different frequencies and/or volumes, prompt voices with different content, visible light rays of different brightness and/or color, visible light curtains of different brightness and/or color, and visible projection patterns of different brightness and/or pattern, so as to provide a reference for the target object to perform the loading action.
For example, if the visual signal output unit in the prompting device 30 is a laser lamp: when it is determined that the target object is relatively far from the body 10, the controller can control the speaker in the prompting device 30 to periodically output a low-volume warning sound, and/or control the visual signal output unit to output low-brightness, light-colored visible light; or, when it is determined that the target object is relatively close to the body 10, the controller can control the speaker to output a warning sound whose frequency and/or volume increases in steps (for example, similar to the warning sound of a reversing radar), and/or control the visual signal output unit to output high-brightness, dark-colored visible light; or, when it is determined that the distance of the target object from the body 10 has reached a set distance suitable for the target object to load the object onto the body 10, the controller can control the speaker to output a warning sound such as a long continuous beep, or control the speaker to output a corresponding voice announcement, and/or control the visual signal unit on the top of the body to output high-brightness, dark-colored visible light, and so on.
A specific application scenario is given below to explain in detail how the above prompt information provides a reference for the target object to perform the loading action.
For example, assume the target object is a forklift. As shown in FIG. 2 and FIG. 3, after the forklift lifts an object, the driver drives the forklift towards the position of the handling robot so as to move to a suitable loading position. While the forklift moves closer to the handling robot, the sensing devices 20 on the side walls of the body 10 of the handling robot acquire the environmental information around the body 10 and send it to the controller; the controller can determine the distance and/or orientation of the forklift relative to the body 10 from the environmental information and, when it determines from the distance and/or orientation that the forklift is not yet in the loading position, control the speaker to output corresponding prompt sounds as the distance and/or orientation changes, and/or control the visual signal output units on the top and/or the side walls of the body 10 to output corresponding visual information, so as to guide the forklift to the loading position. For example, taking the case where the speaker is controlled to output corresponding prompt sounds according to the distance so as to guide the forklift to the loading position: if it is determined that the distance of the forklift from the body 10 is greater than a first distance (e.g. 10 m or 5 m), the controller can control the speaker to output a low-frequency, low-volume first warning sound; based on the first warning sound heard, the driver can judge that the forklift is still relatively far from the body 10 and may drive it forward at a relatively fast speed. If the distance of the forklift from the body 10 is smaller than the first distance but greater than a second distance (e.g. 2 m or 1 m), the controller can control the speaker to output a second warning sound whose frequency and/or volume increases in steps; based on the second warning sound heard, the driver can judge that the forklift is relatively close to the body 10 and, to avoid colliding with the handling robot, drive it forward at a relatively slow speed. When the distance of the forklift from the body 10 reaches a third distance (e.g. 0.5 m), the controller can control the speaker to output a louder, long-beeping third warning sound; based on the third warning sound, the driver can judge that the forklift has reached a loading position suitable for loading the object onto the body 10, stop the forklift, and begin loading the object onto the body 10. While the driver controls the forklift to load the object onto the body 10, the controller can acquire the environmental information above the top of the body 10 through the sensing device on the top of the body 10, determine the distance of the object relative to the body 10 based on that environmental information, and output corresponding prompt instructions based on that distance to control the speaker to output corresponding prompt sounds, thereby guiding the forklift driver to load the object onto the body 10; for the specific guiding process, reference may be made to the above-described process of guiding the forklift to a suitable loading position, which is not repeated here. In the case where the controller controls the speaker to output corresponding prompt sounds according to differences in distance and orientation, the output may be prompt voices with different content, for example voice prompts such as "there is still a distance of 1 m" or "shift a little to the left".
Similarly, the visual signal output unit can also be controlled to output corresponding visual information according to differences in distance and/or orientation, so as to guide the driver, through the visual information, to control the forklift to move to the loading position and/or to load the object onto the body; for the specific guiding principle, reference may be made to the above process of guiding based on prompt sounds, which is not repeated here.
In summary, the handling robot provided in this embodiment has a position-prompting function realized through auditory and/or visual means, which can effectively reduce the incidence of the handling robot being collided with and damaged, and has a simple structure and low cost.
FIG. 5 shows a schematic flowchart of a method for generating prompt information provided by an embodiment of the present application. The method is applied to a handling robot. As shown in FIG. 5, the method includes:
101. When the handling robot is in the waiting-for-loading state, acquiring environmental information around the body of the handling robot;
102. Determining, according to the environmental information, whether there is a target object within a set range around the body;
103. When it is determined that the target object exists, controlling the prompting device on the body to output prompt information perceivable by a person;
wherein the prompt information reflects the position of the handling robot, so as to provide a reference for the target object to perform the loading action.
In 101 above, for the specific structure and functions of the handling robot, reference may be made to the corresponding content related to FIG. 1 above, which is not repeated here. The target object may be a manually driven transport vehicle 40 (as shown in FIG. 2, FIG. 3 and FIG. 4) that loads objects onto the handling robot, such as a forklift. In practical applications, when the target object loads an object onto the handling robot, there are generally two situations: in one, the handling robot stays still, and after the target object lifts the object it moves towards the position of the handling robot and, upon reaching a suitable position, performs the action of loading the object onto the handling robot; in the other, the target object lifts the object and waits while the handling robot keeps moving towards and approaching it, and when the handling robot reaches a suitable position the target object performs the action of loading the object onto the handling robot. Accordingly, the handling robot being in the waiting-for-loading state may refer to a state in which the handling robot is moving towards the target object that is waiting to load objects onto it, or to a state in which the handling robot is waiting for that target object to move towards it, which is not limited here. When the handling robot is in the waiting-for-loading state, the environmental information around the body of the handling robot can be acquired through the sensing device arranged on the body of the handling robot.
In step 102 above, the set range around the body is related to the sensing range of the sensing device provided on the body.
Step 103 above, "controlling the prompting device on the body to output prompt information perceivable by a person", may specifically include the following steps:
1031. Determining, according to the environmental information, the distance and/or orientation of the target object relative to the body;
1032. Controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person.
In one specific implementable technical solution, the above prompting device includes a visual signal output unit; accordingly, step 1032, "controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person", may specifically be implemented through the following steps:
A11. Determining, according to the distance and/or orientation, whether the target object is already in the loading position;
A12. When the target object is not yet in the loading position, controlling the visual signal output unit on the top and/or the side wall of the body to output visual information, so as to guide the target object to move to the loading position;
A13. When the target object is in the loading position, controlling the visual signal output unit on the top of the body to output visual information, so as to guide the target object to load the object onto the body.
In specific implementations, the visual signal output unit may be, but is not limited to, a laser lamp, a projection device or a visible light curtain emitter. Accordingly, the visual information may be, but is not limited to, at least one of the following: visible light rays, a visible light curtain, a visible projection pattern; wherein the visible light curtain is a light curtain composed of multiple parallel visible rays (such as visible infrared rays). When the target object is not yet in the loading position, the visual signal output unit on the top and/or the side wall of the body can be controlled, as the distance and/or orientation varies, to output at least one of the following kinds of visual information: visible light rays of different brightness and/or color, visible light curtains of different brightness and/or color, and visible projection patterns of different brightness and/or pattern, so as to guide the target object to move to the loading position. Likewise, when the target object is in the loading position, the visual signal output unit on the top of the body can also be controlled in the above manner to output visual information, so as to guide the target object to load the object onto the body.
In another specific implementable technical solution, the above prompting device further includes a speaker; in this case, for the specific implementation of step 1032, "controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person", reference may be made to the implementation steps described above for the case where the prompting device includes a visual signal output unit, which is not repeated here. The only difference is that what the speaker outputs is a prompt sound: when the target object is not yet in the loading position, the speaker can be controlled, as the distance and/or orientation varies, to output at least one of the following prompt sounds: warning sounds of different frequencies and/or volumes, and prompt voices with different content.
In the technical solution provided by this embodiment, when the handling robot is in the waiting-for-loading state and it is determined, based on the acquired environmental information around the body of the handling robot, that there is a target object within a set range around the body, the prompting device on the body is controlled to output prompt information that a person can perceive and that reflects the position of the handling robot, thereby providing a reference for the target object to perform the loading action. This technical solution can effectively prevent the target object from colliding with and damaging the handling robot, and the solution is simple and low in cost.
It should be noted here that the execution subject of each step in the method provided in this embodiment may be the controller on a handling robot. In addition, for content not described in detail in each step, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. Furthermore, in addition to the above steps, the method provided in this embodiment may also include some or all of the other steps in the above embodiments; for details, reference may be made to the corresponding content of the above embodiments, which is not repeated here.
Another embodiment of the present application provides a handling system. Referring to FIG. 2, FIG. 3 and FIG. 4, the handling system includes:
a handling robot, specifically as the body 10 shown in FIG. 2, used to travel autonomously to carry objects;
a manually driven transport vehicle 40, used to load objects onto the handling robot;
wherein, when the handling robot is in the waiting-for-loading state and senses the manually driven transport vehicle 40, it outputs prompt information perceivable by a person, so as to guide the driver to drive the manually driven transport vehicle 40 to complete the object loading action.
For the specific structure and functions of the handling robot, reference may be made to the corresponding content related to FIG. 1 above, which is not repeated here. The manually driven transport vehicle 40 may be a forklift, or of course another type of transport vehicle, which is not limited here.
It should be noted here that for content not described in detail in the handling system provided in this embodiment, reference may be made to the corresponding content in the foregoing embodiments, which is not repeated here. Furthermore, in addition to the above, the handling system provided in this embodiment may also include some or all of the other features in the above embodiments; for details, reference may be made to the corresponding content of the above embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (15)

  1. A handling robot, comprising:
    a body;
    a sensing device, arranged on the body and used to sense environmental information around the body;
    a controller, electrically connected to the sensing device and used to acquire the environmental information through the sensing device when the handling robot is in a waiting-for-loading state, and to output a prompt instruction when it is determined, based on the environmental information, that a target object exists within a set range around the body;
    a prompting device, electrically connected to the controller and used to output, according to the prompt instruction, prompt information perceivable by a person;
    wherein the prompt information reflects the position of the body, so as to provide a reference for the target object to perform a loading action.
  2. The handling robot according to claim 1, wherein the prompting device comprises a speaker and/or a visual signal output unit; wherein
    the speaker is used to output a prompt sound;
    the visual signal output unit is used to output visual information;
    the visual information comprises at least one of the following: visible light rays, a visible light curtain, and a visible projection pattern.
  3. The handling robot according to claim 2, wherein
    the controller is used to determine the distance and/or orientation of the target object relative to the body according to the environmental information, and to output the corresponding prompt instruction according to the distance and/or orientation, so as to control the prompting device to output, as the distance and/or orientation changes, at least one of the following kinds of information:
    warning sounds of different frequencies and/or volumes;
    prompt voices with different content;
    visible light rays of different brightness and/or color;
    visible light curtains of different brightness and/or color;
    visible projection patterns of different brightness and/or pattern.
  4. The handling robot according to claim 2, wherein
    at least one said visual signal output unit is provided on the top of the body to output first visual information directed upward; and/or
    at least one said visual signal output unit is provided on the side wall or at the top edge of the body to output second visual information projected on the ground.
  5. The handling robot according to claim 4, wherein
    the visual signal output unit is provided at the central position of the top of the body; and/or
    the visual signal output unit is provided at the central position of at least one side of the top of the body; and/or
    the visual signal output unit is provided at the position of at least one corner of the top of the body.
  6. The handling robot according to claim 5, wherein the second visual information projected on the ground on the two corresponding sides of the body by the visual signal output units provided at the central positions of two opposite sides of the top of the body is axisymmetric about the body, and the width between the two projected pieces of second visual information is greater than the width of the target object.
  7. The handling robot according to any one of claims 2 to 6, wherein the visual signal output unit is a laser lamp, a projection device or a visible light curtain emitter.
  8. The handling robot according to any one of claims 1 to 6, wherein the sensing device is provided on both the top and the side wall of the body.
  9. The handling robot according to claim 8, wherein, along the travel direction of the handling robot, the side walls of the body comprise a front side wall, a rear side wall, a right side wall and a left side wall;
    the sensing device is provided on the front side wall of the body; or
    the sensing device is provided on both the front side wall and the rear side wall of the body; or
    the sensing device is provided on the front side wall, the rear side wall, the left side wall and the right side wall of the body.
  10. The handling robot according to claim 8, wherein the sensing device is provided at the central position of the top of the body; and/or
    the sensing device is provided at the central position of at least one side of the top of the body.
  11. The handling robot according to claim 1, wherein the sensing device is a distance sensor.
  12. A handling system, comprising:
    the handling robot according to any one of claims 1 to 11, used to travel autonomously to carry objects;
    a manually driven transport vehicle, used to load objects onto the handling robot;
    wherein, when the handling robot is in the waiting-for-loading state and senses the manually driven transport vehicle, it outputs prompt information perceivable by a person, so as to guide the driver to drive the manually driven transport vehicle to complete an object loading action.
  13. A method for generating prompt information, applied to a handling robot, comprising:
    when the handling robot is in a waiting-for-loading state, acquiring environmental information around the body of the handling robot;
    determining, according to the environmental information, whether there is a target object within a set range around the body;
    when it is determined that the target object exists, controlling a prompting device on the body to output prompt information perceivable by a person;
    wherein the prompt information reflects the position of the handling robot, so as to provide a reference for the target object to perform a loading action.
  14. The method according to claim 13, wherein controlling the prompting device on the body to output prompt information perceivable by a person comprises:
    determining, according to the environmental information, the distance and/or orientation of the target object relative to the body;
    controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person.
  15. The method according to claim 14, wherein the prompting device comprises a visual signal output unit; and
    controlling, according to the distance and/or orientation, the prompting device to output corresponding prompt information perceivable by a person comprises:
    determining, according to the distance and/or orientation, whether the target object is already in a loading position;
    when the target object is not yet in the loading position, controlling the visual signal output unit on the top and/or the side wall of the body to output visual information, so as to guide the target object to move to the loading position;
    when the target object is in the loading position, controlling the visual signal output unit on the top of the body to output visual information, so as to guide the target object to load the object onto the body.
PCT/CN2021/138720 2021-03-12 2021-12-16 Handling robot, handling system and prompt information generation method WO2022188497A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110272328.9 2021-03-12
CN202110272328.9A CN115072626B (zh) 2021-03-12 2021-03-12 Handling robot, handling system and prompt information generation method

Publications (1)

Publication Number Publication Date
WO2022188497A1 true WO2022188497A1 (zh) 2022-09-15

Family

ID=83226306

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/138720 WO2022188497A1 (zh) 2021-03-12 2021-12-16 Handling robot, handling system and prompt information generation method

Country Status (2)

Country Link
CN (1) CN115072626B (zh)
WO (1) WO2022188497A1 (zh)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003281653A (ja) * 2002-03-26 2003-10-03 Victor Co Of Japan Ltd 自律行動ロボット
JP2006048308A (ja) * 2004-08-03 2006-02-16 Funai Electric Co Ltd 自走式掃除機
CN206219148U (zh) * 2016-11-25 2017-06-06 广州顶牛汽车用品有限公司 叉车智能一体雷达
CN208796109U (zh) * 2018-09-21 2019-04-26 东莞市开胜电子有限公司 一种多传感器感知的背驮式自动引导运输车
CN110045739A (zh) * 2019-05-10 2019-07-23 湖北汽车工业学院 一种智能仓储物料机器人、控制系统及控制方法
CN209980437U (zh) * 2019-01-29 2020-01-21 浙江瑞华康源科技有限公司 一种到达提醒装置
CN110844496A (zh) * 2019-11-25 2020-02-28 威海职业学院 一种智能机电自动化送料控制系统及方法
WO2020197244A1 (en) * 2019-03-25 2020-10-01 Lg Electronics Inc. Mobile robot and method of controlling the same

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10236476A (ja) * 1996-12-27 1998-09-08 Yoshiko Fujisawa パレット、このパレットの製造方法、フォークリフト、およびこれらを用いた荷役搬送システム、並びにこの荷役搬送システムに用いられる間仕切り板
JP4330050B2 (ja) * 1999-08-27 2009-09-09 東急車輛製造株式会社 パレット搬送車
JP2006111415A (ja) * 2004-10-15 2006-04-27 Toyota Industries Corp ロケーション指示装置、ロケーション管理システム
KR101397342B1 (ko) * 2012-02-29 2014-05-20 부산대학교 산학협력단 무인지게차의 팔레트 자율하역 장치 및 방법
US20150202770A1 (en) * 2014-01-17 2015-07-23 Anthony Patron Sidewalk messaging of an autonomous robot
JP6267539B2 (ja) * 2014-02-24 2018-01-24 株式会社岡村製作所 搬送台車
DE102016012313A1 (de) * 2016-10-15 2018-04-19 Man Truck & Bus Ag Vorrichtung zur Unterstützung eines Fahrers eines Fahrzeugs mit einer Projektionseinrichtung
CN206536514U (zh) * 2017-01-16 2017-10-03 山东华力机电有限公司 搬运机器人安全保护装置
KR102235227B1 (ko) * 2017-02-17 2021-04-02 호쿠요덴키 가부시키가이샤 물체 포착 장치, 포착 대상물, 및 물체 포착 시스템
JP2018139020A (ja) * 2017-02-24 2018-09-06 シーオス株式会社 自律移動装置および反射部材
CN108502434A (zh) * 2017-02-28 2018-09-07 广东利保美投资有限公司 托盘机器人
JP6940969B2 (ja) * 2017-03-29 2021-09-29 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 車両制御装置、車両制御方法及びプログラム
JP2018169894A (ja) * 2017-03-30 2018-11-01 村田機械株式会社 特異部分検出装置、自律移動装置、および、特異部分検出方法
CN109129389A (zh) * 2017-06-27 2019-01-04 京东方科技集团股份有限公司 一种机器人及其拼接方法、机器人拼接系统
CN108303972B (zh) * 2017-10-31 2020-01-17 腾讯科技(深圳)有限公司 移动机器人的交互方法及装置
KR102018765B1 (ko) * 2018-06-20 2019-09-04 회명정보통신(주) 지게차 포크 위치 검출 장치
CN208715752U (zh) * 2018-07-20 2019-04-09 中南林业科技大学 一种可追溯的竹木复合托盘
CN109844674B (zh) * 2018-10-15 2023-02-03 灵动科技(北京)有限公司 具有可操控摄像机和指示器的物流机器人和运行方法
CN110860057A (zh) * 2019-11-18 2020-03-06 燕山大学 一种消防侦察机器人及侦察方法
CN111361917A (zh) * 2020-03-16 2020-07-03 福建通力达实业有限公司 一种移动货架位置测算纠偏方法和系统
CN111533051B (zh) * 2020-05-08 2021-12-17 三一机器人科技有限公司 托盘位姿检测方法、装置、叉车和货运系统
CN112171663A (zh) * 2020-09-03 2021-01-05 上海姜歌机器人有限公司 机器人状态提示系统、方法、装置和电子设备

Also Published As

Publication number Publication date
CN115072626A (zh) 2022-09-20
CN115072626B (zh) 2023-07-18

Similar Documents

Publication Publication Date Title
EP3382489A1 (en) Tour guide robot and moving area calibration method, computer readable storage medium
US20200001917A1 (en) Materials handling vehicle obstacle scanning tools
US11975955B2 (en) Autonomous material transport vehicles, and systems and methods of operating thereof
US20170049290A1 (en) Intelligent robot, and sensor assembly and obstacle detection method for the same
CN112171663A (zh) 机器人状态提示系统、方法、装置和电子设备
WO2022188497A1 (zh) 搬运机器人、搬运系统及提示信息生成方法
JP2014157051A (ja) 位置検出装置
JP2019087210A (ja) 自律移動装置
WO2022116649A1 (zh) 搬运设备的控制方法、装置、搬运设备及存储介质
US20240085916A1 (en) Systems and methods for robotic detection of escalators and moving walkways
US11423560B2 (en) Method for improving the interpretation of the surroundings of a vehicle
WO2018173595A1 (ja) 移動装置
JP2012168602A (ja) 自動追従式台車
JP2017224136A (ja) 移動体及び移動体用の障害物検出装置
KR20100011376A (ko) 컨테이너 이송 차량의 자가 위치 확인 시스템, 방법 및 그방법을 기록한 기록 매체
JP2023163886A (ja) 誘導システム
JP2023163885A (ja) 誘導システム
JPH09269828A (ja) 搬送車の制御方法及びその制御装置
CN116654842B (zh) 搬运设备、搬运设备的控制方法、装置和存储介质
EP3112897A1 (en) Intelligent robot, and sensor assembly and obstacle detection method for same
JP7267850B2 (ja) 操作端末、移動システム、及び表示方法
JPS59186899A (ja) 無人搬送荷役装置におけるパレツト位置検出方法
US20230035361A1 (en) Moving body
CN215711500U (zh) 自主移动叉车
TW202001286A (zh) 物件定位系統

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21929959

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21929959

Country of ref document: EP

Kind code of ref document: A1