WO2021082964A1 - Autonomous mobile device and warehouse logistics system - Google Patents
Autonomous mobile device and warehouse logistics system
- Publication number
- WO2021082964A1 (PCT/CN2020/121858, CN2020121858W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- autonomous mobile
- mobile device
- obstacle avoidance
- distance
- processor
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- the present invention relates to a device, and in particular to an autonomous mobile device and a warehouse logistics system.
- autonomous mobile devices on the market, such as robots, avoid obstacles in a fixed mode.
- the obstacle may be a movable object, such as a human or another robot.
- if the obstacle avoidance range is too small, it is likely that when the movable object starts to move, it will collide with the autonomous mobile device that is detouring around it.
- This application discloses an autonomous mobile device and a warehouse logistics system.
- by determining the obstacle avoidance distance according to whether the obstacle is a movable object, the disclosed technique solves the problems in the background art: it avoids injury to personnel or damage to machines caused by an obstacle avoidance distance that is too small, and at the same time avoids the reduction in working efficiency of the autonomous mobile device caused by an obstacle avoidance distance that is too large.
- an autonomous mobile device includes a sensing component, a processor, and a driving component.
- the sensing component is used for sensing information of an object;
- the processor is used for judging, according to the information, whether the object is a movable object, and for generating a control signal according to the judgment result, wherein the control signal is used for indicating the obstacle avoidance distance of the autonomous mobile device;
- the driving component is used to drive the autonomous mobile device to move, wherein when the object is located on the travel path of the autonomous mobile device, the driving component drives the autonomous mobile device to perform obstacle avoidance by at least the obstacle avoidance distance.
- the obstacle avoidance distance when the object is a movable object is greater than the obstacle avoidance distance when the object is a static object.
- the moving component controls the autonomous mobile device to take a semicircular arc path, with the object as the obstacle avoidance center and the obstacle avoidance distance as the radius, as the obstacle avoidance path for obstacle avoidance.
- the sensing component includes an image sensor, and the information is an image of the object captured by the image sensor.
- the processor determines whether the object is a movable object based on images captured by the image sensor at two time points.
- the autonomous mobile device further includes a storage device for storing contrast images.
- the autonomous mobile device further includes a communication device for transmitting location information of the autonomous mobile device to a remote server, receiving from the remote server features of possible movable objects corresponding to the location information, and storing the features of the possible movable objects in the storage device as the contrast image.
- the processor compares the image with the contrast image to determine whether the object is a movable object.
- the sensing component includes a distance sensor, and the information is the distance between the object and the autonomous mobile device.
- the processor determines whether the object is a movable object based on the distance sensed by the distance sensor at two time points.
- the processor is further configured to determine the moving speed of the object according to the information sensed by the sensing component at two points in time, and to select the obstacle avoidance path of the autonomous mobile device according to the moving speed.
- the sensing component is further used to sense the load-bearing weight of the autonomous mobile device, and the processor adjusts the obstacle avoidance distance according to the load-bearing weight.
- a warehouse logistics system which includes a plurality of autonomous mobile devices and a dispatch server.
- the dispatch server is configured to receive obstacle avoidance information from one of the autonomous mobile devices, and send a travel signal to instruct the other autonomous mobile devices to continue or suspend the travel.
- the autonomous mobile device and warehouse logistics system disclosed in the present invention can determine whether the object is a movable object through the sensing component and the processor, and thereby determine the obstacle avoidance distance.
- in this way, injury to personnel or damage to machines caused by an obstacle avoidance distance that is too small can be avoided, and at the same time the reduction in working efficiency of the autonomous mobile device caused by an excessive obstacle avoidance distance is also avoided.
- FIG. 1 is a schematic diagram of an autonomous mobile device according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram of a contrast image according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram of judging an object according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of communication between a communication device and a remote server according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of judging an object according to an embodiment of the present invention.
- FIGS. 6A and 6B are schematic diagrams of judging an object according to an embodiment of the present invention.
- FIGS. 7A and 7B are schematic diagrams of judging an object according to an embodiment of the present invention.
- FIGS. 8A and 8B are schematic diagrams of selecting the obstacle avoidance distance according to an embodiment of the present invention.
- FIGS. 9A and 9B are schematic diagrams of obstacle avoidance paths according to an embodiment of the present invention.
- FIG. 10 is a schematic diagram of selecting the obstacle avoidance distance according to the carrying weight according to an embodiment of the present invention.
- FIG. 11 is a schematic diagram of selecting an obstacle avoidance path according to the moving speed of an object according to an embodiment of the present invention.
- FIG. 12 is a schematic diagram of an autonomous mobile device according to another embodiment of the present invention.
- FIG. 13 is a block diagram of an autonomous mobile device according to an embodiment of the present invention.
- FIG. 14 is a schematic diagram of a warehouse logistics system according to an embodiment of the present invention.
- the first and second features may be in direct contact with each other, or additional components may be formed between the first and second features so that the first and second features are not in direct contact.
- present disclosure may reuse component symbols and/or labels in multiple embodiments. Such repeated use is based on the purpose of brevity and clarity, and does not in itself represent the relationship between the different embodiments and/or configurations discussed.
- spatially relative terms used here, such as “below”, “beneath”, “under”, “above”, “over” and the like, may be used to facilitate describing the relationship of one component or feature to another component or feature as shown in the figures.
- these spatially relative terms are intended to cover different orientations of the device in use or operation in addition to the orientation depicted in the figures.
- the device may be placed in other orientations (for example, rotated 90 degrees or in other orientations), and these spatially relative descriptors should be interpreted accordingly.
- Autonomous mobile devices on the market, such as robots, use a fixed mode to avoid obstacles.
- the obstacles may be movable objects, such as humans, animals or other robots.
- the obstacle avoidance mode of such autonomous mobile devices is the same as when the object is a static object; once the movable object and the autonomous mobile device move at the same time, the autonomous mobile device may be unable to dodge in time, which may cause injury to the person or damage to the device.
- if, on the other hand, the autonomous mobile device is set to avoid obstacles with a larger obstacle avoidance distance regardless of whether the obstacle is a movable object or a static object, the autonomous mobile device will spend more time avoiding obstacles and its working efficiency will be reduced.
- the present invention discloses an autonomous mobile device and a warehouse logistics system.
- the sensing component and the processor determine whether the object is a movable object, and the obstacle avoidance distance is determined accordingly. In this way, injury to personnel or damage to machines caused by an obstacle avoidance distance that is too small can be avoided, and at the same time the reduction in working efficiency of the autonomous mobile device caused by an excessive obstacle avoidance distance is also avoided.
- the autonomous mobile device may be an unmanned logistics vehicle used for carrying goods in a warehouse logistics system, or a suitcase with an autonomous mobile function.
- the movable object may be a person, animal, mechanical device or object that is already in dynamic motion, or a person, animal, or mechanical device that can move at any time but is temporarily stationary.
- FIG. 1 is a schematic diagram of an autonomous mobile device 10 according to an embodiment of the invention.
- the autonomous mobile device 10 is an unmanned logistics vehicle for carrying goods.
- the autonomous mobile device 10 includes a sensing component 110, a processor 120 and a driving component 130.
- the sensing component 110 includes an image sensor 111 and a distance sensor 112 arranged in front of the autonomous mobile device 10.
- the image sensor 111 is used to capture an image of an object.
- the distance sensor 112 is used to sense the distance between the object and the autonomous mobile device 10.
- the sensing component 110 can be used to sense information of the object, where the object information can be the image of the object captured by the image sensor 111, the distance between the object and the autonomous mobile device 10 sensed by the distance sensor 112, or a combination of the two. It should be noted that in other embodiments, the sensing component 110 may also include other sensors to implement other functions of the autonomous mobile device 10. Optionally, the sensing component 110 may further include a weight sensor 113 for sensing the carrying weight of the autonomous mobile device 10.
- the image sensor 111 can be implemented by a general camera, and the distance sensor 112 can be implemented by a depth camera (also called an RGBD camera), a laser radar (Light Detection and Ranging, LiDAR), an ultrasonic sensor, an infrared sensor, or a combination of these.
- the processor 120 is disposed in the autonomous mobile device 10 and is used to determine whether the object is a movable object based on the object information sensed by the sensing component 110, such as the aforementioned object image or the distance between the object and the autonomous mobile device, and to generate the control signal according to the judgment result.
- the control signal is used to indicate the obstacle avoidance distance of the autonomous mobile device 10 with respect to the object.
- the driving component 130 is used to drive the autonomous mobile device 10 to move, wherein when an object is located on the traveling path of the autonomous mobile device 10, the driving component 130 performs obstacle avoidance by at least the obstacle avoidance distance.
- the driving assembly 130 includes a motor (not shown) and the power wheel shown in FIG. 1, and the motor provides kinetic energy to the power wheel to drive the autonomous mobile device 10.
- the autonomous mobile device 10 may also include other components and elements to implement other functions of the autonomous mobile device 10.
- the autonomous mobile device 10 also includes a storage device 140 for storing information, a communication device 150 for communicating with a remote server, a battery 160 for providing power, a power distribution module 170 for distributing power to the various components, and a display screen 180 for displaying information.
- the autonomous mobile device 10 shown in FIG. 1 is only an example, and the present invention does not limit the detailed architecture of the autonomous mobile device 10. In particular, the present invention does not limit the positions at which the components shown in FIG. 1, such as the image sensor 111 and the distance sensor 112, are arranged on the autonomous mobile device 10.
- the autonomous mobile device 10 may include a plurality of image sensors 111 and a plurality of distance sensors 112, which are respectively distributed at different positions of the autonomous mobile device 10, so as to sense multi-directional object information. The detailed location of the image sensor 111 and the distance sensor 112 depends on actual design requirements.
- FIGS. 2 to 5 illustrate examples in which, when an object appears within the sensing range of the image sensor 111, the processor 120 obtains the appearance characteristics of the object from the image of the object captured by the image sensor 111 and thereby determines whether the object is a movable object. In detail, in the embodiments of FIGS. 2 to 5, the processor 120 works in conjunction with the image sensor 111 and the storage device 140 to determine whether the object is a movable object.
- the storage device 140 can be used to store information.
- the storage device 140 can be used to store location information of the autonomous mobile device 10, map information of the location, task information of the autonomous mobile device 10, and so on.
- the storage device 140 can be used to store contrast images.
- the storage device 140 may store the image and appearance feature information of specific objects as contrast images before the autonomous mobile device 10 leaves the factory.
- for example, the storage device 140 may store in advance images of a human body or an animal as shown in FIG. 2, together with skeleton and appearance feature information, as contrast images.
- the processor 120 can determine whether the object is a movable object such as a human body or an animal by comparing the object image captured by the image sensor 111 with the contrast images in the storage device 140. Referring to FIG. 3, when the object (a person) in FIG. 3 appears within the sensing range of the image sensor 111, the image sensor 111 captures an image of the object (person). Since the storage device 140 has previously stored the images and appearance feature information of specific objects as contrast images, the processor 120 can compare the object (person) image captured by the image sensor 111 with the contrast images previously stored in the storage device 140, and thereby determine that the object is a human body.
- the storage device 140 may also store images and appearance characteristics of other movable objects as contrast images.
- the storage device 140 can also store images of other autonomous mobile devices or known mechanical devices for the processor 120 to determine. The present invention is not limited to that shown in FIG. 2.
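- The following Python sketch illustrates, in simplified form, how a captured image could be matched against stored contrast images; the similarity metric, threshold, and function names are assumptions for illustration and are not the method claimed in this disclosure.

```python
# Illustrative sketch only: compares a captured image against stored contrast
# images (e.g. of humans, animals, or robots). The normalized cross-correlation
# metric and the 0.6 threshold are assumptions, not taken from the disclosure.
import numpy as np

def similarity(img_a, img_b):
    """Normalized cross-correlation of two equally sized grayscale images."""
    a = (img_a - img_a.mean()) / (img_a.std() + 1e-8)
    b = (img_b - img_b.mean()) / (img_b.std() + 1e-8)
    return float((a * b).mean())

def is_movable_object(captured, contrast_images, threshold=0.6):
    """True if the captured image matches any stored contrast image.
    Assumes all images have already been resized to the same shape."""
    return any(similarity(captured, ref) >= threshold for ref in contrast_images)
```

- In practice the comparison could be performed on skeleton or appearance features rather than raw pixels, as the description of FIG. 2 suggests.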
- in the above embodiments, the contrast images stored in the storage device 140 are stored in advance; however, this is not a limitation of the present invention.
- the autonomous mobile device 10 can communicate with a cloud server or a remote server through the communication device 150 to update the contrast images in the storage device 140 in real time. Referring to FIG. 4, the autonomous mobile device 10 transmits its location information to the remote server 20 through the communication device 150, and the remote server 20 returns characteristic information to the autonomous mobile device 10 according to the location of the autonomous mobile device 10, where the characteristic information includes images and appearance feature information of possible movable objects at the location of the autonomous mobile device 10.
- the autonomous mobile device 10 stores the characteristic information in the storage device 140 as contrast images.
- the processor 120 can then determine whether the object is a movable object such as a human body or an animal by comparing the object image captured by the image sensor 111 with the contrast images in the storage device 140. For example, if the location information of the autonomous mobile device 10 indicates that the autonomous mobile device 10 is located in a warehouse, the remote server 20 may return the characteristic information of other autonomous mobile devices or mechanical devices. For another example, if the location information indicates that the autonomous mobile device 10 is located in an airport, the remote server 20 may return the characteristic information of devices such as airport passenger vehicles and sweepers.
- the autonomous mobile device 10 can also communicate with the remote server 20 in a timely manner to make the determination.
- the image sensor 111 captures an image of the object 30.
- the processor 120 cannot determine whether the object 30 is a movable object based on the current contrast image in the storage device 140.
- the autonomous mobile device 10 communicates with the remote server 20 through the communication device 150. After the communication device 150 transmits the location information of the autonomous mobile device 10 to the remote server 20, it receives from the remote server 20 the images and appearance features of possible movable objects corresponding to the location information.
- the images and appearance features of the possible movable objects are stored in the storage device 140 as contrast images.
- the processor 120 then compares the image captured by the image sensor 111 with the contrast images stored in the storage device 140, according to the method of the embodiment shown in FIG. 2 and FIG. 3, to determine whether the object 30 is a movable object. It should be noted that, in other embodiments, the communication device 150 can directly send the image of the object 30 to the remote server 20, the remote server 20 determines whether the object 30 is a movable object by means of manual identification or intelligent identification, and the judgment result is transmitted to the processor 120 through the communication device 150.
- the location information may be one of, or a combination of, the IP address of the autonomous mobile device 10, 4G or 5G positioning information, or the global positioning system information of the autonomous mobile device 10; the present invention is not limited to this.
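- As a rough illustration of the communication flow with the remote server 20, the sketch below sends location information and receives features of possible movable objects; the endpoint URL, JSON field names, and helper names are hypothetical placeholders, since the disclosure does not specify a message format.

```python
# Hypothetical message flow between the communication device and the remote
# server; the URL, JSON schema, and field names are placeholders.
import json
import urllib.request

def fetch_contrast_features(server_url, location):
    """Send the device's location info; return features of possible movable objects."""
    payload = json.dumps({"location": location}).encode("utf-8")
    req = urllib.request.Request(server_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["movable_object_features"]

# Example: the location may combine IP, 4G/5G positioning, or GPS data, as noted
# above (the values below are made up).
# features = fetch_contrast_features("https://example.com/features",
#                                    {"ip": "10.0.0.5", "gps": [31.23, 121.47]})
```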
- the autonomous mobile device 10 may also use other image recognition methods to make judgments.
- the image sensor 111 can capture images of the object at two different time points for the processor 120 to determine whether the object is a movable object in motion.
- FIGS. 6A and 6B are respectively object images captured by the image sensor 111 at two different time points.
- the image sensor 111 captures the first image of the object at the first time point.
- the image sensor 111 captures the second image of the object at the second time point.
- the image sensor 111 transmits the two images to the processor 120, and the processor 120 determines whether the object is a movable object according to the images captured at two time points.
- the processor 120 extracts, from the image shown in FIG. 6A, the distances of the object (a ball) relative to multiple feature points in the image.
- the feature points may be, for example, the end points of the shelf in the background.
- the processor 120 then extracts, from the image shown in FIG. 6B, the distances of the object (ball) relative to the same feature points.
- the processor 120 determines whether the relative distances between the object (ball) and the multiple feature points have changed between the two time points; if so, the object (ball) is judged to be a movable object in motion; otherwise, the object (ball) is judged to be a static object.
- the processor 120 will simultaneously correct the error caused by the movement of the autonomous mobile device 10 to make the judgment more accurate.
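- A minimal sketch of this two-frame check, assuming the object and the background feature points have already been located in both images; the point format and tolerance are assumptions, and the correction for the vehicle's own motion mentioned above is reduced here to a simple relative tolerance.

```python
# Sketch of the two-image check: the object is treated as moving if its distance
# to the same static background feature points changes between the two frames.
from math import dist  # Euclidean distance, Python 3.8+

def is_moving(object_pos_t1, object_pos_t2, feature_points, tolerance=0.05):
    """object_pos_*: (x, y) positions of the object at the two time points.
    feature_points: [(x, y), ...] static background features (e.g. shelf end points)."""
    for fp in feature_points:
        d1 = dist(object_pos_t1, fp)
        d2 = dist(object_pos_t2, fp)
        if abs(d1 - d2) > tolerance * max(d1, 1e-6):
            return True   # relative distance changed -> movable object in motion
    return False          # unchanged -> static object
```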
- the autonomous mobile device 10 can also determine whether the object is a movable object by sensing the relative distance between the object and the autonomous mobile device 10.
- the distance sensor 112 can sense the distance between the object and the autonomous mobile device 10 at two different time points, so that the processor 120 can determine whether the object is a movable object in motion.
- FIGS. 7A and 7B respectively depict the distance between the object sensed by the distance sensor 112 and the autonomous mobile device 10 at two different time points.
- referring to FIG. 7A, the distance sensor 112 senses the distance between the object (ball) and the autonomous mobile device 10 at the first time point.
- referring to FIG. 7B, the distance sensor 112 senses the distance between the object (ball) and the autonomous mobile device 10 at the second time point.
- the distance sensor 112 transmits the distance sensed at the two time points to the processor 120, and the processor 120 determines whether the object is a movable object according to the distance sensed at the two time points.
- the distance sensor 112 senses the distance between an object (ball) and the autonomous mobile device 10, and at the same time senses the distance between a background object (such as a rear shelf) and the autonomous mobile device 10.
- the distance sensor 112 again senses the distance between the object (ball) and the autonomous mobile device 10, and at the same time senses the distance between the background object (such as the rear shelf) and the autonomous mobile device 10.
- the processor 120 uses the distance between the background object (such as the rear shelf) and the autonomous mobile device 10 as a reference to determine whether the relative distance between the object (ball) and the autonomous mobile device 10 has changed between the two time points; if so, the object (ball) is judged to be a movable object in motion; otherwise, the object (ball) is judged to be a static object.
- the processor 120 will also correct the error caused by the movement of the autonomous mobile device 10, making the judgment more precise.
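- The distance-sensor variant can be sketched in the same spirit; here the background object's range is used as a reference to cancel the vehicle's own displacement between the two samples. The variable names and the 0.1 m threshold are assumptions.

```python
# Sketch of the distance-based check described above.
def object_is_moving(obj_dist_t1, obj_dist_t2, bg_dist_t1, bg_dist_t2,
                     threshold_m=0.1):
    ego_shift = bg_dist_t1 - bg_dist_t2            # displacement of the vehicle itself
    corrected_change = (obj_dist_t1 - obj_dist_t2) - ego_shift
    return abs(corrected_change) > threshold_m     # residual change -> object moved
```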
- the above embodiments respectively illustrate that the image sensor 111 or the distance sensor 112 is used for sensing to obtain the information of the object, and the processor 120 can determine whether the object is a movable object based on this.
- the autonomous mobile device 10 can perform sensing in combination with the image sensor 111 and the distance sensor 112 to obtain more accurate judgment results.
- after the processor 120 determines whether the object is a movable object according to the above-mentioned embodiments, if the object is on the travel path of the autonomous mobile device 10, obstacle avoidance is selectively performed with different obstacle avoidance distances depending on whether the object is a movable object.
- if the processor 120 determines that the object is a movable object, obstacle avoidance is performed with a larger obstacle avoidance distance, so that movement of the object does not leave the autonomous mobile device 10 unable to dodge and thereby cause injury to personnel or damage to the device; conversely, if the processor 120 determines that the object is a static object, obstacle avoidance is performed with a smaller obstacle avoidance distance, so as to shorten the obstacle avoidance time of the autonomous mobile device 10 and improve its working efficiency.
- if the processor 120 determines through the foregoing embodiments that the object 81 is a static object, the autonomous mobile device 10 only needs to avoid the obstacle with the smaller obstacle avoidance distance D1; the processor 120 therefore sends a control signal so that the driving assembly 130 performs obstacle avoidance along a semicircular arc obstacle avoidance path with the object 81 as the center and the obstacle avoidance distance D1 as the radius.
- if the processor 120 determines through the foregoing embodiments that the object 82 is a movable object, the autonomous mobile device 10 needs to avoid the obstacle with the larger obstacle avoidance distance D2 to ensure safety; the processor 120 therefore sends a control signal so that the driving assembly 130 performs obstacle avoidance along a semicircular arc obstacle avoidance path with the object 82 as the center and the obstacle avoidance distance D2 as the radius.
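- The radius selection and the semicircular detour can be sketched as follows; the D1/D2 values, the waypoint count, and the coordinate convention are illustrative assumptions, not values given in the disclosure.

```python
# Sketch of choosing the avoidance radius and generating a semicircular path
# around the obstacle, entered and exited on the original travel line.
import math

D1_STATIC, D2_MOVABLE = 0.5, 1.5   # meters; example radii only

def avoidance_radius(is_movable):
    return D2_MOVABLE if is_movable else D1_STATIC

def semicircle_waypoints(obstacle_xy, heading_rad, radius, n=9):
    """Half-circle of waypoints around the obstacle, oriented to the travel heading."""
    ox, oy = obstacle_xy
    angles = [heading_rad + math.pi * (1 - k / (n - 1)) for k in range(n)]
    return [(ox + radius * math.cos(a), oy + radius * math.sin(a)) for a in angles]
```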
- the autonomous mobile device 10 uses a semi-circular arc obstacle avoidance path to avoid obstacles.
- the obstacle avoidance method adopted by the autonomous mobile device 10 is not a limitation of the present invention. In other embodiments, the autonomous mobile device 10 may adopt other obstacle avoidance paths to avoid obstacles.
- referring to FIG. 9A, if the processor 120 determines through the above-mentioned embodiments that the object 82 is a movable object, the autonomous mobile device 10 needs to avoid the obstacle by the obstacle avoidance distance D2 to ensure safety; the processor 120 then sends a control signal so that the driving assembly 130 bypasses the object 82 along a straight-line obstacle avoidance path.
- referring to FIG. 9B, if the processor 120 determines through the above-mentioned embodiments that the object 82 is a movable object, the autonomous mobile device 10 needs to avoid the obstacle by the obstacle avoidance distance D2 to ensure safety; the processor 120 then sends a control signal so that the driving assembly 130 bypasses the object 82 in a right-angle turning manner.
- the distance between the autonomous mobile device 10 and the object 82 is always greater than the obstacle avoidance distance D2, so as to prevent the sudden movement of the object 82 from causing damage to the personnel and the device.
- the autonomous mobile device 10 can determine which obstacle avoidance method should be adopted for obstacle avoidance according to factors such as terrain, channel width, and the number of obstacles on the travel path. In some embodiments, the autonomous mobile device 10 avoids obstacles in a way that can quickly avoid obstacles and improve work efficiency. In some embodiments, the autonomous mobile device 10 avoids obstacles in a manner that best ensures the safety of personnel. The present invention is not limited to this.
- the autonomous mobile device 10 can be used to carry goods.
- when the weight carried by the autonomous mobile device 10 is heavier, a collision with a person or another device will cause more serious damage.
- the weight sensor 113 is used to sense the carrying weight of the autonomous mobile device 10.
- the processor 120 selects the obstacle avoidance distance of the autonomous mobile device 10 according to both the object information sensed by the sensing component 110, as in the above-mentioned embodiments, and the carrying weight of the autonomous mobile device 10.
- the processor 120 can adjust the obstacle avoidance distance in a timely manner, so that the autonomous mobile device 10 can avoid obstacles with a larger obstacle avoidance distance to ensure safety.
- the autonomous mobile device 10 moves with cargo and encounters obstacles on the travel path.
- the processor 120 determines that the object is a movable object. Therefore, the autonomous mobile device 10 needs to avoid obstacles with a larger obstacle avoidance distance D2 to ensure safety.
- the weight sensor 113 in the sensing component 110 senses the carrying weight of the autonomous mobile device 10, and the processor 120 determines whether the carrying weight of the autonomous mobile device 10 at this time is greater than a preset weight. If the carrying weight is greater than a preset weight, the processor 120 determines that the autonomous mobile device 10 needs to avoid the obstacle by the obstacle avoidance distance D3, where the obstacle avoidance distance D3 is greater than the obstacle avoidance distance D2. If the carrying weight is less than or equal to the preset weight, the processor 120 determines that the autonomous mobile device 10 performs obstacle avoidance at the obstacle avoidance distance D2.
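- A minimal sketch of this load-dependent adjustment, with placeholder values for the preset weight and the distances D2 and D3, which the description leaves unspecified:

```python
# Sketch: widen the avoidance distance when the sensed payload exceeds a preset weight.
PRESET_WEIGHT_KG = 50.0     # placeholder threshold
D2, D3 = 1.5, 2.0           # meters; D3 > D2 as required above

def adjusted_avoidance_distance(payload_kg):
    return D3 if payload_kg > PRESET_WEIGHT_KG else D2
```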
- when the autonomous mobile device is unable to avoid obstacles with a large obstacle avoidance distance because the channel is too narrow or there are too many obstacles, and the safety of personnel is prioritized, the autonomous mobile device 10 can stop traveling in a timely manner and wait for the obstacle to be removed (for example, the person leaves or the object is taken away) before continuing to travel.
- the autonomous mobile device 10 may also include components such as speakers and warning lights. When the autonomous mobile device 10 is avoiding obstacles or cannot avoid obstacles, the effect of warning people is achieved by means of warning sounds or warning lights.
- the image sensor 111 or the distance sensor 112 senses the object information.
- the processor 120 determines whether the object is a movable object in motion. In order to accurately choose the obstacle avoidance path, the moving speed and direction of the object must also be taken into consideration. Referring to FIG. 11, at the first time point T1, the object 82 is located within the sensing range of the image sensor 111 or the distance sensor 112. According to the position of the object 82, the processor 120 defines the position coordinates of the object 82 at this time as (P_XA, P_YA).
- the coordinate position of the object 82 can be defined by taking the position of the autonomous mobile device 10 itself as the origin and using the distance between the object 82 and the autonomous mobile device 10 sensed by the distance sensor 112. Next, the object 82 moves to the position coordinates (P_XB, P_YB) at the second time point T2. From the first time point T1 with its corresponding coordinates (P_XA, P_YA) and the second time point T2 with its corresponding coordinates (P_XB, P_YB), the processor 120 can calculate the moving speed and direction of the object 82.
- the processor 120 calculates, according to the moving speed and direction of the object 82 and the moving speed and direction of the autonomous mobile device 10 itself, that the object 82 will enter the travel path of the autonomous mobile device 10, and starts preparing for obstacle avoidance. Since the obstacle avoidance paths selectable by the autonomous mobile device 10 are the path P1 and the path P2 in the figure, the processor 120 evaluates the two obstacle avoidance paths P1 and P2 against the moving speed and direction of the object 82 and the moving speed of the autonomous mobile device 10, and chooses the obstacle avoidance path that can best ensure the safety of personnel.
- taking FIG. 11 as an example, the processor 120 calculates that, given the moving speed and direction of the object 82 and the moving speed of the autonomous mobile device 10 itself, selecting the obstacle avoidance path P2 would lead to a collision with the object 82 at time T3 at the position with coordinates (P_XC, P_YC). Therefore, the processor 120 selects the obstacle avoidance path P1 for obstacle avoidance to avoid a collision with the object 82.
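- The velocity estimate and path choice around FIG. 11 might look like the sketch below: the object's speed and direction are derived from its coordinates at T1 and T2, each candidate path is checked for a predicted near-collision, and a collision-free path is kept. The path representation, time step, and safety radius are assumptions.

```python
# Sketch of estimating the object's velocity and selecting a collision-free path.
def velocity(p_a, p_b, t1, t2):
    """(vx, vy) of the object from positions p_a at t1 and p_b at t2 (t2 > t1)."""
    dt = t2 - t1
    return (p_b[0] - p_a[0]) / dt, (p_b[1] - p_a[1]) / dt

def predicts_collision(path, obj_pos, obj_vel, step_s=0.5, safe_radius=1.5):
    """path: [(x, y), ...] waypoints the device visits, one per time step."""
    for k, (px, py) in enumerate(path):
        ox = obj_pos[0] + obj_vel[0] * step_s * k
        oy = obj_pos[1] + obj_vel[1] * step_s * k
        if ((px - ox) ** 2 + (py - oy) ** 2) ** 0.5 < safe_radius:
            return True
    return False

def choose_path(candidates, obj_pos, obj_vel):
    """Return the first candidate path with no predicted collision, else None (stop)."""
    for path in candidates:
        if not predicts_collision(path, obj_pos, obj_vel):
            return path
    return None
```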
- FIG. 12 is a schematic diagram of an autonomous mobile device 40 according to another embodiment of the present invention.
- the autonomous mobile device 40 is a luggage case with an autonomous mobile function.
- the autonomous mobile device 40 includes a sensing component 410, a processor 420, a driving component 430, a storage device 440, a communication device 450, a battery 460, and a power distribution module 470.
- the sensing component 410 includes an image sensor 411, a distance sensor 412, and a weight sensor 413.
- the functions of the components included in the autonomous mobile device 40 are the same as the corresponding components of the autonomous mobile device 10 shown in FIG. 1.
- the autonomous mobile device 40 can also apply the technical solutions disclosed in FIGS. 2 to 11, and the detailed description is omitted here to save space.
- the autonomous mobile device 10 and the autonomous mobile device 40 may not include the weight sensors 113 and 413. In that case, the autonomous mobile device 10 and the autonomous mobile device 40 may calculate the carrying weight from their own movement acceleration and the power provided to the driving components 130 and 430.
- the autonomous mobile device of the present application is summarized in Fig. 13 to facilitate the understanding of the present invention.
- FIG. 13 is a system block diagram of an autonomous mobile device according to an embodiment of the present application.
- the autonomous mobile device 10 includes a sensing component 110, a processor 120, a driving component 130, a storage device 140, a communication device 150, a battery 160, a power distribution module 170, and a display screen 180.
- the sensing component 110 includes an image sensor 111, a distance sensor 112 and a weight sensor 113.
- the image sensor 111 can be implemented by a general camera, and the distance sensor 112 can be implemented by one of a depth camera, a laser radar, an ultrasonic sensor, and an infrared sensor, or a combination of multiple of them.
- the sensing component 110 is electrically connected to the processor 120 (for example, a central processing unit CPU).
- the sensing component 110 transmits the sensed object information to the processor 120.
- the processor 120 determines whether the object is a movable object according to the object information, and sends a control signal.
- the driving component 130 is electrically connected to the processor 120 and is used to drive the autonomous mobile device 10 to move. When the object is on the traveling path of the autonomous mobile device 10, the driving component 130 performs obstacle avoidance according to the obstacle avoidance distance indicated by the control signal.
- the driving assembly 130 includes a motor and power wheels.
- the storage device 140 is used to store information, such as movable object feature information, static object feature information, location information of the autonomous mobile device 10, map information of the location, task information of the autonomous mobile device 10, shelf information, and so on.
- the communication device 150 is used to communicate with a cloud server or a remote server to update the characteristic information of the movable object in the storage device 140 or to receive the judgment result of the object by the remote server.
- the storage device 140, the communication device 150 and the processor 120 are electrically connected.
- the display screen 180 is used to display information.
- the power distribution module 170 is electrically connected to each component of the autonomous mobile device 10, and is used to distribute the power provided by the battery 160 to each component.
- FIG. 14 is a schematic diagram of a warehouse logistics system 50 according to an embodiment of the present invention.
- the warehouse logistics system 50 includes a dispatch server 60 and autonomous mobile devices 71, 72, 73, and 74.
- the autonomous mobile devices 71, 72, 73, and 74 can be implemented by the autonomous mobile device 10 shown in FIG. 1.
- the warehouse logistics system 50 is used to control the movement of autonomous mobile devices 71, 72, 73, and 74 between shelves through the dispatch server 60.
- after determining whether the object is a movable object or a static object, the autonomous mobile device adopts a different obstacle avoidance distance accordingly. However, the distance between shelves in a typical warehouse is limited.
- autonomous mobile devices may enter the travel path of other autonomous mobile devices when avoiding obstacles, thereby causing autonomous mobile devices to collide with each other.
- when the dispatch server 60 receives obstacle avoidance information from an autonomous mobile device, it dispatches the autonomous mobile devices based on factors such as the distance between the shelves, the obstacle avoidance distance adopted by that autonomous mobile device, and the speeds of and distances between the autonomous mobile devices, arranging for other autonomous mobile devices to continue or stop traveling so as to prevent the autonomous mobile devices from colliding with each other.
- the autonomous mobile device 72 encounters an object 81 on the path of travel.
- according to the embodiments of FIGS. 2 to 11, the autonomous mobile device 72 determines that the object 81 is a static object, and therefore chooses the smaller obstacle avoidance distance D1 for obstacle avoidance.
- the autonomous mobile device 72 transmits obstacle avoidance information about the obstacle avoidance distance D1 to the dispatch server 60.
- the dispatch server 60 determines that the autonomous mobile device 72 will not collide with the autonomous mobile device 71 during obstacle avoidance. Therefore, the dispatch server 60 sends a signal to the autonomous mobile device 71 indicating that it can continue traveling.
- the autonomous mobile device 74 encounters an object 82 on its travel path. According to the embodiments of FIGS. 2 to 11, the autonomous mobile device 74 determines that the object 82 is a movable object, and therefore chooses the larger obstacle avoidance distance D2 for obstacle avoidance to ensure safety.
- the autonomous mobile device 74 transmits obstacle avoidance information about the obstacle avoidance distance D2 to the dispatch server 60. Based on factors such as the distance between the shelves, the obstacle avoidance distance D2, and the speeds and distances of the autonomous mobile devices 73 and 74, the dispatch server 60 determines that the autonomous mobile device 74 will enter the travel path of the autonomous mobile device 73 while avoiding the obstacle. Therefore, the dispatch server 60 sends a signal to the autonomous mobile device 73 to instruct it to stop traveling and wait for the autonomous mobile device 74 to complete obstacle avoidance before proceeding.
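- The dispatch decision can be sketched in a highly simplified way: when one device reports its chosen avoidance distance, the server checks whether the resulting detour would spill into a neighboring aisle and tells the neighboring device to continue or to pause. The aisle width, device footprint, and function names are assumptions, not values from the disclosure.

```python
# Simplified sketch of the dispatch server's continue/pause decision.
def swept_width(avoidance_radius, device_half_width=0.4):
    """Lateral space (m) the detour occupies around the obstacle."""
    return avoidance_radius + device_half_width

def neighbor_command(aisle_width, avoidance_radius):
    """'continue' if the detour stays inside the reporting device's aisle, else 'pause'."""
    return "continue" if swept_width(avoidance_radius) <= aisle_width else "pause"

# Matching the example above with placeholder numbers: a small D1 detour lets
# device 71 keep traveling, while a large D2 detour pauses device 73.
# neighbor_command(aisle_width=1.2, avoidance_radius=0.5)   # -> "continue"
# neighbor_command(aisle_width=1.2, avoidance_radius=1.5)   # -> "pause"
```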
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Electromagnetism (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Claims (13)
- 1. An autonomous mobile device, characterized by comprising: a sensing component for sensing information of an object; a processor for judging, according to the information, whether the object is a movable object, and generating a control signal according to the judgment result, wherein the control signal is used to indicate an obstacle avoidance distance of the autonomous mobile device; and a driving component for driving the autonomous mobile device to move, wherein when the object is located on the travel path of the autonomous mobile device, the driving component drives the autonomous mobile device to perform obstacle avoidance by at least the obstacle avoidance distance.
- 2. The autonomous mobile device according to claim 1, wherein the obstacle avoidance distance when the object is a movable object is greater than the obstacle avoidance distance when the object is a static object.
- 3. The autonomous mobile device according to claim 1, wherein when the object is located on the travel path of the autonomous mobile device, the moving component controls the autonomous mobile device to take a semicircular arc path, with the object as the obstacle avoidance center and the obstacle avoidance distance as the radius, as the obstacle avoidance path for obstacle avoidance.
- 4. The autonomous mobile device according to claim 1, wherein the sensing component comprises an image sensor, and the information is an image of the object captured by the image sensor.
- 5. The autonomous mobile device according to claim 4, wherein the processor judges whether the object is a movable object based on images captured by the image sensor at two time points.
- 6. The autonomous mobile device according to claim 4, further comprising: a storage device for storing a contrast image.
- 7. The autonomous mobile device according to claim 5, further comprising: a communication device for transmitting location information of the autonomous mobile device to a remote server, receiving from the remote server features of possible movable objects corresponding to the location information, and storing the features of the possible movable objects in the storage device as the contrast image.
- 8. The autonomous mobile device according to any one of claims 6 to 7, wherein the processor compares the image with the contrast image to judge whether the object is a movable object.
- 9. The autonomous mobile device according to claim 1, wherein the sensing component comprises a distance sensor, and the information is the distance between the object and the autonomous mobile device.
- 10. The autonomous mobile device according to claim 9, wherein the processor judges whether the object is a movable object based on the distances sensed by the distance sensor at two time points.
- 11. The autonomous mobile device according to claim 1, wherein the processor is further configured to determine the moving speed of the object according to the information sensed by the sensing component at two time points, and to select the obstacle avoidance path of the autonomous mobile device according to the moving speed.
- 12. The autonomous mobile device according to claim 1, wherein the sensing component is further configured to sense the carrying weight of the autonomous mobile device, and the processor adjusts the obstacle avoidance distance according to the carrying weight.
- 13. A warehouse logistics system, characterized by comprising: a plurality of autonomous mobile devices according to claim 1; and a dispatch server for receiving obstacle avoidance information from one of the autonomous mobile devices and sending a travel signal to instruct the other autonomous mobile devices to continue or suspend traveling.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/768,474 US20240103518A1 (en) | 2019-10-31 | 2020-10-19 | Autonomous mobile device and warehouse logistics system |
EP20882501.8A EP4026666A4 (en) | 2019-10-31 | 2020-10-19 | AUTONOMOUS MOBILE DEVICE AND WAREHOUSE LOGISTICS SYSTEM |
JP2022522381A JP7385851B2 (ja) | 2019-10-31 | 2020-10-19 | 自律移動装置及び倉庫物流システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911052829.5A CN110653837B (zh) | 2019-10-31 | 2019-10-31 | 自主移动装置及仓储物流系统 |
CN201911052829.5 | 2019-10-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021082964A1 true WO2021082964A1 (zh) | 2021-05-06 |
Family
ID=69042512
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/121858 WO2021082964A1 (zh) | 2019-10-31 | 2020-10-19 | 自主移动装置及仓储物流系统 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240103518A1 (zh) |
EP (1) | EP4026666A4 (zh) |
JP (1) | JP7385851B2 (zh) |
CN (2) | CN110653837B (zh) |
WO (1) | WO2021082964A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110653837B (zh) * | 2019-10-31 | 2021-08-13 | 灵动科技(北京)有限公司 | 自主移动装置及仓储物流系统 |
CN113219969A (zh) * | 2021-04-22 | 2021-08-06 | 深圳拓邦股份有限公司 | 洗地机器人避障控制方法、装置及洗地机器人 |
CN117961909A (zh) * | 2024-03-15 | 2024-05-03 | 东莞市库崎智能科技有限公司 | 上下料复合机器人任务分配方法、系统和可读存储介质 |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005288657A (ja) * | 2004-04-05 | 2005-10-20 | Matsushita Electric Ind Co Ltd | ロボット |
CN106054881A (zh) * | 2016-06-12 | 2016-10-26 | 京信通信系统(广州)有限公司 | 一种执行终端的避障方法及执行终端 |
CN106156742A (zh) * | 2016-07-06 | 2016-11-23 | 尚艳燕 | 一种平衡车障碍物规避方法和装置 |
CN107092252A (zh) * | 2017-04-11 | 2017-08-25 | 杭州光珀智能科技有限公司 | 一种基于机器视觉的机器人主动避障方法及其装置 |
CN107894773A (zh) * | 2017-12-15 | 2018-04-10 | 广东工业大学 | 一种移动机器人的导航方法、系统及相关装置 |
CN108958263A (zh) * | 2018-08-03 | 2018-12-07 | 江苏木盟智能科技有限公司 | 一种机器人避障方法及机器人 |
CN109753070A (zh) * | 2019-01-16 | 2019-05-14 | 深圳市海柔创新科技有限公司 | 一种避障方法、装置及仓储机器人 |
CN109955245A (zh) * | 2017-12-26 | 2019-07-02 | 深圳市优必选科技有限公司 | 一种机器人的避障方法、系统及机器人 |
CN209086755U (zh) * | 2018-12-11 | 2019-07-09 | 上海智臻智能网络科技股份有限公司 | 一种机器人及其控制系统 |
JP2019114129A (ja) * | 2017-12-25 | 2019-07-11 | 株式会社ダイヘン | 移動体 |
CN110045739A (zh) * | 2019-05-10 | 2019-07-23 | 湖北汽车工业学院 | 一种智能仓储物料机器人、控制系统及控制方法 |
CN110045364A (zh) * | 2019-03-01 | 2019-07-23 | 上海大学 | 基于渐进式畸变图像特征识别的动态目标跟踪和静态目标检测系统与方法 |
CN110653837A (zh) * | 2019-10-31 | 2020-01-07 | 灵动科技(北京)有限公司 | 自主移动装置及仓储物流系统 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4672175B2 (ja) * | 2000-05-26 | 2011-04-20 | 本田技研工業株式会社 | 位置検出装置、位置検出方法、及び位置検出プログラム |
US8050863B2 (en) * | 2006-03-16 | 2011-11-01 | Gray & Company, Inc. | Navigation and control system for autonomous vehicles |
KR100811886B1 (ko) * | 2006-09-28 | 2008-03-10 | 한국전자통신연구원 | 장애물 회피 진행이 가능한 자율이동로봇 및 그 방법 |
JP4717105B2 (ja) * | 2008-08-29 | 2011-07-06 | 株式会社日立製作所 | 自律移動ロボット装置及びかかる装置における飛び出し衝突回避方法 |
CN101685100B (zh) * | 2008-09-24 | 2011-07-20 | 华晶科技股份有限公司 | 检测被摄物移动速度的摄像装置及其方法 |
CN102131076B (zh) * | 2011-01-19 | 2015-05-20 | 中兴通讯股份有限公司 | 视频监控方法及监控终端 |
CN102708474A (zh) * | 2012-05-04 | 2012-10-03 | 成都智汇科技有限公司 | 基于物联网的智慧物流系统 |
JP5891314B2 (ja) * | 2012-12-13 | 2016-03-22 | 株式会社日立製作所 | 自律走行装置 |
US8965561B2 (en) * | 2013-03-15 | 2015-02-24 | Cybernet Systems Corporation | Automated warehousing using robotic forklifts |
US9607285B1 (en) * | 2015-03-17 | 2017-03-28 | Amazon Technologies, Inc. | Entity monitoring for kiva robotic floors |
US10213082B2 (en) * | 2016-08-30 | 2019-02-26 | Samsung Electronics Co., Ltd. | Robot cleaner |
CN106959696B (zh) * | 2017-05-10 | 2020-03-03 | 北京京东尚科信息技术有限公司 | 运动目标的控制方法和装置 |
US10429847B2 (en) * | 2017-09-22 | 2019-10-01 | Locus Robotics Corp. | Dynamic window approach using optimal reciprocal collision avoidance cost-critic |
KR102532741B1 (ko) * | 2018-02-28 | 2023-05-16 | 삼성전자주식회사 | 자율 주행 장치 및 그 주행 방법 |
CN108501947A (zh) * | 2018-04-03 | 2018-09-07 | 西藏帝亚维新能源汽车有限公司 | 一种纯电动车辆自动制动的控制方法 |
WO2020077481A1 (en) * | 2018-10-15 | 2020-04-23 | Lingdong Technology (Beijing) Co. Ltd | Self-driving vehicle system with steerable camera and indicator |
CN109240313A (zh) * | 2018-11-26 | 2019-01-18 | 智久(厦门)机器人科技有限公司上海分公司 | 无人驾驶叉车的避障方法、装置及系统 |
CN110262482A (zh) * | 2019-06-10 | 2019-09-20 | 华东师范大学 | 一种无人船航速控制方法及无人船 |
CN112445222B (zh) * | 2019-09-05 | 2024-07-16 | 浙江未来精灵人工智能科技有限公司 | 导航方法、装置、存储介质以及终端 |
-
2019
- 2019-10-31 CN CN201911052829.5A patent/CN110653837B/zh active Active
- 2019-10-31 CN CN202110845367.3A patent/CN113580155B/zh active Active
-
2020
- 2020-10-19 JP JP2022522381A patent/JP7385851B2/ja active Active
- 2020-10-19 US US17/768,474 patent/US20240103518A1/en active Pending
- 2020-10-19 EP EP20882501.8A patent/EP4026666A4/en active Pending
- 2020-10-19 WO PCT/CN2020/121858 patent/WO2021082964A1/zh active Application Filing
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005288657A (ja) * | 2004-04-05 | 2005-10-20 | Matsushita Electric Ind Co Ltd | ロボット |
CN106054881A (zh) * | 2016-06-12 | 2016-10-26 | 京信通信系统(广州)有限公司 | 一种执行终端的避障方法及执行终端 |
CN106156742A (zh) * | 2016-07-06 | 2016-11-23 | 尚艳燕 | 一种平衡车障碍物规避方法和装置 |
CN107092252A (zh) * | 2017-04-11 | 2017-08-25 | 杭州光珀智能科技有限公司 | 一种基于机器视觉的机器人主动避障方法及其装置 |
CN107894773A (zh) * | 2017-12-15 | 2018-04-10 | 广东工业大学 | 一种移动机器人的导航方法、系统及相关装置 |
JP2019114129A (ja) * | 2017-12-25 | 2019-07-11 | 株式会社ダイヘン | 移動体 |
CN109955245A (zh) * | 2017-12-26 | 2019-07-02 | 深圳市优必选科技有限公司 | 一种机器人的避障方法、系统及机器人 |
CN108958263A (zh) * | 2018-08-03 | 2018-12-07 | 江苏木盟智能科技有限公司 | 一种机器人避障方法及机器人 |
CN209086755U (zh) * | 2018-12-11 | 2019-07-09 | 上海智臻智能网络科技股份有限公司 | 一种机器人及其控制系统 |
CN109753070A (zh) * | 2019-01-16 | 2019-05-14 | 深圳市海柔创新科技有限公司 | 一种避障方法、装置及仓储机器人 |
CN110045364A (zh) * | 2019-03-01 | 2019-07-23 | 上海大学 | 基于渐进式畸变图像特征识别的动态目标跟踪和静态目标检测系统与方法 |
CN110045739A (zh) * | 2019-05-10 | 2019-07-23 | 湖北汽车工业学院 | 一种智能仓储物料机器人、控制系统及控制方法 |
CN110653837A (zh) * | 2019-10-31 | 2020-01-07 | 灵动科技(北京)有限公司 | 自主移动装置及仓储物流系统 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4026666A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP4026666A4 (en) | 2023-02-08 |
JP7385851B2 (ja) | 2023-11-24 |
CN110653837B (zh) | 2021-08-13 |
CN113580155A (zh) | 2021-11-02 |
US20240103518A1 (en) | 2024-03-28 |
CN113580155B (zh) | 2023-07-18 |
JP2022552335A (ja) | 2022-12-15 |
EP4026666A1 (en) | 2022-07-13 |
CN110653837A (zh) | 2020-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021082964A1 (zh) | 自主移动装置及仓储物流系统 | |
KR102107555B1 (ko) | 차량용 센서 궤적 플래닝 | |
US10875448B2 (en) | Visually indicating vehicle caution regions | |
US10429847B2 (en) | Dynamic window approach using optimal reciprocal collision avoidance cost-critic | |
US10725471B2 (en) | Virtual line-following and retrofit method for autonomous vehicles | |
US9870002B1 (en) | Velocity control of position-controlled motor controllers | |
US20200257311A1 (en) | Cart having leading and following function | |
CN107992091B (zh) | 一种基于信号强度的平衡车跟随方法和系统 | |
WO2021109890A1 (zh) | 具有追踪功能的自动驾驶系统 | |
AU2021230331B2 (en) | Robot obstacle collision prediction and avoidance | |
KR102315678B1 (ko) | 잔디 깎기 로봇 및 그 제어 방법 | |
JP2022518012A (ja) | 自動運転車用自律放送システム | |
WO2021008371A1 (zh) | 结合无人机的智能物流车盘点系统 | |
CN109895825A (zh) | 自动运输装置 | |
CN111717843A (zh) | 一种物流搬运机器人 | |
US20230195124A1 (en) | Management method for mobile object, control method for mobile object, mobile object, and recording medium storing computer program | |
CN106354142A (zh) | 一种基于物联网的智能搬运系统及应用 | |
KR102171934B1 (ko) | 양방향 선도 추미 대차 | |
CN114815809A (zh) | 移动机器人的避障方法、系统、终端设备及存储介质 | |
KR20200055164A (ko) | 매핑기반 자율주행 카트 및 그 제어 방법 | |
JP2022067224A (ja) | 自律移動体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20882501 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022522381 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 17768474 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2020882501 Country of ref document: EP Effective date: 20220404 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |