WO2023109281A1 - Method and device for controlling the driving of an autonomous mobile robot - Google Patents
Method and device for controlling the driving of an autonomous mobile robot
- Publication number
- WO2023109281A1 (international application PCT/CN2022/124836)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- autonomous mobile
- mobile robot
- autonomous
- mode
- truck
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control with means for defining a desired trajectory involving a learning process
- G05D1/0223—Control with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control using obstacle or wall sensors in combination with a laser
- G05D1/0242—Control using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0246—Control using optical position detecting means using a video camera in combination with image processing means
- G05D1/0251—Control using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
- G05D1/028—Control using signals provided by a source external to the vehicle using a RF signal
Definitions
- The present invention relates to a method for controlling the driving of an autonomous mobile robot, a device for controlling the driving of an autonomous mobile robot, a scheduling method, a scheduling system and a computer program product.
- Autonomous mobile robots usually encounter different types of obstacles while driving, and the motion states of these obstacles also differ.
- In currently known obstacle avoidance strategies, the robot generally stops in place or detours around the obstacle in order to avoid it.
- The object of the present invention is to provide a method for controlling the driving of an autonomous mobile robot, a device for controlling the driving of an autonomous mobile robot, a scheduling method, a scheduling system and a computer program product, in order to solve at least some of the problems existing in the prior art.
- A method for controlling the driving of an autonomous mobile robot, in particular an autonomous transport vehicle, is provided, wherein the autonomous mobile robot switches at least between a navigation mode and a yield mode, the method comprising the following steps:
- S1: while the autonomous mobile robot drives automatically along a preset trajectory in the navigation mode, detecting obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven; and
- S2: controlling the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
- The present invention includes in particular the following technical idea: by letting the robot selectively switch from the navigation mode to the pull-over yield mode, it can react adaptively to detected obstacles. This avoids blocked aisles caused by several mobile devices simultaneously planning avoidance maneuvers or stopping in place, and greatly increases the success rate of obstacle avoidance. In addition, by actively giving up passage space in this way, orderly traffic in narrow spaces is ensured to a certain extent and the intelligence level of the autonomous mobile robot is increased.
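The selective switch between the two modes can be pictured as a small state machine wrapped around the robot's control loop. The following Python sketch is purely illustrative and not part of the patent disclosure; the Mode enum and the robot methods it calls (detect_obstacle, should_yield_to, plan_pullover_path, follow_preset_trajectory, pull_over_and_stop) are assumed names chosen for readability.

```python
from enum import Enum, auto

class Mode(Enum):
    NAVIGATION = auto()
    YIELD = auto()

def control_step(robot, mode: Mode) -> Mode:
    """One iteration of a hypothetical S1/S2 control loop."""
    # S1: detect obstacles in the surroundings and/or on the remaining preset trajectory
    obstacle = robot.detect_obstacle()

    if mode is Mode.NAVIGATION:
        # S2: switch selectively, i.e. only when the obstacle's properties call for it
        if obstacle is not None and robot.should_yield_to(obstacle):
            robot.plan_pullover_path()       # shortest path to the road edge
            return Mode.YIELD
        robot.follow_preset_trajectory()
    else:  # Mode.YIELD
        robot.pull_over_and_stop()           # clear space for the obstacle
        if obstacle is None:                 # the obstacle has passed
            return Mode.NAVIGATION
    return mode
```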
- Step S2 includes:
- planning for the autonomous mobile robot, in the yield mode, the shortest path from its current position to the edge of the road, in particular to a defined distance from the edge of the road; and
- guiding the autonomous mobile robot along this shortest path to the edge of the road, in particular to the defined distance from the edge of the road, and stopping it there.
- This achieves the following technical advantage: it is not necessary to plan a complete detour trajectory for the autonomous mobile robot; instead, the robot is moved directly to the side of the road and stopped, so that driving space is cleared for the passing obstacle within a short time and the risk of collision is reduced. This is particularly advantageous when the oncoming moving object has not observed or perceived the autonomous mobile robot in time.
- The distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is smaller than its distance from the edge of the road when driving or stopping in the navigation mode.
- Since the autonomous mobile robot ultimately remains stationary in the yield mode, it can be parked closer to the edge of the road, which leaves more space for other moving objects to pass and reduces the risk of collision.
- The decision whether to switch from the navigation mode to the yield mode is initiated in step S2 only when the obstacle involves a transport vehicle.
- Step S2 further includes:
- if the obstacle involves a transport vehicle, checking whether the transport vehicle is a manually guided transport vehicle or another autonomous transport vehicle;
- Step S2 further includes:
- if the transport vehicle is another autonomous transport vehicle, obtaining priority information of the other autonomous transport vehicle relative to the autonomous mobile robot;
- deciding, based on the priority information, whether to switch from the navigation mode to the yield mode, wherein the autonomous mobile robot is controlled to switch from the navigation mode to the yield mode when the priority information indicates that its priority is lower than that of the other autonomous transport vehicle.
- By setting priorities, right-of-way differences in traffic conflict areas can advantageously be resolved and the smoothness of mixed traffic flows can be ensured.
- Obtaining the priority information of the other autonomous transport vehicle relative to the autonomous mobile robot includes: receiving via a communication network and/or detecting by means of sensors the model of the other autonomous transport vehicle, the urgency of its order processing, the likelihood that it has detected the autonomous mobile robot, and/or its scheduling order relative to the autonomous mobile robot.
- In this way, the priority can be defined in a more reasonable and reliable manner, so as to reduce the number of conflicts in a mixed-traffic environment.
- Step S2 includes:
- influencing at least one parameter configuration of the yield mode on the basis of the basic information about the obstacle, wherein different parameter configurations are assigned to the yield mode.
- In this way, the yield mode can be customized to the characteristics of the obstacle, so as to further improve the success rate and safety of obstacle avoidance.
- Compared with the case in which the obstacle involves another autonomous transport vehicle, when the obstacle involves a manually guided transport vehicle the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is set smaller, and/or the autonomous mobile robot is controlled to pull over toward the side of the road at a higher speed and/or with a shorter response time.
- Before step S2, the method further includes: obtaining position information of the autonomous mobile robot, checking on the basis of this position information whether the robot can detour around the obstacle, and initiating the switch to the yield mode only if this detourability does not satisfy a preset condition.
- The yield mode improves the smoothness of traffic but still sacrifices some time efficiency. Therefore, provided that an alternative route can be planned in advance on the basis of the spatial environment, it is advantageous to let the robot complete a detour without stopping.
- Step S1 further includes:
- detecting the transport vehicle in the image by means of image recognition technology based on an artificial neural network.
- The contour of a transport vehicle can be recognized particularly well using image processing algorithms, which provides a very reliable data basis for the subsequent mode-switching decision.
- Step S1 further includes:
- making a preliminary determination as to whether the obstacle is a transport vehicle, wherein the result of the preliminary determination is at least partially taken into account when the transport vehicle is detected in the image captured by the camera.
- Checking whether the obstacle involves a manually guided transport vehicle includes:
- sending, via a communication network, an inquiry to the detected transport vehicle or to the scheduling system as to whether the detected transport vehicle is currently being manually guided.
- An apparatus for controlling the driving of an autonomous mobile robot, in particular an autonomous transport vehicle, is provided; the apparatus is used to perform the method according to the first aspect of the present invention and comprises:
- a detection module configured to detect obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven while the autonomous mobile robot drives automatically along the preset trajectory in the navigation mode; and
- a control module configured to control the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot is controlled to pull over toward the side of the road and stop, so as to make room for the obstacle.
- A scheduling method for autonomous mobile robots is provided, comprising the following steps: detecting an event in which two autonomous mobile robots meet while operating in the navigation mode, and, in response, issuing a dispatch instruction to at least one of them.
- A scheduling system for autonomous mobile robots, in particular autonomous transport vehicles, is provided; the scheduling system is used to execute the scheduling method according to the third aspect of the present invention and comprises:
- an event detection unit configured to detect an event in which two autonomous mobile robots meet while operating in the navigation mode; and
- an instruction issuing unit configured to issue a dispatch instruction to at least one of the two autonomous mobile robots to prompt one of them to switch from the navigation mode to the yield mode, in which that autonomous mobile robot is required to pull over toward the side of the road and stop, so as to make room for the other of the two autonomous mobile robots.
- A computer program product is provided, comprising a computer program which, when executed by a computer, implements the method according to the first aspect of the present invention or the scheduling method according to the third aspect of the present invention.
- Fig. 1 shows a flowchart of a method for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention;
- Fig. 2 shows a flowchart of one method step of the method shown in Fig. 1;
- Fig. 3 shows a flowchart of another method step of the method shown in Fig. 1;
- Fig. 4 shows a block diagram of an apparatus for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention;
- Fig. 5 shows a flowchart of a scheduling method for autonomous mobile robots according to an exemplary embodiment of the present invention;
- Fig. 6 shows a warehouse according to an exemplary embodiment of the present invention, which warehouse includes a scheduling system according to an exemplary embodiment of the present invention;
- Fig. 7 shows a schematic diagram of detecting obstacles by means of the method according to the present invention in an exemplary scenario;
- Fig. 8 shows a schematic diagram of switching an autonomous mobile robot to the yield mode by means of the method according to the present invention in an exemplary scenario; and
- Fig. 9 shows a schematic diagram of applying the method for controlling the driving of an autonomous mobile robot according to the present invention in another exemplary scenario.
- Fig. 1 shows a flowchart of a method for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention.
- In step S1, while the autonomous mobile robot drives automatically along a preset trajectory in the navigation mode, obstacles are detected in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven.
- An autonomous mobile robot is understood here, for example, as an intelligent mobile device integrating environmental perception, dynamic decision-making, path planning, behavior control and other functions; it can in particular be configured as an autonomous transport vehicle (for example a forklift AGV), which can carry out material handling tasks automatically without human intervention.
- The navigation mode is understood, for example, as the automatic guidance mode of the autonomous mobile robot along a preset trajectory; this trajectory can be planned by the robot's navigation and positioning unit from suitable start and destination information, or it can be calculated by the dispatch center and provided directly to the autonomous mobile robot.
- "Detecting" obstacles covers not only detecting obstacles in the surroundings by means of the robot's own sensing equipment or sensing equipment in the warehouse, but also, for example, receiving or predicting in real time, through communication with at least one server, information about potential obstacles located on the robot's planned navigation route.
- In step S2, the autonomous mobile robot is controlled, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, in which the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
- "Selective switching" means that the mode switch is not initiated merely because an obstacle is present; rather, at least one characteristic of the obstacle is in particular also taken into account in deciding whether such a switch should be, or needs to be, carried out.
- The "yield mode" can also be referred to synonymously as the "pull-over mode"; in this mode, for example, the shortest path from the robot's current position to the edge of the road, in particular to a defined distance from the edge of the road, can first be planned.
- In the simplest case, the shortest path of the autonomous mobile robot to the right edge of the road can be planned.
- As an example, the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is smaller than the safety distance required during normal driving in the navigation mode.
- For example, the prescribed safety distance during normal driving of the autonomous mobile robot is 10 cm, whereas the distance from the edge of the road when stopping in the yield mode can be, for example, 5 cm.
- Controlling the autonomous mobile robot to pull over toward the side of the road and stop can include, for example, having the autonomous mobile robot decelerate while moving along an avoidance trajectory to the side of the road and come to a complete stop when it reaches the road edge.
- The present invention does not restrict this, which means that the specific motion parameter settings in the yield mode can be chosen according to the specific scenario (for example the desired response speed, energy consumption, safety factors, impact on the goods, etc.).
- Fig. 2 shows a flowchart of one method step of the method shown in Fig. 1.
- Method step S1 of Fig. 1 exemplarily includes steps S11-S16.
- In step S11, environment data about the robot's surroundings are acquired; such environment data are provided in real time, for example, by environment sensing devices of the autonomous mobile robot or of the warehouse in which the autonomous mobile robot is located.
- In step S12, it is checked on the basis of the acquired environment data whether an obstacle is present in the surroundings of the autonomous mobile robot and/or in the preset trajectory it is about to travel.
- For example, a two-dimensional and/or three-dimensional contour map of the robot's surroundings can be generated from the scanned point cloud data; this contour map can then be compared with a pre-stored warehouse map, and an obstacle is judged to be present if the difference in a particular region exceeds a threshold.
- If it is determined in step S12 that no obstacle is currently present, the method can, for example, jump from step S12 back to step S11 and continuously acquire environment data and repeat this check.
- If an obstacle is found, it can be determined, using the acquired environment data, whether the autonomous mobile robot is able to detour around the obstacle.
- For this purpose, the position of the autonomous mobile robot can first be determined by combining the warehouse map data and the environment perception data, and it is then checked on the basis of this position whether the autonomous mobile robot is currently in an open area of the warehouse or in an aisle. As an example, being in an open area means that a detour is possible, whereas being in an aisle means that no detour is possible.
- If a detour is possible, a detour route is planned for the autonomous mobile robot.
- In step S15, basic information about the obstacle is acquired with at least one sensor of a type other than a camera, and on the basis of this information it is preliminarily predicted whether the obstacle involves a transport vehicle.
- This other type of sensor can be, for example, one or more of a lidar sensor, radar sensor, ultrasonic sensor or infrared sensor belonging to the autonomous mobile robot itself and/or installed in the surroundings.
- The basic information includes, for example, identification information, volume information, motion information and category information of the obstacle.
- As an example, the obstacle can be preliminarily judged to involve a transport vehicle if one or more of the following are recognized in step S15:
- the speed pattern and/or motion trajectory of the identified obstacle matches the motion characteristics of a transport vehicle.
- In step S16, an image of the surroundings of the autonomous mobile robot is acquired, and the transport vehicle is identified in the image by means of image recognition technology based on an artificial neural network.
- The preliminary determination obtained in step S15 is at least partially taken into account during image recognition.
- As an example, the obstacle is confirmed as a transport vehicle in step S16 only if both the preliminary determination and the image-based check indicate that the obstacle involves a transport vehicle. It is equally possible to confirm the obstacle as a transport vehicle in step S16 as soon as either the preliminary check or the image-based check indicates that it involves a transport vehicle.
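As a simple illustration of the two confirmation policies just mentioned, the snippet below combines the sensor-based pre-check of step S15 with the image-based check of step S16 either strictly (both must agree) or permissively (either suffices). It is a sketch under the assumption that both checks return booleans, not an excerpt of an actual implementation.

```python
def confirm_transport_vehicle(preliminary_hit: bool,
                              image_hit: bool,
                              policy: str = "and") -> bool:
    """Fuse the sensor-based pre-check (step S15) with the image-based check (step S16)."""
    if policy == "and":      # strict: both checks must agree -> fewer false positives
        return preliminary_hit and image_hit
    if policy == "or":       # permissive: either check suffices -> fewer missed vehicles
        return preliminary_hit or image_hit
    raise ValueError(f"unknown fusion policy: {policy}")

# Example: the lidar pre-check fired, but the neural-network detector did not
print(confirm_transport_vehicle(True, False, policy="and"))  # False
print(confirm_transport_vehicle(True, False, policy="or"))   # True
```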
- The acquired image is captured, for example, by a camera of the autonomous mobile robot and/or a camera installed in the surroundings of the autonomous mobile robot.
- If the image data source is a surveillance camera in the warehouse, the current position of the autonomous mobile robot can, for example, be uploaded to the warehouse server via the communication interface; the warehouse server then, based on the reported position, retrieves at least one surveillance camera near the autonomous mobile robot and provides its real-time images for obstacle identification.
- Fig. 3 shows a flowchart of another method step of the method shown in Fig. 1.
- Method step S2 of Fig. 1 exemplarily includes steps S21-S28.
- For ease of explanation, method step S16 of Fig. 2 is additionally shown here.
- In step S16, a category check of the obstacle has already been carried out, for example by means of image recognition technology.
- In step S21, it is judged from the result of the category check whether the detected obstacle involves a transport vehicle.
- If not, another type of obstacle avoidance operation is performed in step S22, using the basic information about the obstacle.
- If the obstacle involves a transport vehicle, preparation for the mode-switching decision is initiated in step S23.
- In step S24, once it has been determined that the obstacle is a transport vehicle, it can, for example, be further checked whether the transport vehicle is a manually guided transport vehicle or another autonomous mobile robot (for example an autonomous transport vehicle).
- The corresponding check results can be considered individually or in combination, and a conclusion can be drawn from them as to whether the detected transport vehicle is currently being manually guided.
- If it is determined in step S24 that the detected transport vehicle is manually guided by a person, at least one parameter configuration for the yield mode to be carried out can be determined in step S27 on the basis of this detection result, and in step S28 the autonomous mobile robot switches to the yield mode with this parameter configuration.
- The parameterization can include, for example, the distance of the autonomous mobile robot from the edge of the road when it stops in the yield mode.
- If it is determined in step S24 that the transport vehicle involves another autonomous transport vehicle, then in step S25 the priority information of the other autonomous transport vehicle relative to the autonomous mobile robot is obtained, and from this it is judged whether the priority of the autonomous mobile robot is lower than that of the other autonomous transport vehicle.
- For this purpose, for example, the model of the other autonomous transport vehicle, the urgency of its order processing, the likelihood that it has detected the autonomous mobile robot, and/or its scheduling order relative to the autonomous mobile robot can be received via a communication network and/or detected by sensors.
- If the autonomous mobile robot has the higher priority, it is not pulled over in step S26; instead it is, for example, kept in the navigation mode and allowed to complete its passage first by means of suitable speed and trajectory control. Additionally or alternatively, it is also conceivable to transmit an instruction to the other autonomous transport vehicle, through direct interaction with it and/or through communication with the dispatch system, to prompt it to perform a yielding maneuver.
- If it is found in step S25 that the autonomous mobile robot holds the lower priority, the yield mode is, for example, parameterized in step S27, and in step S28 the autonomous mobile robot is controlled to switch from the navigation mode into the yield mode.
- Compared with the case in which the obstacle involves another autonomous transport vehicle, when the obstacle involves a manually guided transport vehicle the autonomous mobile robot can be set to stop at a smaller distance from the edge of the road in the yield mode, and/or can be controlled to pull over toward the side of the road at a higher speed and/or with a shorter response time.
- Fig. 4 shows a block diagram of an apparatus for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention.
- The device 1 comprises a detection module 10 and a control module 20 which are communicatively connected to each other.
- The device 1 can, as required, be arranged directly on the autonomous mobile robot, in the environment of the autonomous mobile robot (for example a warehouse) or on a server located in the cloud.
- The detection module 10 is used, for example, to detect obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven while the autonomous mobile robot drives automatically along the preset trajectory in the navigation mode.
- The detection module 10 includes or is configured as sensors such as lidar sensors, RGBD cameras, monocular cameras, stereo vision cameras, infrared sensors, ultrasonic sensors, inertial sensors, GPS sensors and radio-frequency receivers.
- The detection module 10 can also be designed as a communication interface and can communicate, for example, with at least one autonomous mobile robot and/or a control system in order to receive the required environment data from a corresponding environment perception unit.
- The control module 20 is used, for example, to control the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode.
- The control module 20 is also used, for example, to control the autonomous mobile robot in the yield mode to pull over toward the side of the road and stop, so as to make room for the obstacle.
- The control module 20 can, for example, include or be configured as a processor, so as to perform computation and processing with respect to at least one condition, and can issue control commands to the motion mechanism of the autonomous mobile robot when a specific condition is met, in order, for example, to intervene in the lateral and/or longitudinal guidance of the autonomous mobile robot.
- Fig. 5 shows a flowchart of a scheduling method for autonomous mobile robots according to an exemplary embodiment of the present invention.
- In step 510, an event in which two autonomous mobile robots meet while operating in the navigation mode is detected.
- In step 520, in response to detecting the event, a dispatch instruction is issued to at least one of the two autonomous mobile robots to cause one of the two autonomous mobile robots to switch from the navigation mode to the yield mode.
- In the yield mode, that autonomous mobile robot is made to pull over toward the side of the road and stop, so as to make room for the other autonomous mobile robot.
- The recipient, content and/or sending order of the dispatch instructions can be determined, for example, on the basis of the priority information of the two autonomous mobile robots.
- As an example, a dispatch instruction to pull over and make an emergency stop is sent to the autonomous mobile robot with the lower priority, and a dispatch instruction to keep driving in the current navigation mode is sent to the autonomous mobile robot with the higher priority.
- Alternatively, a dispatch instruction concerning the mode switch may be sent to only one of the two autonomous mobile robots.
- Fig. 6 shows a warehouse 60 according to an exemplary embodiment of the present invention.
- The warehouse 60 includes a scheduling system 600 according to an exemplary embodiment of the present invention, and the scheduling system 600 includes an event detection unit 610 and an instruction issuing unit 620 which are communicatively connected to each other.
- The warehouse 60 further includes, for example, surveillance cameras 601 for detecting obstacles in the various areas of the warehouse 60; these surveillance cameras 601 are arranged in a distributed manner at multiple locations in the warehouse 60. Furthermore, the warehouse 60 also includes, for example, a communication interface 602 for communication via a network.
- Two autonomous mobile robots 61 and 62 move in the warehouse 60, and the event detection unit 610 receives data from the individual surveillance cameras 601 via the communication interface 602 in order to analyze and process the images of the various locations of the warehouse 60 in real time.
- If it is recognized on the basis of the surveillance cameras 601 that the two autonomous mobile robots 61, 62 are approaching each other head-on in a narrow area and are about to meet, the event detection unit 610 determines that a meeting event of the two autonomous mobile robots 61, 62 has occurred.
- The instruction issuing unit 620 then generates a dispatch instruction for switching from the navigation mode to the yield mode and sends this dispatch instruction via the communication interface 602 to at least one of the two autonomous mobile robots 61, 62 (for example the one with the lower priority) in order to prompt one of the two autonomous mobile robots to switch from the navigation mode to the yield mode.
- FIG. 7 shows a schematic diagram of the detection of obstacles by means of the method according to the invention in an exemplary scenario.
- An autonomous mobile robot 100 in the form of a forklift AGV is being guided automatically in the navigation mode along a pre-planned trajectory.
- A manual forklift 200, which is manually guided by a driver 201 via an operating screen 202, is approaching the forklift AGV 100 head-on at a certain speed.
- The forklift AGV 100 and the oncoming manual forklift 200 are currently located in an aisle of the logistics warehouse, in which the traversable space is limited, for example, by shelves 71 and 72 on the left and right, respectively.
- The forklift AGV 100 is, for example, significantly smaller than the manual forklift 200 in terms of its volume (especially its height).
- The forklift AGV 100 not only scans the road environment ahead with a lidar sensor 101 having a defined field of view, but also obtains information about obstacles in the surroundings via the communication interface 102 and the camera 103.
- The manual forklift 200, by contrast, relies mainly on the driver 201 in the cab observing potential obstacles nearby with the naked eye.
- In this situation it is very likely that the forklift AGV 100 has detected the oncoming manual forklift 200 in time by means of multi-sensor fusion, while the driver 201 of the manual forklift 200, for example because of the cab height and viewing angle (blind spots), has not immediately noticed the forklift AGV 100 driving close to the ground.
- Fig. 8 shows a schematic diagram of switching an autonomous mobile robot to the yield mode by means of the method according to the invention in an exemplary scenario.
- The forklift AGV 100 has recognized, by means of the corresponding image recognition or fusion detection technology, that the obstacle ahead involves the manual forklift 200. The forklift AGV 100 therefore immediately performs corresponding environment perception in order to determine the shape of the surrounding aisle. Based on this aisle shape, the forklift AGV 100 is offset from its initial position in the aisle in the direction indicated by arrow 701 (here, for example, to the right), for example by controlling its lateral guidance, until the distance between the forklift AGV 100 and the right road boundary 72 reaches the minimum safety distance. As soon as this minimum safety distance is reached, the forklift AGV 100 is controlled to stop there and remain stationary.
- Since the forklift AGV 100 has carried out this pull-over stopping behavior by means of the method according to the present invention, the manual forklift 200 has, for example, sufficient passage space and can, even in the most unfavorable case (not having noticed the forklift AGV immediately), complete safe passage along trajectory 702 on the left side of the aisle without significantly changing its lateral guidance.
- Fig. 9 shows a schematic diagram of applying the method for controlling the driving of an autonomous mobile robot according to the present invention in another exemplary scenario.
- The scenario shown here differs from Figs. 7-8 in that the oncoming obstacle does not involve a manual forklift but another automatically guided forklift AGV 300.
- The forklift AGV 100 already knows, through corresponding environment perception, that the likewise autonomously driving forklift AGV 300 is ahead of it.
- The forklift AGV 100 has, for example, also exchanged information 110 with the other forklift AGV 300 via the communication interface 102 and has determined on this basis that it has a lower priority than the oncoming forklift AGV 300.
- The forklift AGV 100 is therefore likewise controlled to switch from the navigation mode into the yield mode; compared with the scenario shown in Fig. 8, however, the minimum safety distance from the right road boundary 72 when pulling over can here be set somewhat larger, so that the forklift AGV 100 can return more quickly from the yield mode to the navigation mode after the avoidance maneuver has ended.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Multimedia (AREA)
- Acoustics & Sound (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A method and a device for controlling the driving of an autonomous mobile robot, relating to the field of intelligent logistics. The autonomous mobile robot switches at least between a navigation mode and a yield mode. The method comprises the following steps: S1: while the autonomous mobile robot drives automatically along a preset trajectory in the navigation mode, detecting obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven; S2: controlling the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
Description
The present invention relates to a method for controlling the driving of an autonomous mobile robot, a device for controlling the driving of an autonomous mobile robot, a scheduling method, a scheduling system and a computer program product.
With the rise of e-commerce, modern factories and similar fields, intelligent warehousing systems are increasingly used for picking, transporting and storing goods. In intelligent warehousing and logistics, in order to relieve human pickers and increase picking efficiency, the picking and transport of materials is generally accomplished through collaboration between autonomous mobile robots and humans.
While driving, autonomous mobile robots usually encounter different types of obstacles, and the motion states of these obstacles also differ. In currently known obstacle avoidance strategies, the robot generally stops in place or detours around the obstacle.
However, these known obstacle avoidance approaches have many limitations. In particular, if two AGVs (Automated Guided Vehicles) carrying out transport tasks meet in an aisle, both may stop and wait or may detour in the same direction, which blocks the aisle completely. Furthermore, if an AGV meets a manually guided transport vehicle and the right of way is unclear, the mixed operation of AGVs and manually guided transport vehicles can lead to problems. In that case, the human driver usually has to back the transport vehicle out of the AGV's route manually, which greatly reduces labor efficiency.
Against this background, an improved obstacle avoidance strategy is desired that resolves conflicts between humans and vehicles, or among multiple vehicles, in a more reasonable manner and increases the success rate of obstacle avoidance.
Summary of the invention
The object of the present invention is to provide a method for controlling the driving of an autonomous mobile robot, a device for controlling the driving of an autonomous mobile robot, a scheduling method, a scheduling system and a computer program product, in order to solve at least some of the problems in the prior art.
According to a first aspect of the present invention, a method for controlling the driving of an autonomous mobile robot, in particular an autonomous transport vehicle, is provided, wherein the autonomous mobile robot switches at least between a navigation mode and a yield mode, the method comprising the following steps:
S1: while the autonomous mobile robot drives automatically along a preset trajectory in the navigation mode, detecting obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven; and
S2: controlling the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
The present invention includes in particular the following technical idea: by letting the robot selectively switch from the navigation mode to the pull-over yield mode, it can react adaptively to detected obstacles. This avoids blocked aisles caused by several mobile devices simultaneously planning avoidance maneuvers or stopping in place, and greatly increases the success rate of obstacle avoidance. Moreover, by actively giving up passage space in this way, orderly traffic in narrow spaces is ensured to a certain extent and the intelligence level of the autonomous mobile robot is increased.
Optionally, step S2 includes:
in the yield mode, planning for the autonomous mobile robot the shortest path from its current position to the edge of the road, in particular to a defined distance from the edge of the road; and
guiding the autonomous mobile robot along the shortest path to the edge of the road, in particular to the defined distance from the edge of the road, and stopping it there.
This achieves the following technical advantage: it is not necessary to plan a complete detour trajectory for the autonomous mobile robot; instead, the robot is moved directly to the side of the road and stopped, so that driving space is cleared for the passing obstacle within a short time and the risk of collision is reduced. This is particularly advantageous when the oncoming moving object has not observed or perceived the autonomous mobile robot in time.
Optionally, the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is smaller than its distance from the edge of the road when driving or stopping in the navigation mode.
This achieves the following technical advantage: since the autonomous mobile robot ultimately remains stationary in the yield mode, it can be parked closer to the edge of the road, thereby leaving more passage space for other moving objects and reducing the risk of collision.
Optionally, the decision whether to switch from the navigation mode to the yield mode is initiated in step S2 only when the obstacle involves a transport vehicle.
This achieves the following technical advantage: frequently forcing the robot to pull over and stop could interfere with its normal work efficiency. By filtering on the obstacle type in this way, the robot evaluates this mode switch only when a specific mixed-traffic scenario may be involved, which not only avoids unnecessary stops and starts of the robot but also reduces the computational overhead of making this decision.
Optionally, step S2 further includes:
if the obstacle involves a transport vehicle, checking whether the transport vehicle is a manually guided transport vehicle or another autonomous transport vehicle; and
deciding, based on the result of the check, whether to switch from the navigation mode to the yield mode, wherein in the case of a manually guided transport vehicle the autonomous mobile robot is switched directly from the navigation mode to the yield mode.
This achieves the following technical advantage: in scenarios where multiple driving behaviors coexist, the lack of a uniform standard makes decision-making difficult for automated guidance. In such mixed-traffic scenarios, having the autonomous mobile robot back off appropriately and thus switch to the more conservative yield mode greatly increases safety.
Optionally, step S2 further includes:
if the transport vehicle is another autonomous transport vehicle, obtaining priority information of the other autonomous transport vehicle relative to the autonomous mobile robot; and
deciding, based on the priority information, whether to switch from the navigation mode to the yield mode, wherein the autonomous mobile robot is controlled to switch from the navigation mode to the yield mode when the priority information indicates that the priority of the autonomous mobile robot is lower than that of the other autonomous transport vehicle.
This achieves the following technical advantage: by setting priorities, right-of-way differences in traffic conflict areas can be resolved advantageously and the smoothness of mixed traffic flows can be ensured.
Optionally, obtaining the priority information of the other autonomous transport vehicle relative to the autonomous mobile robot includes:
receiving via a communication network and/or detecting by means of sensors the model of the other autonomous transport vehicle, the urgency of its order processing, the likelihood that the other autonomous transport vehicle has detected the autonomous mobile robot, and/or the scheduling order of the other autonomous transport vehicle relative to the autonomous mobile robot.
This achieves the following technical advantage: priorities can be defined in a more reasonable and reliable manner, reducing the number of conflicts in a mixed-traffic environment.
Optionally, step S2 includes:
once it has been confirmed that the autonomous mobile robot is to switch from the navigation mode to the yield mode, obtaining basic information about the obstacle; and
influencing at least one parameter configuration of the yield mode on the basis of the basic information, wherein different parameter configurations are assigned to the yield mode in particular for the cases in which the detected obstacle involves a manually guided transport vehicle and another autonomous transport vehicle.
This achieves the following technical advantage: the yield mode can be customized to the characteristics of the obstacle, further improving the success rate and safety of obstacle avoidance.
Optionally, compared with the case in which the obstacle involves another autonomous transport vehicle, when the obstacle involves a manually guided transport vehicle the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is set smaller, and/or the autonomous mobile robot is controlled to pull over toward the side of the road at a higher speed and/or with a shorter response time in the yield mode.
This achieves the following technical advantage: because of the particular height and viewing angle of a manned cab, human drivers generally have difficulty noticing an autonomous mobile robot driving close to the ground. Having the autonomous mobile robot leave more passage space for the manually guided transport vehicle and complete the evasion more quickly improves traffic safety in mixed environments.
Optionally, before step S2 the method further includes the following steps:
additionally obtaining position information of the autonomous mobile robot on the basis of map data and sensor data;
checking, based on the position information, whether the autonomous mobile robot can detour around the obstacle; and
initiating the switch to the yield mode in step S2 only if this detourability does not satisfy a preset condition.
This achieves the following technical advantage: as a conservative obstacle avoidance strategy particularly suited to narrow spaces, the yield mode improves traffic smoothness but still sacrifices some time efficiency. Therefore, provided that an alternative route can be planned in advance on the basis of the spatial environment, it is advantageous to let the robot complete a detour without stopping.
Optionally, step S1 further includes:
acquiring an image of the surroundings of the autonomous mobile robot, the image being captured by a camera of the autonomous mobile robot and/or a camera installed in the surroundings of the autonomous mobile robot; and
identifying a transport vehicle in the image by means of image recognition technology based on an artificial neural network.
This achieves the following technical advantage: image processing algorithms can recognize the contour of a transport vehicle particularly well, which provides a very reliable data basis for the subsequent mode-switching decision.
Optionally, step S1 further includes:
acquiring basic information about the obstacle by means of at least one sensor of a type other than a camera, belonging to the autonomous mobile robot and/or installed in the surroundings; and
making a preliminary determination, based on the basic information, as to whether the obstacle involves a transport vehicle, wherein the result of the preliminary determination is at least partially taken into account when identifying the transport vehicle in the image captured by the camera.
This achieves in particular the following technical advantage: multi-sensor fusion detection improves the accuracy of obstacle recognition, which is particularly effective in reducing false positives and accordingly avoids unnecessary mode switches.
Optionally, checking whether the obstacle involves a manually guided transport vehicle includes:
checking whether one or more persons are located in the interior of the detected transport vehicle;
checking whether the detected transport vehicle has an autonomous driving function; and/or
sending, via a communication network, an inquiry to the detected transport vehicle or to the scheduling system as to whether the detected transport vehicle is currently being manually guided.
This achieves the following technical advantage: manually guided transport vehicles can be identified effectively, which makes the mode switching more reasonable.
According to a second aspect of the present invention, a device for controlling the driving of an autonomous mobile robot, in particular an autonomous transport vehicle, is provided, the device being designed to perform the method according to the first aspect of the present invention and comprising:
a detection module configured to detect obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven while the autonomous mobile robot drives automatically along the preset trajectory in the navigation mode; and
a control module configured to control the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
According to a third aspect of the present invention, a scheduling method for autonomous mobile robots, in particular autonomous transport vehicles, is provided, the scheduling method comprising the following steps:
detecting an event in which two autonomous mobile robots meet while operating in the navigation mode; and
in response to detecting the event, issuing a dispatch instruction to at least one of the two autonomous mobile robots to prompt one of the two autonomous mobile robots to switch from the navigation mode to the yield mode, in which that robot is required to pull over toward one side of the road and stop, so as to make room for the other of the two autonomous mobile robots.
According to a fourth aspect of the present invention, a scheduling system for autonomous mobile robots, in particular autonomous transport vehicles, is provided, the scheduling system being designed to execute the scheduling method according to the third aspect of the present invention and comprising:
an event detection unit configured to detect an event in which two autonomous mobile robots meet while operating in the navigation mode; and
an instruction issuing unit configured to issue a dispatch instruction to at least one of the two autonomous mobile robots to prompt one of the two autonomous mobile robots to switch from the navigation mode to the yield mode, in which that robot is required to pull over toward one side of the road and stop, so as to make room for the other of the two autonomous mobile robots.
According to a fifth aspect of the present invention, a computer program product is provided, comprising a computer program which, when executed by a computer, implements the method according to the first aspect of the present invention or the scheduling method according to the third aspect of the present invention.
The principles, features and advantages of the present invention can be better understood from the following detailed description with reference to the accompanying drawings, in which:
Fig. 1 shows a flowchart of a method for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention;
Fig. 2 shows a flowchart of one method step of the method shown in Fig. 1;
Fig. 3 shows a flowchart of another method step of the method shown in Fig. 1;
Fig. 4 shows a block diagram of a device for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention;
Fig. 5 shows a flowchart of a scheduling method for autonomous mobile robots according to an exemplary embodiment of the present invention;
Fig. 6 shows a warehouse according to an exemplary embodiment of the present invention, the warehouse including a scheduling system according to an exemplary embodiment of the present invention;
Fig. 7 shows a schematic diagram of detecting an obstacle by means of the method according to the present invention in an exemplary scenario;
Fig. 8 shows a schematic diagram of switching an autonomous mobile robot to the yield mode by means of the method according to the present invention in an exemplary scenario; and
Fig. 9 shows a schematic diagram of applying the method for controlling the driving of an autonomous mobile robot according to the present invention in another exemplary scenario.
To make the technical problems to be solved, the technical solutions and the beneficial technical effects of the present invention clearer, the present invention is described in further detail below with reference to the drawings and several exemplary embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit its scope of protection.
Fig. 1 shows a flowchart of a method for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention.
In step S1, while the autonomous mobile robot drives automatically along a preset trajectory in the navigation mode, obstacles are detected in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven.
In the sense of the present invention, an autonomous mobile robot is understood, for example, as an intelligent mobile device integrating environmental perception, dynamic decision-making, path planning, behavior control and other functions; in automated warehousing it can in particular be configured as an autonomous transport vehicle (for example a forklift AGV), which can carry out material handling tasks automatically without human intervention.
Here, the navigation mode is understood, for example, as the automatic guidance mode of the autonomous mobile robot along a preset trajectory. The preset trajectory can, for example, be planned by the navigation and positioning unit of the autonomous mobile robot from suitable start and destination information, or it can be calculated by a dispatch center and provided directly to the autonomous mobile robot.
In this step, "detecting" obstacles not only includes detecting obstacles in the surroundings by means of sensing devices of the robot itself or installed in the warehouse, but also includes, for example, receiving or predicting, in real time through communication with at least one server, information about potential obstacles located on the robot's planned navigation route.
In step S2, the autonomous mobile robot is controlled, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, in which the autonomous mobile robot is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
In the sense of the present invention, "selectively switching" means that the mode switch is not initiated merely because an obstacle is present; rather, at least one characteristic of the obstacle is in particular also taken into account in deciding whether such a switch should be, or needs to be, carried out.
Here, the "yield mode" can also be referred to synonymously as the "pull-over mode". In this mode, for example, the shortest path from the current position of the autonomous mobile robot to the edge of the road, in particular to a defined distance from the edge of the road, can first be planned. In the simplest case, the shortest path to the right edge of the road can be planned. It is, however, also conceivable to select the left or the right side as the target side depending on the robot's distance to each side of the road and/or its position relative to the obstacle. The autonomous mobile robot can then be guided along this shortest path to the edge of the road, in particular to the defined distance from the edge of the road, and stopped there.
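One conceivable way to choose the target side and the stopping point, as just described, is sketched below. The aisle representation and the 5 cm clearance are illustrative assumptions, not values prescribed by the invention.

```python
def plan_pullover_target(robot_y: float,
                         left_edge_y: float,
                         right_edge_y: float,
                         stop_clearance: float = 0.05) -> float:
    """Return the lateral target coordinate for the pull-over maneuver.

    The side whose edge is closer to the robot is chosen as the target side,
    and the target keeps a defined clearance (e.g. 5 cm) from that edge.
    """
    dist_left = abs(robot_y - left_edge_y)
    dist_right = abs(right_edge_y - robot_y)
    if dist_right <= dist_left:                 # default: pull over to the right
        return right_edge_y - stop_clearance
    return left_edge_y + stop_clearance

# Aisle from y = 0.0 m (left shelf) to y = 2.0 m (right shelf), robot at y = 1.2 m
print(plan_pullover_target(1.2, 0.0, 2.0))      # -> 1.95 (5 cm from the right edge)
```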
As an example, the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode is smaller than the safety distance required during normal driving in the navigation mode. For example, the prescribed safety distance during normal driving of the autonomous mobile robot is 10 cm, while the distance from the edge of the road when stopping in the yield mode can be, for example, 5 cm.
It should be noted that "controlling the autonomous mobile robot to pull over toward one side of the road and stop" can include, for example, having the autonomous mobile robot decelerate while moving along an avoidance trajectory to the side of the road and come to a complete stop when it reaches the vicinity of the road edge. It is, however, also possible to move the autonomous mobile robot at a defined (in particular higher) speed toward the road edge and to perform an emergency stop only upon reaching the edge. The present invention does not restrict this, which means that the specific motion parameter settings in the yield mode can be chosen according to the specific scenario (for example the desired response speed, energy consumption, safety factors, impact on the goods, etc.).
Fig. 2 shows a flowchart of one method step of the method shown in Fig. 1. As shown in Fig. 2, method step S1 of Fig. 1 exemplarily includes steps S11-S16.
In step S11, environment data about the robot's surroundings are acquired; such environment data are provided in real time, for example, by environment sensing devices of the autonomous mobile robot or of the warehouse in which the autonomous mobile robot is located.
In step S12, it is checked on the basis of the acquired environment data whether an obstacle is present in the surroundings of the autonomous mobile robot and/or in the preset trajectory it is about to travel. As an example, when performing obstacle ranging with lidar data, a two-dimensional and/or three-dimensional contour map of the robot's surroundings can be generated from the scanned point cloud data; this contour map can then, for example, be compared with a pre-stored warehouse map, and an obstacle is judged to be present if the difference in a particular region exceeds a threshold. It is, however, also possible to verify the presence of obstacles with other obstacle recognition algorithms; the present invention does not restrict this.
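The map-comparison check described for step S12 could, for instance, look like the following sketch, in which both the scanned contour map and the stored warehouse map are simplified to occupancy grids. The grid representation, the inspected region and the threshold value are assumptions made only for illustration.

```python
import numpy as np

def obstacle_present(scan_grid: np.ndarray,
                     warehouse_map: np.ndarray,
                     region: tuple[slice, slice],
                     threshold: float = 0.3) -> bool:
    """Compare the scanned occupancy grid with the stored warehouse map.

    Both grids hold occupancy values in [0, 1]; an obstacle is reported when
    the mean absolute difference inside the inspected region exceeds the threshold.
    """
    diff = np.abs(scan_grid[region] - warehouse_map[region])
    return float(diff.mean()) > threshold

# Toy 4x4 grids: the scan shows occupied cells that the static map does not contain
stored = np.zeros((4, 4))
scan = stored.copy()
scan[1:3, 1:3] = 1.0
ahead = (slice(1, 3), slice(1, 3))   # region on the trajectory ahead of the robot
print(obstacle_present(scan, stored, ahead))  # True
```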
If it is determined in step S12 that no obstacle is currently present, the method can, for example, jump from step S12 back to step S11 and continuously acquire environment data and repeat this check.
If an obstacle is found in step S12, it can be determined, for example in step S13, using the environment data already acquired, whether the autonomous mobile robot is able to detour around this obstacle. Here, for example, the position of the autonomous mobile robot can first be determined by combining the warehouse map data and the environment perception data, and it is then checked on the basis of this position whether the autonomous mobile robot is currently in an open area of the warehouse or in an aisle. As an example, being in an open area means, for example, that a detour is possible, whereas being in an aisle means, for example, that no detour is possible.
If it is determined from the position information that the autonomous mobile robot can detour around the obstacle it is currently facing, a detour route is planned for the autonomous mobile robot, for example in step S14.
If it is determined in step S13 that the detourability does not satisfy the preset condition, then in the following step S15 basic information about the obstacle is first acquired with at least one sensor of a type other than a camera, and on the basis of this information it is preliminarily predicted whether the obstacle involves a transport vehicle. This other type of sensor can be, for example, one or more of a lidar sensor, radar sensor, ultrasonic sensor or infrared sensor belonging to the autonomous mobile robot itself and/or installed in the surroundings (for example in the warehouse). The basic information includes, for example, identification information, volume information, motion information and category information of the obstacle.
As an example, the obstacle can be preliminarily judged to involve a transport vehicle if one or more of the following are recognized in step S15 (a minimal sketch of such a pre-classification follows this list):
- a special marker, to which the sensor is sensitive, carried on the obstacle's surface is recognized;
- the outline of the obstacle is recognized as matching the typical contour of a transport vehicle; and
- the speed pattern and/or motion trajectory of the obstacle is recognized as matching the motion characteristics of a transport vehicle.
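A minimal rule-based sketch of this preliminary pre-classification (step S15) is given below; the cue names and threshold values are assumed for illustration and are not specified by the patent.

```python
def looks_like_transport_vehicle(has_marker: bool,
                                 contour_match: float,
                                 speed_profile_match: float) -> bool:
    """Preliminary (step S15) guess whether the obstacle involves a transport vehicle.

    Any single cue firing is enough for the preliminary judgment; the final
    confirmation is still left to the image-based check in step S16.
    """
    CONTOUR_MIN = 0.7   # assumed similarity scores in [0, 1]
    SPEED_MIN = 0.6
    return has_marker or contour_match >= CONTOUR_MIN or speed_profile_match >= SPEED_MIN

# A reflective marker was detected on the obstacle, so it is pre-classified as a vehicle
print(looks_like_transport_vehicle(True, 0.2, 0.1))   # True
```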
Next, in step S16, an image of the surroundings of the autonomous mobile robot is acquired and a transport vehicle is identified in the image by means of image recognition technology based on an artificial neural network. In particular, the preliminary determination obtained in step S15 is at least partially taken into account during image recognition. As an example, the obstacle is confirmed as a transport vehicle in step S16 only if both the preliminary determination and the image-based check indicate that the obstacle involves a transport vehicle. It is equally possible to confirm the obstacle as a transport vehicle in step S16 as soon as either the preliminary check or the image-based check indicates that it involves a transport vehicle.
In this step, the acquired image is captured, for example, by a camera of the autonomous mobile robot and/or a camera installed in its surroundings. If the image data source is a surveillance camera in the warehouse, the current position of the autonomous mobile robot can, for example, first be uploaded to the warehouse server via the communication interface; the warehouse server then, based on the reported position, retrieves the real-time images of at least one surveillance camera near the autonomous mobile robot and provides them for obstacle identification.
Fig. 3 shows a flowchart of another method step of the method shown in Fig. 1. As shown in Fig. 3, method step S2 of Fig. 1 exemplarily includes steps S21-S28. For ease of explanation, method step S16 of Fig. 2 is additionally shown here.
In step S16, a category check of the obstacle has already been carried out, for example by means of image recognition technology.
Next, in step S21, it is judged from the result of the category check whether the detected obstacle involves a transport vehicle.
If it is confirmed that the detected obstacle does not involve a transport vehicle but belongs to another type of object, another type of obstacle avoidance operation is performed, for example in step S22, using the basic information about the obstacle.
If it has been confirmed that the obstacle involves a transport vehicle, preparation for the mode-switching decision is initiated in step S23.
In step S24, once it has been determined that the obstacle is a transport vehicle, it can, for example, be further checked whether the transport vehicle is a manually guided transport vehicle or another autonomous mobile robot (for example an autonomous transport vehicle).
To distinguish between "manual guidance" and "automatic guidance", the following operations can be carried out, for example:
- checking whether one or more persons are located in the interior of the detected transport vehicle;
- checking whether the detected transport vehicle has an autonomous driving function; and/or
- sending, via a communication network, an inquiry to the detected transport vehicle or to the scheduling system as to whether the detected transport vehicle is currently being manually guided.
Here, the above check results can be considered individually or in combination, for example, and a conclusion can be drawn from them as to whether the identified transport vehicle is currently being manually guided.
If it is determined in step S24 that the detected transport vehicle is manually guided by a person, at least one parameter configuration for the yield mode to be carried out can be determined on the basis of this detection result, for example in step S27, and in step S28 the autonomous mobile robot is switched to the yield mode with this parameter configuration. The parameter configuration can include, for example, the distance of the autonomous mobile robot from the edge of the road when it stops in the yield mode.
If, on the other hand, it is determined in step S24 that the transport vehicle involves another autonomous transport vehicle, then in step S25 the priority information of the other autonomous transport vehicle relative to the autonomous mobile robot is obtained, and from this it is judged whether the priority of the autonomous mobile robot is lower than that of the other autonomous transport vehicle.
For this priority judgment, the following can, for example, be received via a communication network and/or detected by sensors (a sketch of such a comparison follows this list):
the model of the other autonomous transport vehicle;
the urgency of the other autonomous transport vehicle's order processing;
the likelihood that the other autonomous transport vehicle has detected the autonomous mobile robot; and/or
the scheduling order of the other autonomous transport vehicle relative to the autonomous mobile robot.
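The following sketch shows one conceivable way to compare these criteria; the patent lists the criteria but does not define a concrete scoring rule, so the weighting and tie-breaking order below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class VehicleInfo:
    model_rank: int        # e.g. heavier or larger models rank higher
    order_urgency: int     # higher = more urgent order
    dispatch_index: int    # position in the dispatch sequence (lower = earlier)

def other_has_priority(own: VehicleInfo, other: VehicleInfo,
                       other_sees_me: bool) -> bool:
    """Decide whether the other autonomous transport vehicle outranks this robot."""
    if not other_sees_me:       # it has likely not detected us, so we yield
        return True
    own_score = (own.order_urgency, -own.dispatch_index, own.model_rank)
    other_score = (other.order_urgency, -other.dispatch_index, other.model_rank)
    return other_score > own_score

me = VehicleInfo(model_rank=1, order_urgency=2, dispatch_index=7)
them = VehicleInfo(model_rank=3, order_urgency=2, dispatch_index=4)
print(other_has_priority(me, them, other_sees_me=True))   # True: it was dispatched earlier
```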
If it is determined in step S25 that the priority of the autonomous mobile robot is higher than that of the other autonomous transport vehicle, the autonomous mobile robot is not pulled over in step S26; instead it is, for example, kept in the navigation mode and allowed to complete its passage first, for example by means of suitable speed and trajectory control. Additionally or alternatively, it is also conceivable to transmit an instruction to the other autonomous transport vehicle, through direct interaction with it and/or through communication with the scheduling system, to prompt it to perform a yielding operation.
If it is found in step S25 that the autonomous mobile robot has the lower priority, the yield mode is, for example, parameterized in step S27 and in step S28 the autonomous mobile robot is controlled to switch from the navigation mode into this yield mode.
In order to take suitable yielding measures for "human-robot mixed traffic" and "robot-robot mixed traffic" scenarios, different parameter configurations can, for example, be assigned to the yield mode depending on whether the detected obstacle involves a manually guided transport vehicle or another autonomous transport vehicle. For example, compared with the case in which the obstacle involves another autonomous transport vehicle, when the obstacle involves a manually guided transport vehicle the distance of the autonomous mobile robot from the edge of the road when stopping in the yield mode can be set smaller, and the autonomous mobile robot can be controlled to pull over toward the side of the road at a higher speed and/or with a shorter response time (a sketch of such a parameter selection is given below).
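Such a scenario-dependent parameterization of the yield mode (step S27) might be organized as in the sketch below; the numeric values are placeholders, and only the relations between the two configurations follow the description above.

```python
from dataclasses import dataclass

@dataclass
class YieldConfig:
    edge_clearance_m: float    # stopping distance from the road edge
    pullover_speed_mps: float  # lateral evasion speed
    reaction_time_s: float     # time budget before the maneuver starts

# Illustrative values only: yield more aggressively (closer, faster, sooner)
# when the obstacle is a manually guided transport vehicle.
MANUAL_VEHICLE_CONFIG = YieldConfig(edge_clearance_m=0.05,
                                    pullover_speed_mps=0.8,
                                    reaction_time_s=0.2)
AUTONOMOUS_VEHICLE_CONFIG = YieldConfig(edge_clearance_m=0.15,
                                        pullover_speed_mps=0.5,
                                        reaction_time_s=0.5)

def select_yield_config(obstacle_is_manual: bool) -> YieldConfig:
    """Pick the yield-mode parameter set according to the obstacle type (step S27)."""
    return MANUAL_VEHICLE_CONFIG if obstacle_is_manual else AUTONOMOUS_VEHICLE_CONFIG

print(select_yield_config(True))
```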
Fig. 4 shows a block diagram of a device for controlling the driving of an autonomous mobile robot according to an exemplary embodiment of the present invention.
As shown in Fig. 4, the device 1 comprises a detection module 10 and a control module 20 which are communicatively connected to each other. The device 1 can, as required, be arranged directly on the autonomous mobile robot, in the environment of the autonomous mobile robot (for example a warehouse) or on a server located in the cloud.
The detection module 10 is used, for example, to detect obstacles in the surroundings of the autonomous mobile robot and/or in the preset trajectory still to be driven while the autonomous mobile robot drives automatically along the preset trajectory in the navigation mode. For this purpose, the detection module 10 includes or is configured as sensors such as lidar sensors, RGBD cameras, monocular cameras, stereo vision cameras, infrared sensors, ultrasonic sensors, inertial sensors, GPS sensors and radio-frequency receivers. It is, however, also conceivable for the detection module 10 to be configured as a communication interface able to communicate, for example, with at least one autonomous mobile robot and/or a scheduling system in order to receive the required environment data from a corresponding environment perception unit.
The control module 20 is used, for example, to control the autonomous mobile robot, depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode. The control module 20 is also used, for example, to control the autonomous mobile robot in the yield mode to pull over toward one side of the road and stop, so as to make room for the obstacle. For this purpose, the control module 20 can, for example, include or be configured as a processor, so as to perform computation and processing with respect to at least one condition, and can issue control commands to the motion mechanism of the autonomous mobile robot when a specific condition is met, in order, for example, to intervene in the lateral and/or longitudinal guidance of the autonomous mobile robot.
Fig. 5 shows a flowchart of a scheduling method for autonomous mobile robots according to an exemplary embodiment of the present invention.
In step 510, an event in which two autonomous mobile robots meet while operating in the navigation mode is detected.
In step 520, in response to detecting this event, a dispatch instruction is issued to at least one of the two autonomous mobile robots to prompt one of the two autonomous mobile robots to switch from the navigation mode to the yield mode. In the yield mode, that autonomous mobile robot is made to pull over toward one side of the road and stop, so as to make room for the other autonomous mobile robot.
Here, the recipient, content and/or sending order of the dispatch instructions can be determined, for example, on the basis of the priority information of the two autonomous mobile robots. As an example, a dispatch instruction to pull over and make an emergency stop is sent to the autonomous mobile robot with the lower priority, and a dispatch instruction to keep driving in the current navigation mode is sent to the autonomous mobile robot with the higher priority (a sketch of such a dispatch decision is given below). As another example, a dispatch instruction concerning the mode switch can be sent to only one of the two autonomous mobile robots.
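A minimal sketch of this priority-based dispatch decision is given below; the instruction strings and the priority encoding are assumptions for illustration.

```python
def issue_dispatch_instructions(robot_a: str, robot_b: str,
                                priority: dict[str, int]) -> dict[str, str]:
    """Decide, for a detected meeting event, which robot is told to pull over.

    Hypothetical sketch: the lower-priority robot receives the 'yield' instruction,
    the other one is told to continue in navigation mode.
    """
    if priority[robot_a] < priority[robot_b]:
        lower, higher = robot_a, robot_b
    else:
        lower, higher = robot_b, robot_a
    return {lower: "SWITCH_TO_YIELD_MODE", higher: "KEEP_NAVIGATION_MODE"}

# Robots 61 and 62 meet in an aisle; robot 61 holds the lower priority
print(issue_dispatch_instructions("61", "62", {"61": 1, "62": 3}))
# {'61': 'SWITCH_TO_YIELD_MODE', '62': 'KEEP_NAVIGATION_MODE'}
```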
Fig. 6 shows a warehouse 60 according to an exemplary embodiment of the present invention. The warehouse 60 includes a scheduling system 600 according to an exemplary embodiment of the present invention, and the scheduling system 600 includes an event detection unit 610 and an instruction issuing unit 620 which are communicatively connected to each other.
As shown in Fig. 6, the warehouse 60 further includes, for example, surveillance cameras 601 for detecting obstacles in the various areas of the warehouse 60; the surveillance cameras 601 are, for example, arranged in a distributed manner at several locations in the warehouse 60. In addition, the warehouse 60 further includes, for example, a communication interface 602 for communication via a network.
According to one embodiment, two autonomous mobile robots 61, 62 move in the warehouse 60, and the event detection unit 610 receives data from the individual surveillance cameras 601 via the communication interface 602 in order to analyze the images of the various locations of the warehouse 60 in real time. If, for example, it is recognized on the basis of the surveillance cameras 601 deployed in the warehouse 60 that the two autonomous mobile robots 61, 62 are approaching each other head-on in a narrow area and are about to meet, the event detection unit 610 determines that a meeting event of the two autonomous mobile robots 61, 62 has occurred. The instruction issuing unit 620 then generates a dispatch instruction for switching from the navigation mode to the yield mode and sends this dispatch instruction via the communication interface 602 to at least one of the two autonomous mobile robots 61, 62 (for example the one with the lower priority) in order to prompt one of the two autonomous mobile robots to switch from the navigation mode to the yield mode.
Fig. 7 shows a schematic diagram of detecting an obstacle by means of the method according to the present invention in an exemplary scenario.
In the scenario shown in Fig. 7, an autonomous mobile robot 100 in the form of a forklift AGV is being guided automatically in the navigation mode along a pre-planned trajectory. At the same time, a manual forklift 200, guided manually by a driver 201 via an operating screen 202, is approaching the forklift AGV 100 head-on at a certain speed. Note also that the forklift AGV 100 and the oncoming manual forklift 200 are currently located in an aisle of a logistics warehouse in which the traversable space is limited, for example, by shelves 71 and 72 on the left and right, respectively.
In this embodiment, the forklift AGV 100 is, for example, significantly smaller than the manual forklift 200 in terms of its volume (especially its height). Moreover, the forklift AGV 100 not only scans the road environment ahead with a lidar sensor 101 having a defined field of view, but at the same time also obtains information about obstacles in the surroundings via the communication interface 102 and the camera 103. The manual forklift 200, by contrast, relies mainly on the driver 201 in the cab observing potential obstacles nearby with the naked eye. In this situation it is very likely that the forklift AGV 100 has detected the oncoming manual forklift 200 in time by means of multi-sensor fusion, while the driver 201 of the manual forklift 200, for example because of the cab height and viewing angle (blind spots), has not immediately noticed the forklift AGV 100 driving close to the ground.
In this case, even if the forklift AGV 100 evades by slowing down or detouring appropriately, an accident risk may still arise because the manual forklift 200 has overlooked it. Moreover, even if the manual forklift 200 detects the risk late and stops, it generally has to avoid the forklift AGV 100 by backing up or turning sharply.
Fig. 8 shows a schematic diagram of switching an autonomous mobile robot to the yield mode by means of the method according to the present invention in an exemplary scenario.
In this exemplary scenario, the forklift AGV 100 has already recognized, for example by means of the corresponding image recognition technology or fusion detection technology, that the obstacle ahead involves the manual forklift 200. The forklift AGV 100 therefore immediately performs corresponding environment perception in order to determine the shape of the surrounding aisle. Based on this aisle shape, the forklift AGV 100 is offset from its initial position in the aisle in the direction indicated by arrow 701 (here, for example, to the right), for example by controlling its lateral guidance, until the distance between the forklift AGV 100 and the right road boundary 72 reaches the minimum safety distance. As soon as this minimum safety distance is reached, the forklift AGV 100 is controlled to stop there and remain stationary.
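The lateral offset toward the road boundary until the minimum safety distance is reached could be approximated as in the following sketch; the step size and distances are illustrative assumptions and stand in for the actual lateral-guidance commands of the forklift AGV.

```python
def pull_over_laterally(distance_to_boundary: float,
                        min_safe_distance: float = 0.05,
                        step: float = 0.02) -> float:
    """Shift the robot sideways until the gap to the road boundary reaches the minimum.

    Returns the remaining gap at which the robot stops; a stand-in for the
    lateral-guidance commands issued to the real forklift AGV.
    """
    while distance_to_boundary - step > min_safe_distance:
        distance_to_boundary -= step        # one lateral guidance increment (arrow 701)
    return distance_to_boundary             # stop and remain stationary here

# Starting 0.40 m from the right shelf 72, stop once roughly 5 cm remain
print(round(pull_over_laterally(0.40), 3))  # ~0.06
```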
Since the forklift AGV 100 has carried out this pull-over stopping behavior by means of the method according to the present invention, the manual forklift 200 has, for example, sufficient passage space and can, even in the most unfavorable case (not having noticed the forklift AGV immediately), complete safe passage along trajectory 702 on the left side of the aisle without significantly changing its lateral guidance.
Fig. 9 shows a schematic diagram of applying the method for controlling the driving of an autonomous mobile robot according to the present invention in another exemplary scenario.
The scenario shown here differs from that of Figs. 7-8 in that the oncoming obstacle does not involve a manual forklift but another automatically guided forklift AGV 300. In this case, the forklift AGV 100 has, for example, already learned through corresponding environment perception that the likewise autonomously driving forklift AGV 300 is ahead of it. In addition, the forklift AGV 100 has, for example, also exchanged information 110 with the other forklift AGV 300 via the communication interface 102 and has determined on this basis that it has a lower priority than the oncoming forklift AGV 300.
In this scenario, the forklift AGV 100 is therefore likewise controlled to switch from the navigation mode into the yield mode; compared with the scenario shown in Fig. 8, however, the minimum safety distance of the forklift AGV 100 from the right road boundary 72 when pulling over can here, for example, be set somewhat larger, so that the forklift AGV 100 can return more quickly from the yield mode to the navigation mode after the avoidance maneuver has ended.
Although specific embodiments of the present invention have been described in detail here, they are given for explanatory purposes only and should not be regarded as limiting the scope of the present invention. Various substitutions, modifications and alterations can be conceived without departing from the spirit and scope of the present invention.
Claims (17)
- 1. A method for controlling the driving of an autonomous mobile robot (100), in particular an autonomous transport vehicle, wherein the autonomous mobile robot (100) switches at least between a navigation mode and a yield mode, the method comprising the following steps: S1: while the autonomous mobile robot (100) drives automatically along a preset trajectory in the navigation mode, detecting obstacles in the surroundings of the autonomous mobile robot (100) and/or in the preset trajectory still to be driven; S2: controlling the autonomous mobile robot (100), depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot (100) is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
- 2. The method according to claim 1, wherein step S2 includes: in the yield mode, planning for the autonomous mobile robot (100) the shortest path from its current position to the edge of the road, in particular to a defined distance from the edge of the road; and guiding the autonomous mobile robot (100) along the shortest path to the edge of the road, in particular to the defined distance from the edge of the road, and stopping the autonomous mobile robot (100) there.
- 3. The method according to claim 1 or 2, wherein the distance of the autonomous mobile robot (100) from the edge of the road when stopping in the yield mode is smaller than its distance from the edge of the road when driving or stopping in the navigation mode.
- 4. The method according to any one of claims 1 to 3, wherein the decision whether to switch from the navigation mode to the yield mode is initiated in step S2 only when the obstacle involves a transport vehicle.
- 5. The method according to claim 4, wherein step S2 further includes: if the obstacle involves a transport vehicle, checking whether the transport vehicle is a manually guided transport vehicle (200) or another autonomous transport vehicle (300); and deciding, based on the result of the check, whether to switch from the navigation mode to the yield mode, wherein in the case of a manually guided transport vehicle (200) the autonomous mobile robot (100) is switched directly from the navigation mode to the yield mode.
- 6. The method according to claim 5, wherein step S2 further includes: if the transport vehicle is another autonomous transport vehicle (300), obtaining priority information of the other autonomous transport vehicle (300) relative to the autonomous mobile robot (100); and deciding, based on the priority information, whether to switch from the navigation mode to the yield mode, wherein the autonomous mobile robot (100) is controlled to switch from the navigation mode to the yield mode when the priority information indicates that the priority of the autonomous mobile robot (100) is lower than that of the other autonomous transport vehicle (300).
- 7. The method according to claim 6, wherein obtaining the priority information of the other autonomous transport vehicle (300) relative to the autonomous mobile robot (100) includes: receiving via a communication network and/or detecting by means of sensors the model of the other autonomous transport vehicle (300), the urgency of its order processing, the likelihood that the other autonomous transport vehicle (300) has detected the autonomous mobile robot (100), and/or the scheduling order of the other autonomous transport vehicle (300) relative to the autonomous mobile robot (100).
- 8. The method according to any one of claims 1 to 7, wherein step S2 includes: once it has been confirmed that the autonomous mobile robot (100) is to switch from the navigation mode to the yield mode, obtaining basic information about the obstacle; and influencing at least one parameter configuration of the yield mode on the basis of the basic information, wherein different parameter configurations are assigned to the yield mode in particular for the cases in which the detected obstacle involves a manually guided transport vehicle (200) and another autonomous transport vehicle (300).
- 9. The method according to claim 8, wherein, compared with the case in which the obstacle involves another autonomous transport vehicle (300), when the obstacle involves a manually guided transport vehicle (200) the distance of the autonomous mobile robot (100) from the edge of the road when stopping in the yield mode is set smaller, and/or the autonomous mobile robot (100) is controlled to pull over toward the side of the road at a higher speed and/or with a shorter response time in the yield mode.
- 10. The method according to any one of claims 1 to 9, wherein before step S2 the method further includes the following steps: additionally obtaining position information of the autonomous mobile robot (100) on the basis of map data and sensor data; checking, based on the position information, whether the autonomous mobile robot (100) can detour around the obstacle; and initiating the switch to the yield mode in step S2 only if this detourability does not satisfy a preset condition.
- 11. The method according to claim 4, wherein step S1 further includes: acquiring an image of the surroundings of the autonomous mobile robot (100), the image being captured by a camera of the autonomous mobile robot (100) and/or a camera installed in the surroundings of the autonomous mobile robot (100); and identifying a transport vehicle in the image by means of image recognition technology based on an artificial neural network.
- 12. The method according to claim 11, wherein step S1 further includes: acquiring basic information about the obstacle by means of at least one sensor of a type other than a camera, belonging to the autonomous mobile robot (100) and/or installed in the surroundings; and making a preliminary determination, based on the basic information, as to whether the obstacle involves a transport vehicle, wherein the result of the preliminary determination is at least partially taken into account when identifying the transport vehicle in the image captured by the camera.
- 13. The method according to claim 5, wherein checking whether the obstacle involves a manually guided transport vehicle (200) includes: checking whether one or more persons (201) are located in the interior of the detected transport vehicle; checking whether the detected transport vehicle has an autonomous driving function; and/or sending, via a communication network, an inquiry to the detected transport vehicle or to the scheduling system as to whether the detected transport vehicle is currently being manually guided.
- 14. A device (1) for controlling the driving of an autonomous mobile robot (100), in particular an autonomous transport vehicle, the device (1) being designed to perform the method according to any one of claims 1 to 13 and comprising: a detection module (10) configured to detect obstacles in the surroundings of the autonomous mobile robot (100) and/or in the preset trajectory still to be driven while the autonomous mobile robot (100) drives automatically along the preset trajectory in the navigation mode; and a control module (20) configured to control the autonomous mobile robot (100), depending on the detected obstacles, to selectively switch from the navigation mode to the yield mode, wherein in the yield mode the autonomous mobile robot (100) is controlled to pull over toward one side of the road and stop, so as to make room for the obstacle.
- 15. A scheduling method for autonomous mobile robots, in particular autonomous transport vehicles, comprising the following steps: detecting an event in which two autonomous mobile robots (61, 62) meet while operating in the navigation mode; and, in response to detecting the event, issuing a dispatch instruction to at least one of the two autonomous mobile robots (61, 62) to prompt one of the two autonomous mobile robots (61, 62) to switch from the navigation mode to the yield mode, in which that robot is required to pull over toward one side of the road and stop, so as to make room for the other of the two autonomous mobile robots (61, 62).
- 16. A scheduling system (600) for autonomous mobile robots, in particular autonomous transport vehicles, the scheduling system (600) being designed to execute the scheduling method according to claim 15 and comprising: an event detection unit (610) configured to detect an event in which two autonomous mobile robots (100) meet while operating in the navigation mode; and an instruction issuing unit (620) configured to issue a dispatch instruction to at least one of the two autonomous mobile robots (100) to prompt one of the two autonomous mobile robots (100) to switch from the navigation mode to the yield mode, in which one of the two autonomous mobile robots (61, 62) is required to pull over toward one side of the road and stop, so as to make room for the other of the two autonomous mobile robots (61, 62).
- 17. A computer program product comprising a computer program which, when executed by a computer, implements the method according to any one of claims 1 to 13 or the scheduling method according to claim 15.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111529568.9 | 2021-12-14 | ||
CN202111529568.9A CN116263600A (zh) | 2021-12-14 | 2021-12-14 | Method and device for controlling the driving of an autonomous mobile robot
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023109281A1 (zh) | 2023-06-22 |
Family
ID=86722220
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/124836 WO2023109281A1 (zh) | 2022-10-12 | Method and device for controlling the driving of an autonomous mobile robot |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116263600A (zh) |
WO (1) | WO2023109281A1 (zh) |
- 2021-12-14: CN application CN202111529568.9A filed (published as CN116263600A; status: active, pending)
- 2022-10-12: WO application PCT/CN2022/124836 filed (published as WO2023109281A1; status: unknown)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106324619A (zh) * | 2016-10-28 | 2017-01-11 | 武汉大学 | Automatic obstacle avoidance method for a substation inspection robot |
WO2018110568A1 (ja) * | 2016-12-12 | 2018-06-21 | 日本電産シンポ株式会社 | Mobile body performing an obstacle avoidance operation, and computer program therefor |
US20190033882A1 (en) * | 2017-07-28 | 2019-01-31 | Crown Equipment Corporation | Traffic management for materials handling vehicles in a warehouse environment |
CN109108974A (zh) * | 2018-08-29 | 2019-01-01 | 广州市君望机器人自动化有限公司 | Robot avoidance method and apparatus, backend server, and storage medium |
CN111930127A (zh) * | 2020-09-02 | 2020-11-13 | 广州赛特智能科技有限公司 | Robot obstacle recognition and obstacle avoidance method |
CN113074728A (zh) * | 2021-03-05 | 2021-07-06 | 北京大学 | Multi-AGV path planning method based on jump point search and cooperative obstacle avoidance |
CN113189987A (zh) * | 2021-04-19 | 2021-07-30 | 西安交通大学 | Complex-terrain path planning method and system based on multi-sensor information fusion |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117032226A (zh) * | 2023-08-08 | 2023-11-10 | 贵州师范学院 | Automatic obstacle avoidance method for a robot |
CN117032226B (zh) * | 2023-08-08 | 2024-02-02 | 贵州师范学院 | Automatic obstacle avoidance method for a robot |
Also Published As
Publication number | Publication date |
---|---|
CN116263600A (zh) | 2023-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11332132B2 (en) | Method of handling occlusions at intersections in operation of autonomous vehicle | |
- JP6413962B2 (ja) | Travel control device | |
US20200257317A1 (en) | Autonomous and user controlled vehicle summon to a target | |
US9493163B2 (en) | Driving support apparatus for vehicle | |
- KR101812088B1 (ko) | Remote-controlled unmanned transport system for implementing a smart factory | |
US20140129075A1 (en) | Vehicle Control Using Modeled Swarming Behavior | |
- CN107074280B (zh) | Method and device for operating a vehicle | |
- CN111258318A (zh) | Automatic driving system for a sanitation vehicle and control method therefor | |
US11127301B1 (en) | Systems and methods for adapting operation of an assistance system according to the presence of a trailer | |
US20210072032A1 (en) | Vehicle Perception-Data Gathering System and Method | |
- CN114475664B (zh) | Lane-change coordination control method for autonomous vehicles on congested road sections | |
- JP7236307B2 (ja) | Vehicle control device, vehicle control method, and program | |
- KR102328506B1 (ko) | Unmanned public information collection system and method | |
- CN113228131B (zh) | Method and system for providing surroundings data | |
- CN112849141B (zh) | Process and system for sensor sharing for autonomous lane changes | |
- WO2023109281A1 (zh) | Method and device for controlling the driving of an autonomous mobile robot | |
- CN112180911A (zh) | Method for monitoring a control system of an autonomous vehicle | |
AU2020204206A1 (en) | A system for controlling a plurality of autonomous vehicles on a mine site | |
- CN116368047A (zh) | Method and control device for situationally determining the observation area of an at least partially autonomously operating motor vehicle | |
US10466703B2 (en) | Method for controlling at least one vehicle, which moves at least partially autonomously within an operating environment, and structure | |
- KR20240046095A (ko) | Remote driving control system for unmanned vehicles based on edge computing and artificial intelligence convergence technology | |
- JP7412464B2 (ja) | Vehicle control device, autonomous decentralized traffic control system, and vehicle control method | |
US20230084313A1 (en) | Methods and systems for autonomous vehicle collision avoidance | |
- JP7350540B2 (ja) | Driving control method and driving control device | |
- JP7258677B2 (ja) | Driving control method and driving control device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22906033; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |