CN116263600A - Method and device for controlling the travel of an autonomous mobile robot - Google Patents
- Publication number: CN116263600A (application CN202111529568.9A)
- Authority
- CN
- China
- Prior art keywords
- autonomous mobile
- mobile robot
- mode
- autonomous
- obstacle
- Prior art date
- Legal status: Pending (the listed status is an assumption, not a legal conclusion)
Classifications
(The G05D1/02xx entries below are sub-classes of G05D1/021 — control of position or course in two dimensions specially adapted to land vehicles, itself under G05D1/00 — control of position, course or altitude of land, water, air or space vehicles, e.g. automatic pilot.)
- B25J9/16 — Programme controls (programme-controlled manipulators)
- G05D1/02 — Control of position or course in two dimensions
- G05D1/0214 — Defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221 — Defining a desired trajectory involving a learning process
- G05D1/0223 — Defining a desired trajectory involving speed control of the vehicle
- G05D1/0238 — Optical position detecting means using obstacle or wall sensors
- G05D1/024 — Obstacle or wall sensors in combination with a laser
- G05D1/0242 — Optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0251 — Video camera with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/0255 — Acoustic signals, e.g. ultrasonic signals
- G05D1/0257 — Radar
- G05D1/0276 — Signals provided by a source external to the vehicle
- G05D1/0278 — Satellite positioning signals, e.g. GPS
- G05D1/028 — RF signal
Abstract
The invention relates to the field of intelligent logistics and provides a method for controlling the travel of an autonomous mobile robot that switches operation between at least a navigation mode and a yield mode. The method comprises the steps of: S1: while the autonomous mobile robot travels automatically in the navigation mode along a preset trajectory, detecting an obstacle in its surroundings and/or on the preset trajectory still to be travelled; S2: controlling the autonomous mobile robot to selectively switch from the navigation mode to the yield mode according to the detected obstacle, wherein in the yield mode the autonomous mobile robot is controlled to pull over and park at one side of the road so as to make room for the obstacle. The invention also relates to a device for controlling the travel of an autonomous mobile robot, a scheduling method, a scheduling system and a computer program product.
Description
Technical Field
The present invention relates to a method for controlling the travel of an autonomous mobile robot, to a device for controlling the travel of an autonomous mobile robot, to a scheduling method, to a scheduling system and to a computer program product.
Background
With the rise of e-commerce, modern factories and similar fields, intelligent warehousing systems are increasingly used for picking, carrying and storing articles. At present, in the field of intelligent warehouse logistics, the picking and transport of materials is generally accomplished through cooperation between autonomous mobile robots and human workers in order to reduce the burden on manual pickers and improve picking efficiency.
Autonomous mobile robots typically encounter different types of obstacles during travel, and those obstacles move in different ways. In currently known obstacle avoidance strategies, a robot generally avoids an obstacle either by stopping in place or by detouring around it.
However, these known obstacle avoidance approaches have a number of limitations. In particular, if two AGVs (Automated Guided Vehicles) that are performing transport tasks meet in a lane, both may stop or detour in the same direction, leaving the lane completely blocked. In addition, when an AGV encounters a manually guided truck and the right of way is unclear, the mixed operation of AGVs and manually guided trucks can cause problems. In such cases a human driver is often required to manually steer the truck out of the travel route of the blocked AGV, which greatly reduces labour efficiency.
Against this background, it is desirable to provide an improved obstacle avoidance strategy that resolves conflicts in mixed or multi-vehicle traffic in a more rational manner and improves the obstacle avoidance success rate.
Disclosure of Invention
It is an object of the present invention to provide a method for controlling autonomous mobile robot travel, an apparatus for controlling autonomous mobile robot travel, a scheduling method, a scheduling system and a computer program product for solving at least part of the problems of the prior art.
According to a first aspect of the present invention, there is provided a method for controlling the travel of an autonomous mobile robot, in particular an autonomous truck, said autonomous mobile robot switching operation between at least a navigation mode and a yield mode, said method comprising the steps of:
S1: while the autonomous mobile robot travels automatically in the navigation mode along a preset trajectory, detecting an obstacle in its surroundings and/or on the preset trajectory still to be travelled; and
S2: controlling the autonomous mobile robot to selectively switch from the navigation mode to the yield mode according to the detected obstacle, wherein in the yield mode the autonomous mobile robot is controlled to pull over and park at one side of the road so as to make room for the obstacle.
The invention is based on the following technical concept: by letting the robot selectively switch from the navigation mode to the yield mode, it can react adaptively to detected obstacles. This avoids blocking the traffic lane when several mobile devices simultaneously plan avoidance manoeuvres or stop in place, and greatly improves the obstacle avoidance success rate. In addition, by actively giving up passing space, orderly passage through narrow spaces is ensured to a certain extent and the intelligence level of the autonomous mobile robot is raised.
Optionally, step S2 includes:
in the yield mode, planning a shortest path for the autonomous mobile robot from its current position to the road edge, in particular to a determined distance from the road edge; and
guiding the autonomous mobile robot along said shortest path to the road edge, in particular to the determined distance from the road edge, and stopping the autonomous mobile robot there.
Thus, the following technical advantage is achieved: the robot does not have to plan a complete detour trajectory; instead it moves directly to one side of the road and stops, so that a travel space can be cleared for the approaching obstacle within a short time and the collision risk is reduced. This is particularly advantageous when the autonomous mobile robot is not observed or perceived in time by an oncoming mobile object.
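The pull-over planning described in this optional step can be sketched as follows. The lane model (a straight lane of half-width `road_half_width`, with the lateral offset `y` measured from the lane centre) and all parameter names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # position along the lane, in metres
    y: float  # lateral offset; 0 = lane centre, positive = right

def plan_yield_target(pose: Pose, road_half_width: float, clearance: float) -> Pose:
    """Return the stop pose for the yield mode: a point `clearance` metres
    inside the nearer road edge, reached by the shortest (straight lateral)
    path from the current pose."""
    # pick the edge the robot is already closer to
    edge_y = road_half_width if pose.y >= 0 else -road_half_width
    # stop `clearance` metres inside that edge
    stop_y = edge_y - clearance if edge_y > 0 else edge_y + clearance
    return Pose(x=pose.x, y=stop_y)
```

For example, a robot slightly right of centre in a 3 m lane would be sent to a stop pose 5 cm inside the right edge.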
Optionally, the distance from the road edge at which the autonomous mobile robot parks in the yield mode is smaller than its distance from the road edge when driving or parking in the navigation mode.
Thus, the following technical advantage is achieved: because the autonomous mobile robot ultimately remains stationary in the yield mode, it can stop closer to the road edge, thereby leaving a larger passage space for other mobile objects and reducing the collision risk.
Optionally, the determination in step S2 of whether to switch from the navigation mode to the yield mode is initiated only if the obstacle is a truck.
Thus, the following technical advantage is achieved: frequently driving the robot to pull over would interfere with its normal operating efficiency. By filtering on obstacle type in this way, the robot performs the mode-switch judgement only when a mixed-traffic scenario may actually be involved, which not only avoids unnecessary starts and stops of the robot but also reduces the computational overhead of the judgement.
Optionally, step S2 further includes:
in the event that the obstacle is a truck, checking whether it is a manually guided truck or another autonomous truck; and
determining whether to switch from the navigation mode to the yield mode based on the result of the check, wherein the autonomous mobile robot switches directly from the navigation mode to the yield mode if a manually guided truck is involved.
Thus, the following technical advantage is achieved: in a scenario where multiple driving behaviours coexist, the lack of unified criteria makes decision making difficult for automatic guidance. In such mixed-traffic scenes, having the autonomous mobile robot back off appropriately and switch to a conservative yield mode can greatly improve safety.
Optionally, step S2 further includes:
acquiring priority information of the other autonomous truck relative to the autonomous mobile robot if the truck is another autonomous truck; and
determining whether to switch from the navigation mode to the yield mode according to the priority information, wherein the autonomous mobile robot is controlled to switch from the navigation mode to the yield mode if the priority information indicates that its priority is lower than that of the other autonomous truck.
Thus, the following technical advantage is achieved: setting priorities advantageously resolves right-of-way conflicts in traffic conflict areas and keeps mixed traffic flowing smoothly.
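Taken together, the optional checks above (truck-type filter, manual vs. autonomous distinction, priority comparison) can be sketched as a single decision function. The dictionary keys `kind`, `is_manual` and `priority` are assumed for illustration only:

```python
def decide_mode_switch(obstacle: dict, own_priority: int) -> bool:
    """Return True if the robot should switch from navigation mode to
    yield mode, following the decision tree of step S2 described above."""
    if obstacle.get("kind") != "truck":
        return False            # only trucks trigger the judgement at all
    if obstacle.get("is_manual"):
        return True             # manually guided truck: yield immediately
    # another autonomous truck: yield only if our own priority is lower
    return own_priority < obstacle.get("priority", own_priority)
```

Note the default in the last line: if no priority information for the other truck is available, the robot keeps the navigation mode rather than yielding.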
Optionally, acquiring the priority information of the other autonomous truck relative to the autonomous mobile robot includes:
receiving via a communication network and/or detecting by means of sensors the model of the other autonomous truck, its order-processing urgency, the likelihood of the other autonomous truck detecting the autonomous mobile robot, and/or the scheduling order of the other autonomous truck relative to the autonomous mobile robot.
Thus, the following technical advantage is achieved: the priority can be defined in a more reasonable and reliable manner in order to reduce collisions in a mixed-traffic environment.
Optionally, step S2 includes:
once it has been confirmed that the autonomous mobile robot is to switch from the navigation mode to the yield mode, acquiring basic information about the obstacle; and
adapting at least one parameter configuration of the yield mode according to that basic information, wherein in particular different parameter configurations are assigned to the yield mode depending on whether the detected obstacle is a manually guided truck or another autonomous truck.
Thus, the following technical advantage is achieved: the yield mode can be customised to the characteristics of the obstacle, further improving the obstacle avoidance success rate and safety.
Optionally, if the obstacle is a manually guided truck, the autonomous mobile robot is configured, compared with the case of another autonomous truck, to park at a smaller distance from the road edge in the yield mode, to pull over to the road side at a greater speed, and/or to carry out the avoidance with a shorter response time.
Thus, the following technical advantage is achieved: because of the particular height and viewing angle of a manual truck's cab, a human driver often cannot see an autonomous mobile robot travelling close to the ground. Having the autonomous mobile robot leave a larger passage space for the manual truck and complete the avoidance more quickly improves traffic safety in a mixed-traffic environment.
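An obstacle-dependent parameter configuration could look like the following sketch. The field names and numeric values are assumptions, chosen only to reflect the qualitative relations stated above (smaller edge distance, higher avoidance speed and shorter response time for manual trucks):

```python
# Illustrative yield-mode parameter profiles; values are NOT from the patent.
YIELD_PROFILES = {
    "manual_truck":     {"edge_clearance_m": 0.05, "avoid_speed_mps": 1.0, "response_time_s": 0.2},
    "autonomous_truck": {"edge_clearance_m": 0.10, "avoid_speed_mps": 0.6, "response_time_s": 0.5},
}

def configure_yield_mode(obstacle_kind: str) -> dict:
    """Select the yield-mode parameter configuration from the obstacle's
    basic information (here reduced to its kind); unknown kinds fall back
    to the conservative autonomous-truck profile."""
    return YIELD_PROFILES.get(obstacle_kind, YIELD_PROFILES["autonomous_truck"])
```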
Optionally, the method further comprises the following steps before step S2:
additionally acquiring position information of the autonomous mobile robot based on map data and sensor data;
checking, based on the position information, whether the autonomous mobile robot can detour around the obstacle; and
initiating the switch to the yield mode in step S2 only if this detourability does not meet a preset condition.
Thus, the following technical advantage is achieved: the yield mode is a conservative obstacle avoidance strategy particularly suited to narrow spaces; although it can improve traffic fluency, it sacrifices some time efficiency. Therefore, where an alternative route can be planned in advance on the basis of the spatial environment, it is advantageous to let the robot complete the detour without stopping.
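A minimal detourability check along these lines could compare the free corridor beside the obstacle with the robot's own width. The geometry is simplified to widths on a straight lane, and the margin parameter is an assumption:

```python
def can_detour(lane_width: float, obstacle_width: float,
               robot_width: float, margin: float = 0.1) -> bool:
    """The robot can bypass the obstacle without stopping if the free
    corridor beside it is at least as wide as the robot plus a safety
    margin on each side."""
    free_corridor = lane_width - obstacle_width
    return free_corridor >= robot_width + 2 * margin
```

Only when this check fails (the preset condition is not met) would the switch to the yield mode be initiated in step S2.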
Optionally, step S1 further includes:
acquiring an image of the surroundings of the autonomous mobile robot, the image being taken by a camera of the autonomous mobile robot and/or by a camera mounted in its surroundings; and
identifying the truck in the image by means of an image recognition technique based on an artificial neural network.
Thus, the following technical advantage is achieved: the contour of a truck can be recognised particularly well by such image processing algorithms, which provides a very reliable data basis for the subsequent mode-switch decision.
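Downstream of the neural-network detector, the truck identification might reduce to filtering the detector's raw output by label and confidence. The detection-dict interface (`label`, `confidence`, `bbox`) is an assumed stand-in for the output of a CNN-based object detector:

```python
def find_trucks(detections: list[dict], min_conf: float = 0.6) -> list[dict]:
    """Keep only confident truck detections from raw detector output.
    Each detection is assumed to carry 'label', 'confidence' and 'bbox'."""
    return [d for d in detections
            if d["label"] == "truck" and d["confidence"] >= min_conf]
```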
Optionally, step S1 further includes:
acquiring basic information about the obstacle by means of at least one sensor of a type other than a camera, belonging to the autonomous mobile robot and/or installed in the surroundings; and
initially determining, based on that basic information, whether the obstacle is a truck, wherein the result of the initial determination is at least partially taken into account when identifying the truck in an image captured by the camera.
In this way, the following technical advantage is achieved in particular: multi-sensor fusion improves the accuracy of obstacle recognition, which is particularly effective in reducing false positives and accordingly avoids unnecessary mode switches.
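One simple way to let the auxiliary sensor's initial determination feed into the camera-based identification is to lower the camera confidence threshold when the other sensor already suggests a truck. Both thresholds and the bonus are illustrative assumptions:

```python
def fuse_truck_evidence(camera_conf: float, aux_says_truck: bool,
                        base_threshold: float = 0.6, bonus: float = 0.15) -> bool:
    """Decide 'truck present' by fusing the camera's confidence with the
    initial determination of an auxiliary sensor (e.g. a lidar footprint
    check): supporting auxiliary evidence lowers the camera threshold."""
    threshold = base_threshold - (bonus if aux_says_truck else 0.0)
    return camera_conf >= threshold
```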
Optionally, checking whether the obstacle is a manually guided truck comprises:
checking whether one or more persons are located in the interior of the detected truck;
checking whether the detected truck has an autonomous driving function; and/or
sending an inquiry, via a communication network, to the detected truck or to the dispatch system as to whether the detected truck is being manually guided.
Thus, the following technical advantage is achieved: manually guided trucks can be identified effectively, improving the soundness of the mode switch.
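The inquiry over the communication network might use a small message schema like the following sketch. The patent does not specify a message format, so the JSON fields here are pure assumptions:

```python
import json

def build_guidance_query(truck_id: str) -> str:
    """Compose the query sent to the detected truck or to the dispatch
    system asking whether the truck is being manually guided
    (message schema assumed for illustration)."""
    return json.dumps({"type": "guidance_query", "truck_id": truck_id})

def parse_guidance_reply(reply: str) -> bool:
    """Interpret the reply; absent or malformed fields default to
    'not manually guided'."""
    return bool(json.loads(reply).get("is_manual", False))
```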
According to a second aspect of the present invention, there is provided an apparatus for controlling the travel of an autonomous mobile robot, in particular an autonomous truck, for performing the method according to the first aspect of the present invention, the apparatus comprising:
a detection module configured to detect an obstacle in the surroundings of the autonomous mobile robot and/or on the preset trajectory still to be travelled while the autonomous mobile robot travels automatically in the navigation mode along the preset trajectory; and
a control module configured to control the autonomous mobile robot to selectively switch from the navigation mode to the yield mode according to the detected obstacle, wherein in the yield mode the autonomous mobile robot is controlled to pull over and park at one side of the road so as to make room for the obstacle.
According to a third aspect of the present invention, there is provided a scheduling method for autonomous mobile robots, in particular autonomous trucks, the scheduling method comprising the steps of:
detecting an event in which two autonomous mobile robots meet while operating in the navigation mode; and
in response to detecting the event, issuing a scheduling instruction to at least one of the two autonomous mobile robots so as to cause one of them to switch from the navigation mode to the yield mode, in which that robot is required to pull over and park at the road side so as to make room for the other robot.
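On the dispatch side, the choice of which of the two meeting robots is instructed to yield could be sketched as follows. The `priority` field and the id-based tie-break are assumptions for illustration:

```python
def pick_yielding_robot(robot_a: dict, robot_b: dict) -> str:
    """Return the id of the robot that should be instructed to switch to
    the yield mode: the lower-priority one, with the lexicographically
    smaller id as a deterministic tie-break."""
    if robot_a["priority"] != robot_b["priority"]:
        return robot_a["id"] if robot_a["priority"] < robot_b["priority"] else robot_b["id"]
    return min(robot_a["id"], robot_b["id"])
```

A deterministic tie-break matters here: it prevents the deadlock described in the background section, where both vehicles take the same avoidance action simultaneously.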
According to a fourth aspect of the present invention, there is provided a dispatch system for autonomous mobile robots, in particular autonomous trucks, for performing the dispatch method of the third aspect of the present invention, the dispatch system comprising:
an event discovery unit configured to detect an event in which two autonomous mobile robots meet while operating in the navigation mode; and
an instruction issuing unit configured to issue a scheduling instruction to at least one of the two autonomous mobile robots so as to cause one of them to switch from the navigation mode to the yield mode, in which that robot is required to pull over and park at the road side so as to make room for the other robot.
According to a fifth aspect of the present invention, there is provided a computer program product comprising a computer program which, when executed by a computer, implements the method according to the first aspect of the present invention or the scheduling method according to the third aspect of the present invention.
Drawings
The principles, features and advantages of the present invention may be better understood by describing the present invention in more detail with reference to the drawings. The drawings include:
FIG. 1 illustrates a flowchart of a method for controlling travel of an autonomous mobile robot according to an exemplary embodiment of the present invention;
FIG. 2 shows a flow chart of one method step of the method shown in FIG. 1;
FIG. 3 shows a flow chart of another method step of the method shown in FIG. 1;
fig. 4 shows a block diagram of an apparatus for controlling travel of an autonomous mobile robot according to an exemplary embodiment of the present invention;
FIG. 5 illustrates a flowchart of a scheduling method for an autonomous mobile robot according to an exemplary embodiment of the present invention;
FIG. 6 illustrates a warehouse including a scheduling system according to an exemplary embodiment of the present invention;
FIG. 7 shows a schematic diagram of the detection of an obstacle by means of the method according to the invention in an exemplary scenario;
FIG. 8 shows a schematic diagram of switching an autonomous mobile robot to the yield mode by means of the method according to the invention in an exemplary scenario; and
FIG. 9 shows a schematic diagram of an application of the method for controlling travel of an autonomous mobile robot according to the invention in another exemplary scenario.
Detailed Description
In order to make the technical problems, technical solutions and advantageous technical effects to be solved by the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and a plurality of exemplary embodiments. It should be understood that the detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 illustrates a flowchart of a method for controlling travel of an autonomous mobile robot according to an exemplary embodiment of the present invention.
In step S1, while the autonomous mobile robot travels automatically in the navigation mode along the preset trajectory, an obstacle is detected in its surroundings and/or on the preset trajectory still to be travelled.
In the sense of the invention, an autonomous mobile robot is understood to be an intelligent mobile device that integrates, for example, environmental perception, dynamic decision-making, path planning and behaviour control, and that in the field of automated warehousing can in particular be configured as an autonomous vehicle (e.g. a forklift AGV) which performs material handling tasks automatically without manual intervention.
The navigation mode is understood here to mean, for example, an automatic guidance mode in which the autonomous mobile robot follows a predetermined trajectory. This trajectory can be planned, for example, by the robot's navigation and positioning unit on the basis of suitable start and end point information, or it can be calculated by a dispatch centre and provided directly to the robot.
In this step, "detecting" an obstacle includes not only sensing an obstacle in the surroundings by means of the robot's own perception devices or those installed in the warehouse, but also, for example, receiving or predicting, in real time via communication with at least one server, information about potential obstacles located on the robot's planned navigation path.
In step S2, the autonomous mobile robot is controlled, according to the detected obstacle, to selectively switch from the navigation mode to the yield mode, in which it is controlled to pull over and park at the road side so as to make room for the obstacle.
In the sense of the present invention, "selectively switching" means that, instead of initiating a mode switch directly upon the mere presence of an obstacle, it is determined, in particular in combination with at least one characteristic of the obstacle, whether such a switch should be performed or is necessary.
The "yield mode" may also synonymously be referred to as "pull-over mode". In this mode, a shortest path from the current position to the road edge, in particular to a determined distance from the road edge, can first be planned for the autonomous mobile robot. In the simplest case, the shortest path to the right-hand road edge is planned. It is, however, also conceivable to select the left or right side as the target side depending on the robot's distance from the respective road edges and/or its position relative to the obstacle. The autonomous mobile robot can then be guided along this shortest path to the road edge, in particular to the defined distance from it, and stopped there.
As an example, the distance from the road edge when the autonomous mobile robot parks in the yielding mode is smaller than the safety distance required during normal travel in the navigation mode. For example, if the prescribed safety distance during normal travel is 10 cm, the distance from the road edge when parking in the yielding mode may be, for example, 5 cm.
It should be noted that "controlling the autonomous mobile robot to move aside and park toward the road side" may include, for example: moving the autonomous mobile robot to one side of the road along an avoidance trajectory while decelerating, and bringing it to a complete stop near the road edge. However, it is also possible to move the autonomous mobile robot toward the road edge at a certain speed (in particular a higher speed) and to perform an emergency stop when the edge is reached. The invention is not limited in this regard, meaning that the particular motion parameter settings in the yielding mode may be selected according to the specific scenario (e.g., desired response speed, energy consumption, safety factors, impact on the goods, etc.).
Fig. 2 shows a flow chart of one method step of the method shown in fig. 1. As shown in fig. 2, method step S1 in fig. 1 illustratively comprises steps S11-S16.
In step S11, environmental data in the surroundings of the robot are acquired, such environmental data being provided in real time, for example by an environmental awareness device of the autonomous mobile robot or of a warehouse in which the autonomous mobile robot is located.
In step S12, it is checked on the basis of the acquired environmental data whether there is an obstacle in the surroundings of the autonomous mobile robot and/or in the preset trajectory it is about to travel. As an example, when obstacle ranging is performed with lidar data, a two-dimensional and/or three-dimensional profile of the robot's surroundings may be generated from the scanned point cloud; this profile may then be compared with a pre-stored warehouse map, and an obstacle is determined to be present if the difference in a specific area exceeds a threshold value. However, the presence of an obstacle may also be checked using other obstacle recognition algorithms, which the invention does not particularly limit.
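The map-comparison check described in step S12 can be illustrated with a minimal sketch. The binary occupancy-grid representation, the region tuple format, and the 0.3 threshold are illustrative assumptions, not values from the disclosure.

```python
def detect_obstacle(scan_grid, map_grid, region, threshold=0.3):
    """Compare a scanned occupancy grid against the stored warehouse map inside a
    region of interest; report an obstacle when the fraction of newly occupied
    cells exceeds `threshold`.

    scan_grid, map_grid: 2-D lists of 0 (free) / 1 (occupied).
    region: (row0, row1, col0, col1), half-open bounds of the checked area.
    Returns (obstacle present?, difference ratio).
    """
    r0, r1, c0, c1 = region
    new_occupied = total = 0
    for r in range(r0, r1):
        for c in range(c0, c1):
            total += 1
            # a cell occupied in the scan but free in the map counts as "new"
            if scan_grid[r][c] == 1 and map_grid[r][c] == 0:
                new_occupied += 1
    diff_ratio = new_occupied / total if total else 0.0
    return diff_ratio > threshold, diff_ratio
```

In practice the grids would come from projecting the lidar point cloud and rasterizing the warehouse map at the same resolution; the comparison itself stays this simple.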
If it is determined in step S12 that no obstacle is present, the method can, for example, return from step S12 to step S11 and continue acquiring environmental data and performing this check.
If an obstacle is found in step S12, it may be determined in step S13, for example in combination with the environmental data already acquired, whether the autonomous mobile robot has detourability with respect to the obstacle. Here, for example, the position of the autonomous mobile robot may first be determined by combining the warehouse map data with the environment-aware data, and it is then checked on the basis of this position whether the autonomous mobile robot is currently located in an open area of the warehouse or in a roadway. By way of example, if the robot is in an open area, it is considered to have detourability, whereas if it is in a roadway, it is considered not to have detourability.
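The open-area-versus-roadway classification in step S13 can be sketched as follows, assuming (purely for illustration) that the warehouse map provides axis-aligned rectangles for both area types:

```python
def has_detourability(position, open_areas, roadways):
    """Classify the robot's position: in an open area it is considered able to
    detour around the obstacle; in a roadway it is not.

    position: (x, y) from localization.
    open_areas, roadways: lists of axis-aligned rectangles (x0, y0, x1, y1)
    taken from the warehouse map.
    """
    def inside(point, rect):
        x, y = point
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    if any(inside(position, area) for area in open_areas):
        return True   # open ground: a detour line can be planned (step S14)
    if any(inside(position, area) for area in roadways):
        return False  # confined roadway: proceed to the checks of step S15
    return False      # unknown area: conservatively assume no detourability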
If it is determined based on the position information that the autonomous mobile robot has detourability against the obstacle currently faced, a detour line is planned for the autonomous mobile robot, for example in step S14.
If it is determined in step S13 that the detourability does not meet a preset condition, then in the subsequent step S15, basic information about the obstacle is first acquired using at least one type of sensor other than a camera, and it is preliminarily predicted on the basis of this information whether the obstacle relates to a truck. Such other sensors may be, for example, one or more of a lidar sensor, a radar sensor, an ultrasonic sensor, or an infrared sensor, belonging to the autonomous mobile robot itself and/or installed in the surrounding environment (e.g., in the warehouse). The basic information includes, for example, identification information, volume information, movement information, and category information of the obstacle.
As an example, the obstacle can be preliminarily determined to relate to a truck if one or more of the following is identified in step S15:
-identifying a specific sensor-detectable marker carried on the obstacle's surface;
-identifying that the profile of the obstacle corresponds to the general profile of the truck; and
-identifying that the speed pattern and/or the movement trajectory of the obstacle corresponds to the movement characteristics of the truck.
Next, in step S16, an image of the surroundings of the autonomous mobile robot is acquired, and a truck is recognized in the image by means of an image recognition technique based on an artificial neural network. In particular, the preliminary determination result obtained in step S15 is at least partially taken into account when performing the image recognition. As an example, the obstacle is confirmed to be a truck in step S16 only when both the preliminary determination result and the image-based check indicate that the obstacle relates to a truck. Alternatively, the obstacle may be confirmed to be a truck in step S16 as soon as either the preliminary determination or the image-based check indicates that the obstacle relates to a truck.
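The two-stage confirmation logic of steps S15 and S16 (preliminary sensor cues fused with the camera-based check, under either a strict "both must agree" policy or a permissive "either suffices" policy) can be sketched as below; the boolean encoding, argument names, and policy labels are illustrative assumptions.

```python
def classify_as_truck(marker_seen, contour_match, motion_match,
                      image_says_truck, policy="and"):
    """Fuse the sensor-based preliminary determination with the image check.

    marker_seen:    a sensor-detectable marker was identified on the obstacle
    contour_match:  the obstacle's profile matches a typical truck profile
    motion_match:   speed pattern / trajectory matches truck motion
    image_says_truck: result of the neural-network image recognition
    policy: "and" -> both stages must agree; "or" -> either stage suffices
    """
    # preliminary determination: any one of the three cues is enough
    preliminary = marker_seen or contour_match or motion_match
    if policy == "and":
        return preliminary and image_says_truck
    return preliminary or image_says_truck
```

The choice between the two policies trades false positives against false negatives; a warehouse operator might prefer "or" where yielding unnecessarily is cheaper than a missed truck.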
In this step, the acquired image is captured, for example, by a camera of the autonomous mobile robot and/or a camera installed in its surroundings. When the image source is a monitoring camera in the warehouse, for example, the autonomous mobile robot can first upload its current position to the warehouse server via the communication interface; the warehouse server can then retrieve the real-time feed of at least one monitoring camera near the reported position and provide it for obstacle discrimination.
Fig. 3 shows a flow chart of further method steps of the method shown in fig. 1. As shown in fig. 3, method step S2 in fig. 1 illustratively includes steps S21-S28. For ease of illustration, method step S16 in fig. 2 is additionally shown.
In step S16, a category check has been performed on the obstacle, for example, by means of image recognition techniques.
Next, in step S21, it is determined on the basis of the category check result whether the detected obstacle relates to a truck.
If the detected obstacle is confirmed not to relate to a truck but to another type of object, for example, a different obstacle avoidance operation is performed in step S22 in combination with the basic information of the obstacle.
If the obstacle has been confirmed to relate to a truck, preparation for the mode-switching decision is initiated in step S23.
In step S24, once the obstacle has been determined to be a truck, it is possible, for example, to further check whether the truck is a manual truck or another autonomous mobile robot (e.g., an autonomous truck).
To achieve this distinction between "manual guidance and automatic guidance", the following operations may be implemented, for example:
-checking whether one or more persons are located in the interior space of the detected truck;
-checking whether the detected truck has an autonomous driving function; and/or
-sending an inquiry request to the detected truck or the dispatch system via the communication network if the detected truck is being manually guided.
These check results can be considered individually or in combination, for example, in order to conclude whether the identified truck is being manually guided.
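One hedged way to combine the three checks into a manual-guidance verdict is sketched below; the argument names, the precedence given to a direct query reply, and the `None`-means-inconclusive convention are assumptions for illustration.

```python
def is_manually_guided(person_in_cabin=None, has_autonomy=None, query_reply=None):
    """Combine the three checks into a verdict on manual guidance.

    Each argument may be None when the corresponding check was not performed:
    person_in_cabin: one or more persons detected in the truck's interior space
    has_autonomy:    the detected truck has an autonomous driving function
    query_reply:     answer to an inquiry sent to the truck or dispatch system
    Returns True (manual), False (autonomous), or None (inconclusive).
    """
    if query_reply is not None:
        return query_reply        # a direct answer is taken as authoritative
    if person_in_cabin:
        return True               # person on board -> treat as manually guided
    if has_autonomy is False:
        return True               # no autonomous function -> must be manual
    if has_autonomy:
        return False
    return None                   # no check produced usable evidence
```

An inconclusive result could conservatively be treated like the manual case, since yielding to a possibly human-driven truck is the safer default.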
If it is determined in step S24 that the detected truck is guided manually by a person, it is possible, for example, in step S27, to determine at least one parameter configuration for the yielding mode to be implemented based on the detection result and to let the autonomous mobile robot switch into the yielding mode in accordance with this parameter configuration in step S28. The parameter configuration may here comprise, for example, the distance of the autonomous mobile robot relative to the road edge when parking in the yield mode.
In contrast, if it is determined in step S24 that the truck is another autonomous truck, priority information of the other autonomous truck relative to the autonomous mobile robot is acquired in step S25, and it is thereby determined whether the priority of the autonomous mobile robot is lower than that of the other autonomous truck.
In order to make such a priority determination, the following can, for example, be received via the communication network and/or detected via sensors:
- the model of the other autonomous truck;
- the order-processing urgency of the other autonomous truck;
- the likelihood that the autonomous mobile robot is detected by the other autonomous truck; and/or
- the dispatch sequence of the other autonomous truck relative to the autonomous mobile robot.
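A minimal sketch of such a priority comparison follows. The dictionary keys, the 0.5 detection-probability cutoff, and the tie-breaking order of the criteria are illustrative assumptions, not part of the disclosure.

```python
def has_lower_priority(own, other):
    """Decide whether this robot must yield to the other autonomous truck.

    own, other: dicts with illustrative keys:
      model_rank     - smaller number = higher-ranked model
      urgency        - order-processing urgency, larger = more urgent
      dispatch_order - position in the dispatch sequence, smaller = earlier
    other may additionally carry detect_probability: if the other truck is
    unlikely to detect this robot, this robot yields regardless.
    """
    if other.get("detect_probability", 1.0) < 0.5:
        return True  # other truck may not see us: yield for safety

    # evaluate criteria in a fixed order; the first decisive one wins
    for key, higher_wins in (("urgency", True),
                             ("model_rank", False),
                             ("dispatch_order", False)):
        a, b = own.get(key), other.get(key)
        if a is None or b is None or a == b:
            continue
        if higher_wins:
            return a < b   # less urgent -> lower priority
        return a > b       # lower-ranked model / later dispatch -> lower priority
    return False           # tie: keep the navigation mode
```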
If it is determined in step S25 that the priority of the autonomous mobile robot is higher than that of the other autonomous truck, the autonomous mobile robot does not pull over in step S26 but, for example, remains in the navigation mode and completes its passage first, for example by means of appropriate speed and trajectory control. Additionally or alternatively, instructions may be conveyed to the other autonomous truck, by direct interaction with it and/or by communication with the dispatch system, to cause it to perform the yielding operation.
If the autonomous mobile robot is found in step S25 to have the lower priority, the yielding mode is parameterized in step S27 and the autonomous mobile robot is controlled to switch from the navigation mode into the yielding mode in step S28, for example.
In order to take appropriate yielding measures both in the "mixed human and vehicle" scenario and in the "mixed vehicle" scenario, different parameter configurations may, for example, be assigned to the yielding mode depending on whether the detected obstacle relates to a manual truck or to another autonomous truck. For example, when the obstacle relates to a manual truck, the distance of the autonomous mobile robot from the road edge in the yielding mode may be set smaller, and the autonomous mobile robot may be controlled to move aside toward the road side at a higher speed and/or with a shorter response time than when the obstacle relates to another autonomous truck.
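The scenario-dependent parameterization of the yielding mode could be organized, for example, as below. The structure and the concrete numbers are illustrative only; the disclosure prescribes no specific values, only that the manual-truck case uses a smaller edge distance, a higher speed, and a shorter response time.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class YieldConfig:
    edge_distance_m: float   # parking distance from the road edge
    pull_over_speed: float   # speed of the avoidance maneuver (m/s)
    response_time_s: float   # time budget before the maneuver starts


# illustrative values: tighter, faster, quicker for the mixed human/vehicle
# case; more relaxed when the oncoming truck is also autonomous
YIELD_CONFIGS = {
    "manual_truck":     YieldConfig(edge_distance_m=0.05,
                                    pull_over_speed=1.0,
                                    response_time_s=0.3),
    "autonomous_truck": YieldConfig(edge_distance_m=0.15,
                                    pull_over_speed=0.5,
                                    response_time_s=1.0),
}


def configure_yield_mode(obstacle_type):
    """Look up the parameter configuration for the detected obstacle type."""
    return YIELD_CONFIGS[obstacle_type]
```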
Fig. 4 shows a block diagram of an apparatus for controlling travel of an autonomous mobile robot according to an exemplary embodiment of the present invention.
As shown in fig. 4, the device 1 comprises a detection module 10 and a control module 20 communicatively connected to each other. The device 1 can be arranged here, for example, directly on the autonomous mobile robot, in the surroundings of the autonomous mobile robot (for example, a warehouse) or on a cloud-located server, as required.
The detection module 10 is used, for example, to detect obstacles in the surroundings of the autonomous mobile robot and/or in a preset trajectory to be travelled, while the autonomous mobile robot travels automatically in the navigation mode along that preset trajectory. For this purpose, the detection module 10 comprises or is configured as, for example, a lidar sensor, an RGB-D camera, a monocular camera, a stereo vision camera, an infrared sensor, an ultrasonic sensor, an inertial sensor, a GPS sensor, a radio-frequency receiver, or the like. However, it is also conceivable for the detection module 10 to be configured as a communication interface able to communicate with at least one autonomous mobile robot and/or a dispatch system, for example, in order to receive the required environmental data from the respective environment-aware unit.
The control module 20 is used, for example, to control the autonomous mobile robot to selectively switch from the navigation mode to the yielding mode in dependence on the detected obstacle. Here, the control module 20 is also used, for example, to control the autonomous mobile robot in the yielding mode to move aside and park toward the road side, to make room for the obstacle. To this end, the control module 20 may, for example, comprise or be configured as a processor in order to perform computational processing with respect to at least one condition and to issue control instructions to the motion mechanism of the autonomous mobile robot when a specific condition is fulfilled, for example to intervene in the lateral guidance and/or longitudinal guidance of the autonomous mobile robot.
Fig. 5 shows a flowchart of a scheduling method for an autonomous mobile robot according to an exemplary embodiment of the present invention.
In step 510, an event is detected where two autonomous mobile robots meet during operation in a navigation mode.
In response to detecting the event, a scheduling instruction is issued in step 520 to at least one of the two autonomous mobile robots to cause one of them to switch from the navigation mode to the yielding mode. In the yielding mode, that autonomous mobile robot moves aside toward the road side and stops, to give room to the other autonomous mobile robot.
Here, for example, the recipient, the content, and/or the sending order of the scheduling instructions may be determined on the basis of the priority information of the two autonomous mobile robots. As an example, a scheduling instruction to pull over and stop is sent to the autonomous mobile robot with the lower priority, while a scheduling instruction to continue running in the current navigation mode is sent to the autonomous mobile robot with the higher priority. As another example, a scheduling instruction regarding mode switching may be sent to only one of the two autonomous mobile robots.
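The priority-based issuing of scheduling instructions can be sketched as follows; the robot identifiers, the larger-number-equals-higher-priority convention, and the instruction strings are assumptions for illustration.

```python
def build_dispatch_instructions(robot_a, robot_b):
    """Given (robot_id, priority) pairs for two meeting robots, return the
    instructions to issue: the lower-priority robot switches to the yielding
    mode, the higher-priority one keeps its navigation mode. Larger priority
    number = higher priority. On a tie, only the first robot is instructed
    to switch (the single-recipient variant described above).
    """
    (id_a, prio_a), (id_b, prio_b) = robot_a, robot_b
    if prio_a == prio_b:
        return {id_a: "switch_to_yield_mode"}
    low, high = (id_a, id_b) if prio_a < prio_b else (id_b, id_a)
    return {low: "switch_to_yield_mode", high: "keep_navigation_mode"}
```

For the meeting robots 61 and 62 of fig. 6, the dispatch system would send `switch_to_yield_mode` to whichever of the two reports the lower priority.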
Fig. 6 illustrates a warehouse 60 according to an exemplary embodiment of the present invention. The warehouse 60 comprises a scheduling system 600 according to an exemplary embodiment of the present invention, the scheduling system 600 comprising an event discovery unit 610 and an instruction issue unit 620 communicatively connected to each other.
As shown in fig. 6, the warehouse 60 further includes, for example, monitoring cameras 601 for detecting obstacle situations in various areas of the warehouse 60; these monitoring cameras 601 are, for example, distributed at a plurality of positions in the warehouse 60. The warehouse 60 further comprises a communication interface 602 for communicating via a network, for example.
According to one embodiment, two autonomous mobile robots 61, 62 move in the warehouse 60, and the event discovery unit 610 receives data from each monitoring camera 601 via the communication interface 602 in order to analyze and process images of each location in the warehouse 60 in real time. Here, if, for example, it is recognized from the monitoring cameras 601 deployed in the warehouse 60 that the two autonomous mobile robots 61, 62 are facing each other in a narrow area and are about to meet, the event discovery unit 610 determines that a meeting event of the two autonomous mobile robots 61, 62 has occurred. The instruction issuing unit 620 then generates a scheduling instruction for switching from the navigation mode to the yielding mode and sends this instruction via the communication interface 602 to at least one of the two autonomous mobile robots 61, 62 (for example, the one with the lower priority), causing it to switch from the navigation mode to the yielding mode.
Fig. 7 shows a schematic diagram of the detection of an obstacle by means of the method according to the invention in an exemplary scenario.
In the scenario shown in fig. 7, an autonomous mobile robot 100 in the form of a forklift AGV is automatically guided along a pre-planned trajectory in the navigation mode. At the same time, a manual forklift 200, guided by a driver 201 via a control screen 202, is approaching the forklift AGV 100 at a certain speed. Note also that the forklift AGV 100 and the oncoming manual forklift 200 are currently located in an aisle of the logistics warehouse whose traversable space is limited, for example by the left-hand and right-hand racks 71 and 72, respectively.
In this embodiment, the forklift AGV 100 is significantly smaller than the manual forklift 200 in terms of its overall size (in particular its height). Furthermore, the forklift AGV 100 not only scans the road environment ahead by means of the lidar sensor 101 with its defined field of view, but also acquires information about obstacles in the surroundings via the communication interface 102 and the camera 103. The manual forklift 200, by contrast, relies primarily on the driver 201 in the cabin to visually check for nearby obstacles. In this situation it is quite likely that the forklift AGV 100 has detected the oncoming manual forklift 200 in time through multi-sensor fusion, while the driver 201 of the manual forklift 200 has not immediately noticed the low-profile forklift AGV 100 due to factors such as cabin height and viewing angle (blind spots).
In this case, even if the forklift AGV 100 evades by decelerating or bypassing appropriately, an accident hazard may still arise because the manual forklift 200 has overlooked it. Moreover, even if the manual forklift 200 detects the risk and stops belatedly, it must generally avoid the forklift AGV 100 by reversing, making a U-turn, sharp steering, or the like.
Fig. 8 shows a schematic diagram of switching an autonomous mobile robot into a let-down mode by means of the method according to the invention in an exemplary scenario.
In this exemplary scenario, the forklift AGV 100 has identified, for example by means of a corresponding image recognition or fusion detection technique, that the obstacle ahead relates to the manual forklift 200. The forklift AGV 100 then performs corresponding environmental sensing to obtain the surrounding roadway geometry. On the basis of the corresponding lane configuration, the forklift AGV 100 is offset from its initial position in the lane in the direction indicated by arrow 701 (here, for example, to the right), for example by controlling its lateral guidance, until the distance between the forklift AGV 100 and the right lane boundary 72 reaches a minimum safety distance. Once this minimum safety distance is reached, the forklift AGV 100 is controlled to park there and remain stationary.
In this case, because the forklift AGV 100 performs such a pull-over parking maneuver by means of the method according to the invention, the manual forklift 200 has, for example, sufficient space to pass and can, even in the worst case (the forklift AGV not being noticed immediately), pass safely along the trajectory 702 on the left side of the aisle, for example without significantly changing its lateral guidance.
Fig. 9 shows a schematic diagram of an application of the method for controlling travel of an autonomous mobile robot according to the invention in another exemplary scenario.
The scenario illustrated here differs from that of figs. 7-8 in that the oncoming obstacle is not a manual forklift but another automatically guided forklift AGV 300. In this case, the forklift AGV 100 has, for example, already learned through corresponding environmental sensing that a likewise autonomous forklift AGV 300 is ahead of it. In addition, the forklift AGV 100 also interacts (110) with the other forklift AGV 300, for example via the communication interface 102, and on that basis determines that it has the lower priority relative to the oncoming forklift AGV 300.
In this case, the forklift AGV 100 is likewise controlled to switch from the navigation mode into the yielding mode. However, when the forklift AGV 100 pulls over, the minimum safety distance relative to the right road boundary 72 can, for example, be set larger than in the case of fig. 8, so that the forklift AGV 100 can return from the yielding mode to the navigation mode more quickly after the yielding operation ends.
Although specific embodiments of the invention have been described in detail herein, they are presented for purposes of illustration only and are not to be construed as limiting the scope of the invention. Various substitutions, alterations, and modifications can be made without departing from the spirit and scope of the invention.
Claims (17)
1. Method for controlling the travel of an autonomous mobile robot (100), in particular an autonomous carrier vehicle, the autonomous mobile robot (100) switching operation at least between a navigation mode and a yielding mode, the method comprising the steps of:
s1: detecting obstacles in the surroundings of the autonomous mobile robot (100) and/or in a preset trajectory to be travelled during the autonomous mobile robot (100) automatically travelling in a navigation mode according to the preset trajectory;
s2: the autonomous mobile robot (100) is controlled to selectively switch from a navigation mode to a yielding mode according to the detected obstacle, wherein the autonomous mobile robot (100) is controlled to avoid and park toward the road side in the yielding mode to yield space for the obstacle.
2. The method according to claim 1, wherein the step S2 comprises:
in the yielding mode, planning a shortest path for the autonomous mobile robot (100) from the current position to the road edge, in particular to a determined distance from the road edge; and
guiding the autonomous mobile robot (100) along the shortest path to the road edge, in particular to a determined distance from the road edge, and stopping the autonomous mobile robot (100) there.
3. The method according to claim 1 or 2, wherein the autonomous mobile robot (100) is parked in the yield mode at a distance from the road edge that is smaller than the distance from the road edge when driving or parking in the navigation mode.
4. A method according to any one of claims 1 to 3, wherein the determination of whether to switch from the navigation mode to the yield mode is initiated in the step S2 only if the obstacle relates to a truck.
5. The method of claim 4, wherein the step S2 further comprises:
in case the obstacle relates to a truck, checking whether the truck relates to a manual truck (200) or to another autonomous truck (300); and
whether to switch from the navigation mode to the yield mode is determined based on a result of the check, wherein the autonomous mobile robot (100) is switched directly from the navigation mode to the yield mode in the case of the manual truck (200).
6. The method of claim 5, wherein the step S2 further comprises:
acquiring priority information of the other autonomous truck (300) with respect to the autonomous mobile robot (100) in case the truck relates to another autonomous truck (300); and
determining whether to switch from the navigation mode to the yield mode according to the priority information, wherein the autonomous mobile robot (100) is controlled to switch from the navigation mode to the yield mode in the case where the priority information indicates that the priority of the autonomous mobile robot (100) is lower than that of the other autonomous truck (300).
7. The method of claim 6, wherein obtaining the priority information of the other autonomous truck (300) relative to the autonomous mobile robot (100) comprises:
receiving via a communication network and/or detecting via sensors the model of the other autonomous truck (300), its order-processing urgency, the likelihood of the other autonomous truck (300) detecting the autonomous mobile robot (100), and/or the dispatch sequence of the other autonomous truck (300) relative to the autonomous mobile robot (100).
8. The method according to any one of claims 1 to 7, wherein the step S2 comprises:
acquiring basic information about the obstacle in case it has been confirmed that the autonomous mobile robot (100) is to be switched from a navigation mode to a yield mode; and
at least one parameter configuration of the yielding mode is influenced as a function of the basic information, wherein, in particular, different parameter configurations are assigned to the yielding mode for the cases where the detected obstacle relates to a manual truck (200) and to another autonomous truck (300).
9. The method according to claim 8, wherein, in case the obstacle relates to a manual truck (200), the distance of the autonomous mobile robot (100) from the road edge when parking in the yield mode is set smaller, and the autonomous mobile robot (100) is controlled to move aside toward the road side in the yield mode at a higher speed and/or with a shorter response time than in case the obstacle relates to another autonomous truck (300).
10. The method according to any one of claims 1 to 9, wherein the method further comprises, prior to step S2, the steps of:
acquiring position information of the autonomous mobile robot (100) based additionally on map data and sensor data;
checking the detourability of the autonomous mobile robot (100) with respect to the obstacle based on the position information; and
initiating the switch to the yielding mode in the step S2 only if the detourability does not meet a preset condition.
11. The method of claim 4, wherein the step S1 further comprises:
acquiring an image of the surroundings of the autonomous mobile robot (100), said image being captured by a camera of the autonomous mobile robot (100) and/or mounted in the surroundings of the autonomous mobile robot (100); and
The truck is identified in the image by means of an image recognition technique based on an artificial neural network.
12. The method of claim 11, wherein the step S1 further comprises:
acquiring basic information of the obstacle by means of at least one other type of sensor than a camera of the autonomous mobile robot (100) and/or installed in the surrounding environment; and
preliminarily determining whether the obstacle relates to a truck based on the basic information, wherein the result of the preliminary determination is at least partially taken into account when identifying the truck in the image captured by means of the camera.
13. The method of claim 5, wherein checking whether the truck relates to a manual truck (200) comprises:
checking whether one or more persons (201) are located in the interior space of the detected truck;
checking whether the detected truck has an autonomous driving function; and/or
sending an inquiry request over a communication network to the detected truck or to the dispatch system as to whether the detected truck is being manually guided.
14. Device (1) for controlling the travel of an autonomous mobile robot (100), in particular an autonomous carrier, the device (1) being for performing the method according to any one of claims 1 to 13, the device (1) comprising:
A detection module (10) configured to be able to detect obstacles in the surroundings of the autonomous mobile robot (100) and/or in a preset trajectory to be travelled during an automatic travelling of the autonomous mobile robot (100) in a navigation mode according to the preset trajectory; and
a control module (20) configured to control the autonomous mobile robot (100) to selectively switch from a navigation mode to a yielding mode in accordance with the detected obstacle, wherein the autonomous mobile robot (100) is controlled to avoid and park toward a road side in the yielding mode to yield space for the obstacle.
15. A scheduling method for autonomous mobile robots, in particular autonomous carriers, the scheduling method comprising the steps of:
detecting an event that two autonomous mobile robots (61, 62) meet during operation in a navigation mode; and
in response to detecting the event, a scheduling instruction is issued to at least one of the two autonomous mobile robots (61, 62) to cause one of the two autonomous mobile robots (61, 62) to switch from a navigation mode to a yielding mode in which one of the two autonomous mobile robots (61, 62) is required to avoid and park toward the road side to yield space for the other of the two autonomous mobile robots (61, 62).
16. A scheduling system (600) for an autonomous mobile robot, in particular an autonomous carrier, the scheduling system (600) being for performing the scheduling method according to claim 15, the scheduling system (600) comprising:
an event discovery unit (610) configured to be able to detect an event in which two autonomous mobile robots (61, 62) meet during operation in a navigation mode; and
an instruction issuing unit (620) configured to be able to issue a scheduling instruction to at least one of the two autonomous mobile robots (61, 62) to cause one of the two autonomous mobile robots (61, 62) to switch from the navigation mode to a yielding mode in which that one of the two autonomous mobile robots (61, 62) is required to move aside and park toward the road side to yield space for the other of the two autonomous mobile robots (61, 62).
17. A computer program product comprising a computer program which, when executed by a computer, implements the method according to any one of claims 1 to 13 or the scheduling method according to claim 15.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111529568.9A CN116263600A (en) | 2021-12-14 | 2021-12-14 | Method and device for controlling the travel of an autonomous mobile robot |
PCT/CN2022/124836 WO2023109281A1 (en) | 2021-12-14 | 2022-10-12 | Method and device for controlling driving of autonomous mobile robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116263600A true CN116263600A (en) | 2023-06-16 |
Family
ID=86722220
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111529568.9A Pending CN116263600A (en) | 2021-12-14 | 2021-12-14 | Method and device for controlling the travel of an autonomous mobile robot |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116263600A (en) |
WO (1) | WO2023109281A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117032226B (en) * | 2023-08-08 | 2024-02-02 | 贵州师范学院 | Automatic obstacle avoidance method for robot |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106324619A (en) * | 2016-10-28 | 2017-01-11 | 武汉大学 | Automatic obstacle avoiding method of substation inspection robot |
TWI665538B (en) * | 2016-12-12 | 2019-07-11 | 日商日本電產新寶股份有限公司 | A vehicle performing obstacle avoidance operation and recording medium storing computer program thereof |
WO2019023443A1 (en) * | 2017-07-28 | 2019-01-31 | Crown Equipment Corporation | Traffic management for materials handling vehicles in a warehouse environment |
CN109108974B (en) * | 2018-08-29 | 2020-07-07 | 广州市君望机器人自动化有限公司 | Robot avoidance method and device, background server and storage medium |
CN111930127B (en) * | 2020-09-02 | 2021-05-18 | 广州赛特智能科技有限公司 | Robot obstacle identification and obstacle avoidance method |
CN113074728B (en) * | 2021-03-05 | 2022-07-22 | 北京大学 | Multi-AGV path planning method based on jumping point routing and collaborative obstacle avoidance |
CN113189987A (en) * | 2021-04-19 | 2021-07-30 | 西安交通大学 | Complex terrain path planning method and system based on multi-sensor information fusion |
- 2021-12-14: CN application CN202111529568.9A filed (publication CN116263600A, status: Pending)
- 2022-10-12: WO application PCT/CN2022/124836 filed (publication WO2023109281A1)
Also Published As
Publication number | Publication date |
---|---|
WO2023109281A1 (en) | 2023-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101511923B1 (en) | Vehicle remote operation system and on-board device | |
US9493163B2 (en) | Driving support apparatus for vehicle | |
CN106470886B (en) | Method for establishing the ambient enviroment model of means of transport | |
JP6413962B2 (en) | Travel control device | |
CN113728210A (en) | Autonomous and user-controlled vehicle summons to targets | |
CN109823339B (en) | Vehicle traffic light intersection traffic control method and control system | |
CN109733392B (en) | Obstacle avoidance method and device | |
CN111258318A (en) | Automatic driving system of sanitation vehicle and control method thereof | |
CN107207006A (en) | The operator that the information on the parking stall on identifying is reported to the long-distance operating device to the self-stopping parking assistance system that can be controlled via long-distance operating device for motor vehicle | |
EP3919336B1 (en) | Travel control method and travel control device for vehicle | |
JP6047083B2 (en) | Parking assistance system | |
CN107074280B (en) | Method and device for operating a vehicle | |
JP7058236B2 (en) | Vehicle control devices, vehicle control methods, and programs | |
JP7058234B2 (en) | Vehicle control device, information providing device, information providing system, vehicle control method, information providing method, and program | |
US11543820B2 (en) | Vehicle control apparatus, vehicle control method, and storage medium | |
CN114475664B (en) | Automatic driving vehicle lane-changing coordination control method for congested road section | |
WO2023109281A1 (en) | Method and device for controlling driving of autonomous mobile robot | |
CN113795802A (en) | Autonomous mine car operation | |
CN111216711B (en) | Control method and system for intelligent driving automobile to automatically enter and exit station | |
US20230111226A1 (en) | Method for supporting an automatically driving vehicle | |
US20210269041A1 (en) | Vehicle Control Method and Vehicle Control Device | |
CN113196358A (en) | Signal lamp system analysis system required by automatic driving path | |
US10466703B2 (en) | Method for controlling at least one vehicle, which moves at least partially autonomously within an operating environment, and structure | |
JP7220192B2 (en) | VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM | |
US11491985B2 (en) | Process and system for sensor sharing for an autonomous lane change |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||