CN112346445A - Distribution robot, obstacle avoidance method thereof and computer storage medium - Google Patents


Info

Publication number
CN112346445A
CN112346445A (application CN201910726087.3A)
Authority
CN
China
Prior art keywords
robot
obstacle
information
distribution
delivery robot
Prior art date
Legal status (assumed by Google; not a legal conclusion)
Pending
Application number
CN201910726087.3A
Other languages
Chinese (zh)
Inventor
贺智威
朱元远
刘祖兵
邢栋
Current Assignee (as listed by Google; may be inaccurate)
Candela Shenzhen Technology Innovations Co Ltd
Original Assignee
Candela Shenzhen Technology Innovations Co Ltd
Priority date (assumed by Google; not a legal conclusion)
Filing date
Publication date
Application filed by Candela Shenzhen Technology Innovations Co Ltd filed Critical Candela Shenzhen Technology Innovations Co Ltd
Priority to CN201910726087.3A priority Critical patent/CN112346445A/en
Publication of CN112346445A publication Critical patent/CN112346445A/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present application discloses a delivery robot, an obstacle avoidance method thereof, and a computer storage medium. The obstacle avoidance method comprises the following steps: acquiring environmental information around the delivery robot; determining an obstacle in the current travel path of the delivery robot based on the environmental information; and determining a movement state of the delivery robot based on the obstacle so as to avoid it. In this way, the delivery robot can be prevented from colliding with obstacles, improving its safety.

Description

Distribution robot, obstacle avoidance method thereof and computer storage medium
Technical Field
The present application relates to the technical field of intelligent delivery, and in particular to a delivery robot, an obstacle avoidance method thereof, and a computer storage medium.
Background
In the field of intelligent delivery, the most expensive and least efficient links are at the front and rear ends, i.e., the two ends closest to the sender and the receiver. Because users at these ends are dispersed, an express company must employ more couriers to maintain efficiency; in the prior art, robots are therefore used for delivery in order to reduce labor costs.
A robot travels along a pre-planned path. Because it is unmanned, when an obstacle appears in the travel path the robot may be unable to continue, or may collide with the obstacle, creating risks of economic loss and harm to personal safety. How to achieve obstacle avoidance for the robot is therefore very important.
Disclosure of Invention
To solve the above problems, the present application provides a delivery robot, an obstacle avoidance method thereof, and a computer storage medium, which can prevent the delivery robot from colliding with obstacles and improve its safety.
One technical solution adopted by the present application is to provide an obstacle avoidance method for a delivery robot, the method comprising: acquiring environmental information around the delivery robot; determining an obstacle in the current travel path of the delivery robot based on the environmental information; and determining a movement state of the delivery robot based on the obstacle so as to avoid it.
Acquiring the environmental information around the delivery robot includes: collecting depth information using a top lidar, a front lidar, a rear lidar, and an ultrasonic sensor assembly mounted on the delivery robot, and collecting image information using a camera assembly mounted on the delivery robot; and generating first environmental information around the delivery robot from the depth information and the image information. Determining an obstacle in the current travel path of the delivery robot based on the environmental information includes: acquiring pre-stored second environmental information corresponding to the current travel path; and comparing the first environmental information with the second environmental information to determine the obstacle in the current travel path of the delivery robot.
Determining a movement state of the delivery robot based on the obstacle so as to avoid it includes: determining position information and/or relative speed information between the obstacle and the delivery robot; and changing the movement state of the delivery robot based on the position information and/or the relative speed information so as to avoid the obstacle.
Determining the position information and/or relative speed information between the obstacle and the delivery robot includes: acquiring a first distance and a first azimuth between the obstacle and the delivery robot at a first time; acquiring a second distance and a second azimuth between the obstacle and the delivery robot at a second time; and determining the relative speed information from the first distance, the first azimuth, the second distance, the second azimuth, and the time difference between the first time and the second time.
Determining a movement state of the delivery robot based on the obstacle so as to avoid it includes: judging whether the distance between the obstacle and the delivery robot is greater than a preset first distance threshold; if so, switching the current travel path to another preset travel path; and if not, controlling the delivery robot to replan the current path.
Controlling the delivery robot to replan the current path includes: judging whether the distance between the obstacle and the delivery robot is greater than a preset second distance threshold; if so, controlling the delivery robot to pause or decelerate; and if not, controlling the delivery robot to back up along the current travel path.
The method further comprises: determining a special area around the delivery robot based on the environmental information, the special area comprising at least one of a sunken area, an illegal lane area, a water area, and a lawn area; when the current path is replanned, planning a path through the special area is prohibited.
The method further comprises: judging whether collision information is detected, the collision information being generated when a collision avoidance mechanism provided on the delivery robot is struck; and if so, controlling the delivery robot to pause.
Another technical solution adopted by the present application is to provide a delivery robot comprising a processor and a memory coupled to the processor, the memory being configured to store program data and the processor being configured to execute the program data to implement the method described above.
Another technical solution adopted by the present application is to provide a computer storage medium storing program data which, when executed by a processor, implements the method described above.
The obstacle avoidance method of the delivery robot provided by the present application comprises: acquiring environmental information around the delivery robot; determining an obstacle in the current travel path of the delivery robot based on the environmental information; and determining a movement state of the delivery robot based on the obstacle so as to avoid it. Through this approach, the obstacle can be identified by analyzing the surrounding environmental information, and a corresponding obstacle avoidance scheme can be determined for it. On the one hand, this prevents the delivery robot from colliding with hard objects and damaging the robot or the delivery cabinet it carries; on the other hand, it prevents the robot from colliding with people and creating safety hazards.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort. Wherein:
fig. 1 is a schematic structural diagram of a delivery robot provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of an obstacle avoidance method for a delivery robot according to an embodiment of the present application;
fig. 3 is a schematic diagram comparing first environmental information and second environmental information provided in an embodiment of the present application;
fig. 4 is a schematic diagram of position and speed detection of an obstacle according to an embodiment of the present application;
fig. 5 is a schematic flow chart of adjusting a travel path according to an embodiment of the present application;
fig. 6 is a schematic illustration of a static obstacle provided by an embodiment of the present application;
fig. 7 is another schematic structural diagram of a delivery robot provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer storage medium provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a delivery robot provided in an embodiment of the present application. The delivery robot 10 may specifically be used to deliver delivery cabinets, and may be an outdoor robot that delivers cabinets outdoors or a building robot that delivers cabinets inside a building.
Taking the outdoor robot as an example, the delivery robot 10 includes a main body 100, a driving mechanism 200, a sensor assembly 300, and a controller (not shown).
The driving mechanism 200 is provided on the main body 100 to drive the main body 100 to move. The sensor assembly 300 is also provided on the main body 100 and detects surrounding obstacles while the delivery robot 10 is moving or stopped, thereby obtaining obstacle information. Here, obstacles are not limited to steps, barriers, and the like, but also include traffic information such as traffic lights.
The controller may receive the obstacle information and generate a control command according to the obstacle information to control the driving mechanism 200, so that the driving mechanism 200 drives the main body 100 to avoid the obstacle.
Specifically, the main body 100 includes a bearing portion 110 and a plate portion 120. The plate portion 120 is vertically disposed at one end of the bearing portion 110; that is, the plate portion 120 extends upward from one end of the bearing portion 110 and may be perpendicular to it.
In this embodiment, the bearing portion 110 is mainly used to carry the delivery cabinet and has a roughly plate-shaped structure, and the driving mechanism 200 is disposed on it. Specifically, the driving mechanism 200 includes a chassis disposed at the bottom of the bearing portion 110, driving members, and driving wheels, preferably four driving wheels mounted at the four corners of the chassis. The driving members drive the wheels to rotate or steer.
The sensor assembly 300 is disposed on the bearing portion 110 and the plate portion 120.
In the above embodiment, disposing the plate portion 120 on the bearing portion 110 serves two purposes: on the one hand, it acts as a blocking plate so that the delivery cabinet is better supported on the bearing portion 110; on the other hand, since the height and mounting positions available on the bearing portion 110 are limited, the plate portion 120 provides mounting space for the sensor assembly 300, allowing the environment to be detected in all directions for a better detection effect.
Specifically, the sensor assembly 300 includes a lidar assembly 310, which includes a front lidar 311, a rear lidar 312, and a top lidar 313. The front lidar 311 is disposed at the end of the bearing portion 110 close to the plate portion 120, the rear lidar 312 at the other end away from the front lidar 311, and the top lidar 313 at the end of the plate portion 120 away from the bearing portion 110.
When the lidar assembly 310 detects an obstacle, it generates the relevant obstacle information, such as size and orientation, and sends it to the controller. The controller then generates instruction information from the obstacle information and controls the main body 100 through the driving mechanism 200 to stop, turn, or move so as to avoid the obstacle.
In an exemplary embodiment, the sensor assembly 300 further includes an ultrasonic sensor assembly (not shown) disposed on the plate portion 120 and/or the bearing portion 110 to assist the lidar assembly 310 in environmental detection. Alternatively, the detection area of the lidar assembly 310 and that of the ultrasonic sensor assembly may be at least partially non-coincident.
Specifically, the detection area of the lidar assembly 310 may have blind spots; for example, if the middle area cannot be scanned because of the height relationship between the rear lidar 312 and the top lidar 313, an ultrasonic sensor assembly may be disposed in the middle of the plate portion 120 to compensate.
Meanwhile, the ultrasonic sensor assembly can be used to detect glass: when a number of strip-shaped obstacles are detected, or the ultrasonic sensor detects a planar obstacle, the obstacle can be judged to be glass.
Further, the two ends of the bearing portion 110 are each provided with an avoidance slope 111, and the front lidar 311 and the rear lidar 312 are mounted on the avoidance slopes 111, so that they are not damaged if the delivery robot 10 collides with something.
Specifically, each avoidance slope 111 faces outward and obliquely upward; it can therefore also carry an ultrasonic sensor assembly, which can then detect obliquely upward as well.
In an embodiment, the two ends of the bearing portion 110 are each provided with a buffer 112 that cushions a collision of the delivery robot 10, preventing the impact force from directly damaging the robot body.
A protective cover 314 is provided over the top lidar 313. The size and inclination of the protective cover 314 match the scanning angle of the top lidar 313, so the cover does not interfere with its operation; the cover is made of a transparent material so as not to affect detection.
The obstacle avoidance method of the delivery robot 10 is described below through several embodiments.
Referring to fig. 2, fig. 2 is a schematic flow chart of an obstacle avoidance method for a delivery robot according to an embodiment of the present application. The method includes:
step 21: environmental information around the delivery robot is acquired.
Optionally, depth information may be collected using the top lidar, front lidar, rear lidar, and ultrasonic sensor assembly mounted on the delivery robot, and image information may be collected using a camera assembly mounted on the delivery robot; first environmental information around the delivery robot is then generated from the depth information and the image information.
A lidar is a sensor that detects characteristic quantities such as the position and speed of a target by emitting a laser beam. Optionally, the lidar rotates at a set frequency and emits laser beams over a 360-degree range, or emits them toward a designated sector; it receives the reflected beams and computes the depth of an obstacle from the time difference.
Ultrasonic waves (mechanical waves with a frequency of 20 kHz or more) are a special kind of sound wave and exhibit basic physical properties such as refraction, reflection, and interference. The ultrasonic transmitter emits an ultrasonic signal in a given direction and starts a timer at the moment of emission; the wave propagates through the air, is reflected back as soon as it meets an obstacle, and the receiver stops the timer immediately upon receiving the echo. Since the propagation speed of ultrasound in air is 340 m/s, recording the time t gives the distance s from the emission point to the obstacle as s = 340 · t / 2.
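The timing relation above can be sketched as a one-line computation (a minimal illustration; the function name is ours, not from the patent):

```python
SPEED_OF_SOUND = 340.0  # m/s, propagation speed of ultrasound in air

def echo_distance(round_trip_time_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo: s = 340 * t / 2.

    The timer measures the round trip (emit, reflect, receive),
    so the one-way distance is half the total path length.
    """
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

# An echo received 10 ms after emission corresponds to an obstacle 1.7 m away.
assert abs(echo_distance(0.010) - 1.7) < 1e-9
```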
The camera assembly may specifically include a binocular camera, a telephoto camera, a driving recorder, and the like. The binocular camera can be used to acquire depth images, and the telephoto camera can capture distant objects, for example traffic lights.
Further, the processor fuses the depth information and image information detected by each sensor into an environment layer. Optionally, different depths may be labeled with different colors in the layer; for example, the larger the depth value, the larger the pixel value of the corresponding color.
Optionally, in an embodiment, the sensors may further cooperate with cameras to acquire environmental information: cameras may be mounted on the front, rear, left, and right of the delivery robot to capture environment images, and after acquiring them the processor processes the images by semantic segmentation.
The task of semantic segmentation is to classify every pixel in an image; it is a classical dense-prediction problem whose core techniques are accurate feature extraction and accurate recovery of detail information. The use of fully convolutional networks (FCN) greatly improved the accuracy of semantic segmentation. A convolutional layer obtains output features through linear combinations of the convolution kernel with the input features, and convolutional layers are often stacked to enlarge the receptive field and capture long-range semantic information.
From the segmentation result, different areas in the environment image, such as buildings, roads, lawns, and pools, can be distinguished and labeled with different colors.
Furthermore, the depth image obtained from the depth information of the lidar and ultrasonic sensors can be combined with the semantic segmentation of the color image collected by the camera to obtain more accurate environmental information.
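As a rough sketch of this fusion step (the grid representation, label names, and distance threshold are our assumptions; the patent does not specify an encoding), each cell of the environment layer can keep both its segmentation label and its depth, with near objects on traversable ground flagged as obstacles:

```python
def fuse_environment(depth, labels, near_threshold=2.0):
    """Combine a depth map with semantic-segmentation labels into one
    environment layer: each cell keeps its region label and depth, and
    traversable cells ('road') closer than the threshold are flagged."""
    fused = []
    for drow, lrow in zip(depth, labels):
        fused.append([
            ("obstacle" if lbl == "road" and d < near_threshold else lbl, d)
            for d, lbl in zip(drow, lrow)
        ])
    return fused

depth  = [[5.0, 1.2], [4.0, 3.0]]               # meters, from lidar/ultrasound
labels = [["road", "road"], ["lawn", "road"]]    # from semantic segmentation
layer = fuse_environment(depth, labels)
assert layer[0][1] == ("obstacle", 1.2)          # close object on the road
assert layer[1][0] == ("lawn", 4.0)              # lawn kept as a region label
```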
Step 22: an obstacle in a current travel path of the delivery robot is determined based on the environmental information.
Alternatively, since the environmental information includes depth information and image information, obstacle information (including the type, position, and speed of the obstacle) may be obtained through deep learning. Specifically, a neural network model may be built in advance and trained on sample data (depth information and image information); the trained model then infers obstacle information from newly input depth and image information.
Optionally, pre-stored second environmental information corresponding to the current travel path is acquired, and the first environmental information is compared with it to determine the obstacle in the current travel path of the delivery robot.
The second environmental information is environmental information collected in advance when no obstacle is present. It may be collected manually, for example by walking the route with a hand-held camera, or by a delivery robot after the route has been confirmed clear of obstacles.
When the first and second environmental information are compared, the first layer corresponding to the first environmental information is compared against the second layer corresponding to the second environmental information, and the similarity of each area in the two layers determines whether an obstacle exists.
As shown in fig. 3, fig. 3 is a schematic diagram comparing first environmental information and second environmental information provided in an embodiment of the present application: the left side is the second layer corresponding to the second environmental information, the right side is the first layer corresponding to the first environmental information, the shaded portion represents the delivery robot, and the arrow represents the direction of its travel path.
By comparing the first layer with the second layer, if obstacle A is detected, it can be determined to be an obstacle on the travel path of the delivery robot that must be considered; if obstacle B is detected, it can be determined not to be on the travel path, and it need not be considered.
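The layer comparison of fig. 3 can be sketched as a cell-by-cell depth difference restricted to the path cells (the grid encoding, tolerance, and names are illustrative assumptions, not from the patent):

```python
def find_new_obstacles(first_layer, second_layer, path_cells, tol=0.5):
    """Compare the live layer (first) against the pre-stored obstacle-free
    layer (second) cell by cell; a cell whose depth differs by more than
    `tol` holds a new object. Only cells on the travel path are reported."""
    hits = []
    for r, (row_a, row_b) in enumerate(zip(first_layer, second_layer)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > tol and (r, c) in path_cells:
                hits.append((r, c))
    return hits

second = [[5.0, 5.0, 5.0], [5.0, 5.0, 5.0]]   # pre-stored, obstacle-free
first  = [[5.0, 2.0, 5.0], [5.0, 5.0, 1.0]]   # live scan: two new objects
path   = {(0, 1)}                             # cells on the travel path
# Only the object on the path (like obstacle A) is reported; the off-path
# object (like obstacle B) is ignored.
assert find_new_obstacles(first, second, path) == [(0, 1)]
```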
Step 23: based on the obstacle, a movement state of the delivery robot is determined to avoid the obstacle.
Alternatively, the delivery robot may predict the trajectory of the obstacle by learning from the input depth and image information, and then determine its own travel trajectory so as to avoid the obstacle. For example, whether the delivery robot and the obstacle would collide at a future time can be determined from the type, moving direction, and moving speed of the obstacle.
Different movement states can be determined from the position information and relative speed information between the obstacle and the delivery robot.
For example, the delivery robot may be controlled to decelerate or pause when the obstacle is far away, and to pause or back up when the obstacle is close.
Specifically, a first distance and a first azimuth between an obstacle and a delivery robot are acquired at a first moment; acquiring a second distance and a second azimuth angle between the obstacle and the distribution robot at a second moment; and determining relative speed information between the obstacle and the delivery robot based on the first distance, the first azimuth, the second distance, the second azimuth and the time difference between the first time and the second time.
As shown in fig. 4, fig. 4 is a schematic diagram of position and speed detection of an obstacle according to an embodiment of the present application. The obstacle is detected at point C at the first moment, at distance OC (point O is the position of the sensor), and at point C′ at the second moment, at distance OC′; the angle between OC and OC′ is θ. The distance CC′ can therefore be calculated through a known functional relation, and combining it with the time difference between the first and second moments gives the moving speed of the obstacle.
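The functional relation for CC′ follows from the law of cosines applied to triangle OCC′: CC′² = OC² + OC′² − 2·OC·OC′·cos θ. A minimal sketch (function name and units are our assumptions):

```python
import math

def obstacle_speed(oc: float, oc2: float, theta_rad: float, dt: float) -> float:
    """Relative speed of an obstacle seen at distance OC at the first moment
    and OC' at the second, with angle theta between the two bearings.
    Law of cosines: CC'^2 = OC^2 + OC'^2 - 2*OC*OC'*cos(theta)."""
    cc = math.sqrt(oc**2 + oc2**2 - 2.0 * oc * oc2 * math.cos(theta_rad))
    return cc / dt

# Obstacle at 3 m, then at 4 m with a 90-degree bearing change, 1 s apart:
# CC' = 5 m, so the relative speed is 5 m/s.
assert abs(obstacle_speed(3.0, 4.0, math.pi / 2, 1.0) - 5.0) < 1e-9
```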
Referring to fig. 5, fig. 5 is a schematic flow chart of adjusting the travel path according to an embodiment of the present application. Step 23 may specifically include:
Step 231: judging whether the distance between the obstacle and the delivery robot is greater than a preset first distance threshold.
Step 232: switching the current travel path to another preset travel path.
Optionally, the delivery robots are connected to a server via a network, and one server schedules several delivery robots. The server stores in advance a plurality of travel paths from the start position to the end position, assigns a path to each delivery robot for scheduling, and acquires position information sent by the robots in real time to track each robot's current position along its path. For example, as shown in the following table:
Travel path     Delivery robot
Route 1         Robot A
Route 2         Robot B
Route 3         (idle)
In one embodiment, robot A travels on route 1 and robot B on route 2; if an obstacle is detected on route 1, the server may switch robot A to route 3.
In another embodiment, it may first be checked whether robot B's direction of travel matches robot A's (i.e., whether both run from the start position to the end position, or both the reverse); if so, robot A may be switched to route 2. In addition, a maximum number of robots per route may be set in advance; if the number of robots on a target route is below that maximum, other delivery robots may be switched onto it.
Optionally, each delivery robot may also store a plurality of travel routes in advance, so that when an obstacle is encountered on one route, the delivery robot can flexibly switch to another.
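The server-side route-switching rules described above (pick an alternate route, respect the per-route robot cap, only share a route with robots moving the same way) can be sketched as follows. The function and data-structure names are hypothetical; the patent does not specify an implementation:

```python
def pick_switch_route(routes, current, direction, max_per_route):
    """Pick an alternate route when an obstacle blocks `current`.

    routes: dict mapping route name -> list of (robot, direction) pairs
    currently assigned to it; direction: travel direction of the robot
    being rescheduled ('forward' = start->end, 'reverse' = end->start).
    Returns the first suitable route name, or None if there is none.
    """
    for name, assigned in routes.items():
        if name == current:
            continue
        # Respect the preset maximum number of robots per route.
        if len(assigned) >= max_per_route:
            continue
        # Only share a route with robots moving in the same direction
        # (an idle route trivially satisfies this).
        if all(d == direction for _, d in assigned):
            return name
    return None  # no suitable route: fall back to re-planning in place

routes = {"route1": [("A", "forward")],
          "route2": [("B", "forward")],
          "route3": []}
# Robot A (forward, on route1) encounters an obstacle.
new_route = pick_switch_route(routes, "route1", "forward", max_per_route=2)
```

Here robot A is moved onto route 2, since robot B is traveling in the same direction; a robot traveling in the opposite direction would instead be sent to the idle route 3.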
Step 233: if not, control the delivery robot to re-plan the current path.
Path re-planning means that the robot avoids the obstacle by pausing, changing speed, turning, and the like; it may briefly leave the current path and return to the original travel path once the obstacle has been avoided.
Optionally, when the current path is re-planned, it may be determined whether the distance between the obstacle and the delivery robot is greater than a preset second distance threshold; if so, the delivery robot is controlled to pause or decelerate; if not, the delivery robot is controlled to reverse along the current travel path.
In an embodiment, the movement state of the delivery robot may be determined according to the distance between the delivery robot and the obstacle.
If the distance between the obstacle and the delivery robot is greater than the preset second distance threshold, the obstacle is still far away and the delivery robot can decelerate gradually; if the obstacle leaves the path after a period of time, the delivery robot resumes its initial speed and continues traveling. For example, when a pedestrian appears ahead on the path, the robot detects that the obstacle (a person) is still far away and actively decelerates; typically, the pedestrian sees the delivery robot yielding and steps off its travel path, the robot detects that the obstacle has disappeared, and the initial speed is restored.
If the distance between the obstacle and the delivery robot is smaller than the preset second distance threshold, the obstacle is close and the delivery robot pauses or reverses immediately; if the obstacle leaves the path after a period of time, the delivery robot resumes its initial speed and continues traveling. For example, when a pet or child suddenly darts in from the side, the delivery robot pauses or reverses immediately to avoid causing injury.
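The second-threshold decision described in this passage reduces to a simple comparison; a minimal sketch (the return labels and sample values are illustrative, not from the patent):

```python
def avoidance_action(distance, second_threshold):
    """Choose a motion state from the obstacle distance.

    distance: current obstacle-to-robot distance (meters);
    second_threshold: the preset second distance threshold (meters).
    """
    if distance > second_threshold:
        # Far: slow down gradually; the initial speed is restored once
        # the obstacle leaves the path.
        return "decelerate"
    # Close: stop or back up immediately to avoid causing injury.
    return "pause_or_reverse"

action_far = avoidance_action(3.0, second_threshold=1.5)
action_near = avoidance_action(1.0, second_threshold=1.5)
```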
Alternatively, when the current path is re-planned, adjustments to speed, direction, and so on may be made according to the relative speed between the obstacle and the delivery robot.
For example, if the obstacle is a stationary object such as a stone, the delivery robot may determine the size of the obstacle and judge whether it can be bypassed.
As shown in fig. 6, fig. 6 is a schematic view of a stationary obstacle provided in an embodiment of the present application. Specifically, the distance L between the obstacle and the edge of the drivable path (e.g., a wall) may be determined, and it is judged whether the width of the delivery robot allows it to pass through this gap of width L. If so, the drive mechanism of the delivery robot continuously adjusts the robot's posture so that it passes through the area.
In another embodiment, if the delivery robot cannot bypass the obstacle, it may reverse and reselect another path, or report to the server to request that the obstacle be removed.
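The bypass decision for a stationary obstacle can be sketched as a width check against the gap L, with a fallback matching the behavior described above. The clearance margin and label strings are assumptions, not specified in the patent:

```python
def plan_bypass(gap_width, robot_width, margin=0.1):
    """Decide whether the robot can pass a stationary obstacle.

    gap_width: distance L between the obstacle and the path edge (meters);
    robot_width: width of the delivery robot (meters);
    margin: extra clearance allowed for posture adjustment (assumed value).
    """
    if gap_width >= robot_width + margin:
        # The drive mechanism adjusts posture and squeezes through the gap.
        return "bypass"
    # Too narrow: reverse to reselect another path, or report to the server.
    return "reverse_or_report"

wide_gap = plan_bypass(gap_width=1.0, robot_width=0.6)
narrow_gap = plan_bypass(gap_width=0.6, robot_width=0.6)
```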
It will be understood that when the delivery robot performs path planning, care must be taken that the new path does not enter certain special areas, such as sunken areas, illegal lane areas, water areas, and lawn areas. When the current path is re-planned, planning a path over these special areas is prohibited.
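One common way to enforce such a prohibition is to mark the special areas as untraversable in the planner's map, so that no re-planned path can cross them. The grid representation below is an assumption for illustration; the patent does not fix a map format:

```python
def forbid_special_areas(grid, special_cells):
    """Mark special areas (sunken, illegal-lane, water, lawn) untraversable.

    grid: 2D list of cells, 0 = free, 1 = blocked;
    special_cells: iterable of (row, col) indices covering the special areas.
    A planner run on the returned grid can then never route through them.
    """
    for r, c in special_cells:
        grid[r][c] = 1  # treat the cell exactly like a hard obstacle
    return grid

free = [[0, 0, 0],
        [0, 0, 0]]
masked = forbid_special_areas(free, [(0, 1), (1, 2)])  # e.g. a lawn strip
```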
In addition, the anti-collision mechanism of the delivery robot may be used to detect whether the robot has been struck by an external force; if so, the delivery robot is immediately controlled to pause or reverse, and an alarm is raised.
Different from the prior art, the obstacle avoidance method for the delivery robot provided by this embodiment includes: acquiring environmental information around the delivery robot; determining an obstacle in the current travel path of the delivery robot based on the environmental information; and determining a movement state of the delivery robot based on the obstacle so as to avoid it. In this manner, the obstacle can be identified by analyzing the surrounding environment information, and a corresponding obstacle avoidance scheme can be determined for it; on the one hand, this prevents the delivery robot from colliding with hard objects and damaging the robot or the delivery cabinet it carries, and on the other hand, it avoids the safety hazard of the delivery robot colliding with people.
Referring to fig. 7, fig. 7 is another structural schematic diagram of a delivery robot according to an embodiment of the present application, where the delivery robot 70 includes a processor 71 and a memory 72 connected to the processor 71; the memory 72 is used to store program data, and the processor 71 is used to execute the program data to implement the following method: acquiring environmental information around the delivery robot; determining an obstacle in the current travel path of the delivery robot based on the environmental information; and determining a movement state of the delivery robot based on the obstacle to avoid the obstacle.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: collecting depth information by using a top laser radar, a front laser radar, a rear laser radar and an ultrasonic sensor assembly arranged on the distribution robot, and collecting image information by using a camera assembly arranged on the distribution robot; generating first environment information around the distribution robot according to the depth information and the image information; acquiring prestored second environment information corresponding to the current driving path; and comparing the first environment information with the second environment information to determine the obstacle in the current driving path of the delivery robot.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: determining position information and/or relative speed information between the obstacle and the delivery robot; and changing the movement state of the delivery robot based on the position information and/or the relative speed information to avoid the obstacle.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: acquiring a first distance and a first azimuth angle between the obstacle and the delivery robot at a first moment; acquiring a second distance and a second azimuth angle between the obstacle and the delivery robot at a second moment; and determining relative speed information between the obstacle and the delivery robot based on the first distance, the first azimuth angle, the second distance, the second azimuth angle and the time difference between the first moment and the second moment.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: judging whether the distance between the obstacle and the distribution robot is greater than a preset first distance threshold; if so, switching the current driving path to another preset driving path; and if not, controlling the distribution robot to re-plan the current path.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: judging whether the distance between the obstacle and the distribution robot is greater than a preset second distance threshold; if so, controlling the distribution robot to pause or decelerate; and if not, controlling the distribution robot to reverse along the current driving path.
Optionally, in an embodiment, the processor 71 is further configured to execute the program data to implement the following method: determining a special area around the delivery robot based on the environmental information, the special area comprising at least one of a sunken area, an illegal lane area, a water area and a lawn area; and when the current path is re-planned, prohibiting the path from being planned over the special area.
In addition, the delivery robot 70 may further include a driving mechanism, a sensor assembly, and the like, which will not be described here.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a computer storage medium 80 according to an embodiment of the present application, where the computer storage medium 80 stores program data 81, and the program data 81, when executed by a processor, is used to implement the following method:
acquiring environmental information around the distribution robot; determining an obstacle in a current travel path of the delivery robot based on the environmental information; based on the obstacle, a movement state of the delivery robot is determined to avoid the obstacle.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units in the other embodiments described above may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made according to the content of the present specification and the accompanying drawings, or which are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. An obstacle avoidance method for a delivery robot, the method comprising:
acquiring environmental information around the distribution robot;
determining an obstacle in a current travel path of the delivery robot based on the environmental information;
determining a motion state of the delivery robot based on the obstacle to avoid the obstacle.
2. The method of claim 1,
the acquiring environmental information around the delivery robot includes:
acquiring depth information by using a top laser radar, a front laser radar, a rear laser radar and an ultrasonic sensor assembly which are arranged on the distribution robot, and acquiring image information by using a camera assembly which is arranged on the distribution robot;
generating first environment information around the distribution robot according to the depth information and the image information;
the determining of the obstacle in the current travel path of the delivery robot based on the environmental information includes:
acquiring prestored second environment information corresponding to the current driving path;
and comparing the first environmental information with the second environmental information to determine the obstacles in the current driving path of the delivery robot.
3. The method of claim 1,
the determining a motion state of the delivery robot to avoid the obstacle based on the obstacle includes:
determining position information and/or relative speed information between the obstacle and the delivery robot;
changing a movement state of the delivery robot to avoid the obstacle based on the position information and/or the relative speed information.
4. The method of claim 3,
the determining position information and/or relative velocity information between the obstacle and the delivery robot includes:
acquiring a first distance and a first azimuth between the obstacle and the delivery robot at a first moment;
acquiring a second distance and a second azimuth angle between the obstacle and the distribution robot at a second moment;
determining relative speed information between the obstacle and the delivery robot based on the first distance, the first azimuth, the second distance, the second azimuth, and a time difference between the first time and the second time.
5. The method of claim 1,
the determining a motion state of the delivery robot to avoid the obstacle based on the obstacle includes:
judging whether the distance between the obstacle and the distribution robot is greater than a preset first distance threshold;
if so, switching the current driving path into another preset driving path;
and if not, controlling the distribution robot to replan the current path.
6. The method of claim 5,
the controlling the delivery robot to re-plan the current path includes:
judging whether the distance between the obstacle and the distribution robot is greater than a preset second distance threshold;
if yes, controlling the distribution robot to pause or decelerate;
and if not, controlling the distribution robot to back up according to the current driving path.
7. The method of claim 5,
the method further comprises the following steps:
determining a special area around the delivery robot based on the environmental information; wherein the special area comprises at least one of a sunken area, an illegal lane area, a water area and a lawn area;
and when the current path is replanned, prohibiting the path from being planned on the special area.
8. The method of claim 5,
the method further comprises the following steps:
judging whether impact information is detected; wherein the impact information is generated when an anti-collision mechanism arranged on the distribution robot is impacted;
and if so, controlling the distribution robot to pause.
9. A dispensing robot, characterized in that the dispensing robot comprises a processor and a memory connected to the processor for storing program data, the processor being adapted to execute the program data for implementing the method according to any of claims 1-8.
10. A computer storage medium, characterized in that program data are stored in the computer storage medium, which program data, when executed by a processor, are adapted to carry out the method of any one of claims 1-8.
CN201910726087.3A 2019-08-07 2019-08-07 Distribution robot, obstacle avoidance method thereof and computer storage medium Pending CN112346445A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910726087.3A CN112346445A (en) 2019-08-07 2019-08-07 Distribution robot, obstacle avoidance method thereof and computer storage medium

Publications (1)

Publication Number Publication Date
CN112346445A true CN112346445A (en) 2021-02-09

Family

ID=74366695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910726087.3A Pending CN112346445A (en) 2019-08-07 2019-08-07 Distribution robot, obstacle avoidance method thereof and computer storage medium

Country Status (1)

Country Link
CN (1) CN112346445A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113807551A (en) * 2021-09-15 2021-12-17 上汽通用五菱汽车股份有限公司 Distribution method, robot, device and readable storage medium
CN114253257A (en) * 2021-11-23 2022-03-29 广东嘉腾机器人自动化有限公司 Mobile robot path driving control method and storage device
CN114838736A (en) * 2022-04-29 2022-08-02 深圳优地科技有限公司 Obstacle avoidance path planning method and device, unmanned vehicle and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0556031A2 (en) * 1992-02-10 1993-08-18 Honda Giken Kogyo Kabushiki Kaisha System for obstacle avoidance path planning for multiple-degree-of-freedom mechanism
CN101943916A (en) * 2010-09-07 2011-01-12 陕西科技大学 Kalman filter prediction-based robot obstacle avoidance method
CN105008955A (en) * 2013-03-04 2015-10-28 松下知识产权经营株式会社 Vehicular obstacle detection device, and vehicular obstacle detection system
EP2952301A1 (en) * 2014-06-05 2015-12-09 Aldebaran Robotics Humanoid robot with collision avoidance and trajectory recovery capabilities
CN108733045A (en) * 2017-09-29 2018-11-02 北京猎户星空科技有限公司 Robot and its barrier-avoiding method and computer readable storage medium
CN109017786A (en) * 2018-08-09 2018-12-18 北京智行者科技有限公司 Vehicle obstacle-avoidance method
CN109508019A (en) * 2018-12-28 2019-03-22 北京猎户星空科技有限公司 A kind of motion planning and robot control method, apparatus and storage medium
CN109583384A (en) * 2018-11-30 2019-04-05 百度在线网络技术(北京)有限公司 Barrier-avoiding method and device for automatic driving car


Similar Documents

Publication Publication Date Title
CN112346445A (en) Distribution robot, obstacle avoidance method thereof and computer storage medium
CN111090277B (en) Method, apparatus and storage medium for travel control
US10908264B2 (en) LIDAR with an adaptive high-intensity zone
US10656277B1 (en) Adaptive control of ladar system camera using spatial index of prior ladar return data
CN108007452B (en) Method and device for updating environment map according to obstacle and robot
US6542111B1 (en) Path prediction for vehicular collision warning system
Fletcher et al. The MIT–Cornell collision and why it happened
JP5062364B2 (en) Autonomous mobile body and control method thereof
US20100026555A1 (en) Obstacle detection arrangements in and for autonomous vehicles
US8095313B1 (en) Method for determining collision risk for collision avoidance systems
KR101338246B1 (en) Collision prevention system of mobile robot in unknown environment and method thereof
US20240085921A1 (en) System for obstacle detection
US20180157925A1 (en) System and method for object recognition and ranging by deformation of projected shapes in a multimodal vision and sensing system for autonomous devices
KR102328506B1 (en) System and method for colleting unmanned public information
Liang et al. Detection of malicious intent in non-cooperative drone surveillance
CN107643751A (en) The slope recognition methods of intelligent walking equipment and system
JP2998125B2 (en) Object size determination method, object size determination device, distance detection method, and distance detection device using laser radar
JP6895741B2 (en) Obstacle detection device and obstacle detection method
US11731662B2 (en) Autonomous vehicle system for detecting pedestrian presence
JP6900214B2 (en) Obstacle detection device and obstacle detection method
KR102581196B1 (en) Airport robot and computer readable recording medium of performing operating method of thereof
JP3218865B2 (en) Route search device incorporating time axis into search space
CN114019972A (en) Obstacle avoidance method and device for unmanned sweeper, electronic equipment and storage medium
Patel et al. Active sensing for high-speed offroad driving
JP2020001562A (en) Detection device, method of preventing substance deposition on detection device, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination