CN111338352A - Robot autonomous path planning method - Google Patents
- Publication number: CN111338352A (application CN202010214045.4A)
- Authority: CN (China)
- Prior art keywords: point, value, robot, obstacle, points
- Prior art date
- Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS; G05—CONTROLLING; REGULATING; G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES; G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots; G05D1/02—Control of position or course in two dimensions; G05D1/021—specially adapted to land vehicles
- G05D1/024—using optical position detecting means, using obstacle or wall sensors in combination with a laser
- G05D1/0221—with means for defining a desired trajectory, involving a learning process
- G05D1/0257—using a radar
- G05D1/0276—using signals provided by a source external to the vehicle
Abstract
The invention relates to a robot autonomous path planning method. While the robot walks normally, it localizes itself using the peripheral contour information fed back by a laser ranging sensor, and judges from the ranging signal whether an obstacle is present. A detected obstacle is segmented to obtain the feature points of the obstacle ahead of the robot, and these feature points are tested against the envelope trajectory of the planned path for an inclusion relation. If they fall inside the envelope, autonomous path planning is started, which improves the running efficiency of the robot in a complex operating environment.
Description
Technical Field
The invention belongs to the technical field of robot control, and particularly relates to a method for planning an autonomous path of a robot.
Background
With the continuous development of robot control technology, intelligent transfer robots are being widely applied in industrial scenarios such as warehouse logistics, production transportation and e-commerce sorting. As these robots are deployed, their safety has drawn much attention, and active obstacle-avoidance sensing devices are widely used in robot design, for example ultrasonic distance sensors, infrared distance sensors, laser ranging sensors and point-cloud obstacle sensors. However, most robots equipped with such ranging sensors only stop and wait, continuing along the predetermined path after the obstacle is removed. If the obstacle remains, the robot waits in place indefinitely, which greatly reduces its operating efficiency.
A technical solution is therefore desirable in which the robot, after detecting an obstacle, autonomously detours around it and passes it, so as to solve the above technical problems.
Disclosure of Invention
To solve these problems, the invention provides an autonomous path planning method with which a robot, after detecting an obstacle, detours around it autonomously and passes it, improving the robot's operating efficiency in a complex operating environment.
The specific technical scheme is as follows: a method for planning an autonomous path of a robot comprises the following steps:
Step 101: the robot runs along a preset walking path under a scheduling system, and acquires its coordinate position on the map in real time;
Step 102: during normal walking, the absolute coordinate position and heading angle of the robot in the map frame are calculated by a guidance-and-positioning algorithm from the peripheral contour information fed back by the laser ranging sensor;
Step 103: judge from the ranging signal fed back by the laser ranging sensor whether an obstacle exists; if no obstacle signal is returned within a certain distance, return to step 101 and continue running forward; if an obstacle is found, go to step 104;
Step 104: if an obstacle signal is judged to exist ahead of the robot, segment the obstacle to obtain its feature points. The segmentation method comprises the following steps:
(1) take the initial angle value as the starting closest point, left boundary point and right boundary point;
(2) take the next angle value (the angle step is determined by the sensor) and calculate the change relative to the previous value;
(3) if the change is less than -200 mm, treat the data up to the previous angle value as one obstacle; if the change is less than 200 mm, update the left boundary point and the closest point; if the change is greater than 200 mm, take the next 10 values and check whether the change remains greater than 200 mm; if so, treat the data up to the previous angle value as one obstacle, otherwise update the left boundary point and the closest point;
(4) if the terminal angle has been reached, close the current data as one obstacle; otherwise return to step (2);
Step 105: after the obstacle feature points are segmented and extracted, segment the path the robot is about to travel to form an envelope curve;
the path segmentation method comprises the following steps:
if the minimum curvature radius of a curve section is at least 10000 mm, the whole section is treated as equivalent to a straight line;
if the minimum curvature radius of the curve section is less than 10000 mm, the section needs to be split:
(1) compute the starting parameter u and the corresponding curvature radius as the start value of the current piece;
(2) increase u by a step of 0.005 and compute the new curvature radius; if u ≥ 1.0, generate a small straight-line segment and finish the segmentation;
(3) if the starting curvature radius of the piece is greater than 50000 mm and the new curvature radius is less than 50000 mm, generate a small straight-line segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
(4) if the starting curvature radius of the piece is less than 50000 mm and the new curvature radius is greater than 50000 mm, generate a small arc segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
(5) if the new curvature radius differs from the starting curvature radius of the piece by more than 5% of the starting value, generate a small segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
after the robot path is segmented, the envelope range of the robot on each segmented piece must be calculated, as follows:
1. when the piece is a straight segment:
to absorb motion deviation of the robot and the deviation introduced by the equivalent splitting, the mechanical size of the robot is expanded; at present all four edges of the robot are expanded outward by 50 mm. From the straight segment EF, the corresponding rectangle with four vertices ABCD can be generated, and the coordinates of the four points A, B, C and D are computable:
vehicle width: W = actual vehicle width + 50 × 2;
vehicle front length: L1 = actual front length + 50;
vehicle rear length: L2 = actual rear length + 50;
2. when the piece is a curve segment:
the curve obtained by splitting is not a circular arc, but is treated as one for ease of calculation. The start and end points of the arc are those of the curve, and the equivalent arc radius is the average of the curvature radii at the curve's start and end points; the equivalent centre point O follows from the geometric properties of the arc, and a partial ring O-ABCD can then be generated from the split curve segment EF;
Step 106: after the envelope is formed, the obstacle feature points are placed against the envelope trajectory to judge whether an inclusion relation exists. The judgment method is as follows:
the area swept by the vehicle's motion is equivalent to a set of rectangles and partial rings, so judging whether a feature point lies inside the envelope means judging whether it lies in one or more of these rectangles and partial rings;
(1) judging whether a point is inside the rectangle:
assuming a point P and a rectangle ABCD, the sufficient condition for P to lie within the rectangle is:
(2) judging whether a point is inside a partial ring:
assuming a point P and a partial ring O-ABCD, the sufficient condition for P to lie within the partial ring is:
Step 107: when an obstacle feature point is judged to lie inside an envelope rectangle or partial ring, autonomous path planning starts. Specifically, the method comprises the following steps:
A. basic type-value point selection:
angle selection: the direction of the vector formed by the previous obstacle point and the projection of the obstacle on the initial path is taken as the angle of the current type-value point;
coordinate selection: starting from the obstacle feature point, extend outward by half the vehicle width (including the expansion width) perpendicular to the type-value point angle to obtain the type-value point coordinates;
B. expansion analysis:
(1) the type-value points selected in step A are not always reachable; the rectangle formed by the type-value points must be further checked for obstacles, and if the check fails, the rectangle keeps extending outward until the check succeeds or the allowed distance is exceeded;
(2) detouring an obstacle inevitably passes along one of its sides, so when a new obstacle is analysed, edge detection is performed first: two type-value points are computed from the left and right feature points of the object, and the better one is selected, which determines the detour direction;
an evaluation function is defined to quantify the quality of a type-value point:
where beforeDistance is the distance from the previous type-value point to the current one; leftDistance is the distance from the current type-value point to the end point of the initial path segment; viaDiffAngle is the angle difference between the current and previous type-value points; originalDistance is the distance between the type-value point and its projection point; each denominator is the sum of the corresponding parameter over the left and right candidate points; the smaller the Q value, the higher the quality of the type-value point;
(3) tests show that densely spaced type-value points easily cause extremely uneven curvature radius, so any type-value point closer than half the vehicle length to the previous type-value point or to an obstacle boundary type-value point is discarded;
C. additional type-value points:
since type-value points generated only from feature points give a non-uniform curvature radius, additional type-value points must be added based on those computed from the feature points;
(1) adjacent type-value points expanded in the same direction:
when the distance between two type-value points exceeds half the vehicle length, the midpoint of the two points is extended by 100 mm perpendicular to the type-value point vector and used as an additional type-value point;
(2) adjacent type-value points not expanded in the same direction:
the line connecting the previous basic type-value point and the current one is called a chord;
if the projection of the chord on the direction of the previous basic type-value point is greater than half the vehicle length, the previous basic type-value point is extended by the projection distance along the bisector of the angle between its own direction and the chord, and used as an additional type-value point;
if the projection of the chord on the reverse direction of the current basic type-value point is greater than half the vehicle length, the current basic type-value point is extended by the projection distance along the bisector of the angle between its reverse direction and the chord, and used as an additional type-value point;
a coordinate sequence of the path the robot needs to travel is obtained from the above calculation;
step 108, the robot continuously detects whether a new obstacle appears in the process of planning the path autonomously, if so, the robot continues to execute step 104 to compare the characteristic points of the obstacle with the path which is planned autonomously again, and then plans the path again;
Step 109: when the robot no longer detects an obstacle on its running path, it returns to the originally planned path.
Further, preferably,
in the cooperation between systems, the method for autonomous obstacle-avoidance path planning further comprises the following steps:
after judging that the obstacle lies within the range of its future running path, the robot must also report to the supervising scheduling system that it needs to enter autonomous path planning mode.
Once the scheduling system learns that the robot has entered autonomous path planning, it no longer issues path sequences to the robot and only receives the position, angle and other running data the robot reports.
On receiving the running data of the autonomously planning robot, the scheduling system must direct other robots running nearby to actively give way, preventing mutual interference when several robots start autonomous detours in the same area at the same time.
After the robot finishes its autonomous detour and returns to a preset path point, it notifies the scheduling system so as to come back under unified management; the scheduling system recalculates a planned path from the point at which the robot rejoined and sends it to the robot in batches, restoring the scheduling system's control over the robot.
The beneficial effects of the invention are: after detecting an obstacle, the robot can autonomously detour around it and pass it, which improves the robot's operating efficiency in a complex operating environment.
Drawings
Fig. 1 is a basic flowchart of a path planning method for a robot to autonomously detour an obstacle according to the present invention.
Fig. 2 is a schematic drawing of a straight-line segment division envelope of the present invention.
Fig. 3 is a schematic diagram of the curve segment division envelope plotting of the present invention.
Fig. 4 is an envelope curve plotting effect diagram.
Detailed Description
In order to make the technical problems and technical solutions solved by the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Examples
The robot performs autonomous path planning according to the implementation steps of fig. 1. The path planning method for autonomously detouring an obstacle is suitable for robot control under various drive configurations, such as differential-drive mobile robots, steering-wheel mobile robots, Mecanum-wheel mobile robots, tracked mobile robots and axle-steered mobile robots.
Step 101: the robot runs along a preset walking path under the scheduling system and acquires its coordinate position on the map in real time. The robot obtains from the scheduling system the sequence of paths it will travel, and matches them against its stored map.
Step 102: during normal walking, the robot's absolute coordinate position and heading angle in the map frame are calculated by a guidance-and-positioning algorithm from the peripheral contour information fed back by the laser ranging sensor; that is, while the robot runs, the laser ranging sensor transmits a plane lattice of ranging data to the robot control system via the laser reflection ranging signal.
Step 103: judge from the ranging signal fed back by the laser ranging sensor whether an obstacle exists; if no obstacle signal is returned within a certain distance, return to step 101 and continue forward; if an obstacle is found, go to step 104. The obstacle is detected from the peripheral contour information fed back by the laser ranging sensor while the robot moves; when an obstacle is found, the robot simultaneously acquires the obstacle contour data.
Step 104: if an obstacle signal is judged to exist ahead of the robot, segment the obstacle from the plane lattice data scanned by the laser ranging sensor to obtain its feature points. The robot overlays the obstacle contour on its stored coordinate position in the map frame, completes the partitioning of the obstacle, and abstracts it into several feature points: a closest point, a left boundary point, a right boundary point and a series of intermediate points.
the obstacle raw data is given by a laser ranging sensor through a set of polar coordinates, and the obstacle can be represented as a closest point, a left boundary point, a right boundary point and a series of middle points.
The segmentation method comprises the following steps:
(1) take the initial angle value as the starting closest point, left boundary point and right boundary point;
(2) take the next angle value (the angle step is determined by the sensor) and calculate the change relative to the previous value;
(3) if the change is less than -200 mm, treat the data up to the previous angle value as one obstacle; if the change is less than 200 mm, update the left boundary point and the closest point; if the change is greater than 200 mm, take the next 10 values and check whether the change remains greater than 200 mm; if so, treat the data up to the previous angle value as one obstacle, otherwise update the left boundary point and the closest point;
(4) if the terminal angle has been reached, close the current data as one obstacle; otherwise return to step (2);
by this method the point cloud data are abstracted into obstacles; the polar coordinates are then converted into rectangular coordinates for calculation using the lidar offset parameters and the robot coordinates.
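The scan-splitting steps above can be sketched in code. This is an illustrative sketch only: the function name, data layout and return structure are assumptions, not taken from the patent; the ±200 mm jump threshold and the 10-value look-ahead follow the text.

```python
def segment_obstacles(ranges, angle0, step, jump=200.0, lookahead=10):
    """Split a polar laser scan (ranges in mm) into obstacles: a range change
    below -jump closes the current obstacle at the previous sample, while a
    change above +jump is confirmed over `lookahead` samples before closing.
    Each obstacle keeps its closest, left-boundary and right-boundary points
    as (angle, range) pairs.  Sketch only; bookkeeping details are assumed."""
    obstacles = []
    i, n = 0, len(ranges)
    while i < n:
        a = angle0 + i * step
        closest = left = right = (a, ranges[i])
        j = i + 1
        while j < n:
            d = ranges[j] - ranges[j - 1]
            if d < -jump:          # sharp approach: close the obstacle here
                break
            if d > jump:           # sharp recede: confirm over the look-ahead
                if all(r - ranges[j - 1] > jump for r in ranges[j:j + lookahead]):
                    break
            ang = angle0 + j * step
            left = (ang, ranges[j])            # advance the left boundary
            if ranges[j] < closest[1]:
                closest = (ang, ranges[j])     # update the nearest point
            j += 1
        obstacles.append({"closest": closest, "right": right, "left": left})
        i = j
    return obstacles
```

The polar pairs returned here would still need the lidar offset and robot pose applied to obtain rectangular map coordinates, as the text describes.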
Step 105: after the obstacle feature points are segmented and extracted, segment the path the robot will travel to form an envelope. The robot's running section is segmented according to the curvature radii of the straight and curved portions of its preset running path, and the robot's envelope on each segmented piece is plotted.
The segmentation method comprises the following steps:
if the minimum curvature radius of a curve section is at least 10000 mm, the whole section is treated as equivalent to a straight line;
if the minimum curvature radius of the curve section is less than 10000 mm, the section needs to be split:
(1) compute the starting parameter u and the corresponding curvature radius as the start value of the current piece;
(2) increase u by a step of 0.005 and compute the new curvature radius; if u ≥ 1.0, generate a small straight-line segment and finish the segmentation;
(3) if the starting curvature radius of the piece is greater than 50000 mm and the new curvature radius is less than 50000 mm, generate a small straight-line segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
(4) if the starting curvature radius of the piece is less than 50000 mm and the new curvature radius is greater than 50000 mm, generate a small arc segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
(5) if the new curvature radius differs from the starting curvature radius of the piece by more than 5% of the starting value, generate a small segment, update the start value of the piece with the new u and curvature radius, and return to step (2);
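The splitting rules above can be sketched as follows, assuming a callable `radius(u)` that returns the curvature radius (mm) at parameter u; how the path curve is represented is not specified by the patent, so that callable and the function name are assumptions. The thresholds 10000 mm, 50000 mm, step 0.005 and 5% follow the text, and the final piece is emitted as a straight segment, as rule (2) literally states.

```python
def split_curve(radius, step=0.005, line_r=50000.0, tol=0.05):
    """Split a parametric path u in [0, 1] into ('line'|'arc', u_start, u_end)
    pieces following the patent's thresholds.  Sketch under the assumption
    that radius(u) gives the curvature radius in mm at parameter u."""
    samples = int(round(1 / step)) + 1
    if min(radius(i * step) for i in range(samples)) >= 10000.0:
        return [("line", 0.0, 1.0)]           # whole section equivalent to a line
    pieces = []
    u0, r0 = 0.0, radius(0.0)                 # start value of the current piece
    u = 0.0
    while u < 1.0:
        u = min(u + step, 1.0)
        r = radius(u)
        if u >= 1.0:                          # rule (2): close with a line piece
            pieces.append(("line", u0, 1.0))
            break
        if r0 > line_r and r < line_r:        # rule (3): line piece ends
            pieces.append(("line", u0, u)); u0, r0 = u, r
        elif r0 < line_r and r > line_r:      # rule (4): arc piece ends
            pieces.append(("arc", u0, u)); u0, r0 = u, r
        elif abs(r - r0) > tol * r0:          # rule (5): radius drifted over 5%
            pieces.append(("arc" if r0 < line_r else "line", u0, u)); u0, r0 = u, r
    return pieces
```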
after the robot path is segmented, the envelope range of the robot on each segmented piece must be calculated, as follows:
1. when the piece is a straight segment.
To absorb motion deviation of the robot and the deviation introduced by the equivalent splitting, the mechanical size of the robot is expanded; at present all four edges of the robot are expanded outward by 50 mm. From straight segment EF, the corresponding rectangle with four vertices ABCD can be generated, and the coordinates of the four points A, B, C and D are computable, as shown in fig. 2:
vehicle width: W = actual vehicle width + 50 × 2;
vehicle front length: L1 = actual front length + 50;
vehicle rear length: L2 = actual rear length + 50;
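The rectangle generation can be sketched as follows. The 50 mm expansion follows the text; the function name and the assignment of the vertex labels A, B, C, D are assumptions, since the figure is not reproduced here.

```python
import math

def straight_envelope(E, F, width, front, rear, margin=50.0):
    """Envelope rectangle ABCD for a straight travel piece E -> F, with the
    vehicle footprint inflated by `margin` mm on every side as the text
    prescribes.  All dimensions in mm; vertex naming is an assumption."""
    ex, ey = E
    fx, fy = F
    seg = math.hypot(fx - ex, fy - ey)
    ux, uy = (fx - ex) / seg, (fy - ey) / seg      # unit vector along travel
    nx, ny = -uy, ux                               # left-hand normal
    w = width / 2 + margin                         # inflated half width
    sx, sy = ex - (rear + margin) * ux, ey - (rear + margin) * uy   # behind E
    tx, ty = fx + (front + margin) * ux, fy + (front + margin) * uy  # ahead of F
    A = (sx + w * nx, sy + w * ny)
    B = (tx + w * nx, ty + w * ny)
    C = (tx - w * nx, ty - w * ny)
    D = (sx - w * nx, sy - w * ny)
    return A, B, C, D
```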
2. when the piece is a curve segment.
The curve obtained by splitting is not a circular arc, but is treated as one for ease of calculation. The start and end points of the arc are those of the curve, and the equivalent arc radius is the average of the curvature radii at the curve's start and end points; the equivalent centre point O follows from the geometric properties of the arc. As shown in fig. 3, a partial ring O-ABCD can be generated from the split curve segment EF.
Step 106: once the robot's envelope is formed, the overall contour trajectory of the robot running along the curve is available, and the obstacle feature points are placed against this envelope trajectory to judge whether an inclusion relation exists. After the robot's running envelope on the preset path has been plotted, the obstacle feature point coordinates are compared against it in the plane; the robot autonomously judges whether any obstacle feature point lies within its envelope range, and if so, the autonomous detour path planning process starts, as shown in fig. 4.
The judgment method is as follows:
the area swept by the vehicle's motion is equivalent to a set of rectangles and partial rings, so judging whether a feature point lies inside the envelope means judging whether it lies in one or more of these rectangles and partial rings.
(1) Judging whether a point is inside the rectangle.
Corresponding to fig. 2, assume a point P and a rectangle ABCD; the sufficient condition for P to lie within the rectangle is:
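The sufficient condition itself is not reproduced in the text. A standard projection test, assumed here to be equivalent to the patent's condition, checks that the projections of AP onto AB and AD fall within the corresponding side lengths:

```python
def point_in_rectangle(P, A, B, C, D):
    """True if P lies inside rectangle ABCD (vertices given in order).
    Standard dot-product test, used as a stand-in for the patent's own
    inequality: 0 <= AP.AB <= AB.AB and 0 <= AP.AD <= AD.AD."""
    ax, ay = A
    abx, aby = B[0] - ax, B[1] - ay      # side AB
    adx, ady = D[0] - ax, D[1] - ay      # side AD
    apx, apy = P[0] - ax, P[1] - ay      # vector from A to P
    s = apx * abx + apy * aby            # projection of AP onto AB
    t = apx * adx + apy * ady            # projection of AP onto AD
    return 0 <= s <= abx * abx + aby * aby and 0 <= t <= adx * adx + ady * ady
```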
(2) Judging whether a point is inside a partial ring.
Corresponding to fig. 3, assume a point P and a partial ring O-ABCD; the sufficient condition for P to lie within the partial ring is:
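As above, the condition is not reproduced in the text. A radius-plus-angle test, assumed equivalent, checks that P lies between the inner and outer radii and within the angular span of the arc:

```python
import math

def point_in_partial_ring(P, O, r_inner, r_outer, a_start, a_sweep):
    """True if P lies in the partial ring centred at O, between radii
    r_inner..r_outer, within the arc starting at angle a_start (radians)
    and sweeping a_sweep counter-clockwise.  Stand-in for the patent's
    inequality, which is not reproduced in the text."""
    dx, dy = P[0] - O[0], P[1] - O[1]
    r = math.hypot(dx, dy)
    if not (r_inner <= r <= r_outer):
        return False                                 # outside the ring radii
    ang = (math.atan2(dy, dx) - a_start) % (2 * math.pi)  # angle from arc start
    return ang <= a_sweep
```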
Step 107: when an obstacle feature point is judged to lie inside an envelope rectangle or partial ring, autonomous path planning starts. The path planning method comprises the following steps:
A. Basic type-value point selection:
angle selection: the direction of the vector formed by the previous obstacle point and the projection of the obstacle on the initial path is taken as the angle of the current type-value point.
Coordinate selection: starting from the obstacle feature point, extend outward by half the vehicle width (including the expansion width) perpendicular to the type-value point angle to obtain the type-value point coordinates.
E. Expansion analysis:
(1) The type value point chosen by the basic selection is not always reachable, so the rectangle formed with the type value point must be further checked for obstacles; if the check fails, the point keeps extending outward until the check succeeds or the allowable distance is exceeded.
(2) Detouring an obstacle necessarily passes along one of its sides, so edge detection is performed first when a new obstacle is analyzed: two type value points are computed from the left and right feature points of the object respectively, and the better one is selected, which determines the detour direction.
An evaluation function Q is defined to quantify type value point quality:
Q = beforeDistance / (beforeDistance_left + beforeDistance_right) + leftDistance / (leftDistance_left + leftDistance_right) + viaDiffAngle / (viaDiffAngle_left + viaDiffAngle_right) + originalDistance / (originalDistance_left + originalDistance_right)
where beforeDistance is the distance from the previous type value point to the current one; leftDistance is the distance from the current type value point to the end point of the initial path segment; viaDiffAngle is the angle difference between the current and previous type value points; originalDistance is the distance from the current point to its projection on the initial path. Each denominator is the sum of the left and right candidates' values of the corresponding parameter. The smaller Q is, the higher the quality of the corresponding type value point.
(3) Tests show that densely spaced type value points easily cause sharply uneven changes of the curvature radius, so any type value point closer than half the vehicle length to the previous type value point or to an obstacle boundary type value point is discarded.
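As a hedged illustration of the left/right candidate comparison described above, the sketch below scores both candidate type value points with the normalized sum the text describes; the dictionary keys mirror the parameter names, while the numeric sample values are invented for the example:

```python
def knot_quality(cand, other):
    """Q for one candidate type value point: each parameter is normalised
    by the sum of the left and right candidates' values for that parameter,
    and the four normalised terms are added; smaller Q means better."""
    q = 0.0
    for key in ("beforeDistance", "leftDistance", "viaDiffAngle", "originalDistance"):
        denom = cand[key] + other[key]
        q += cand[key] / denom if denom else 0.0
    return q

# Hypothetical left/right candidates for one obstacle (distances in mm, angles in rad).
left = {"beforeDistance": 800.0, "leftDistance": 3000.0,
        "viaDiffAngle": 0.4, "originalDistance": 350.0}
right = {"beforeDistance": 900.0, "leftDistance": 2600.0,
         "viaDiffAngle": 0.9, "originalDistance": 500.0}

# Detour on the side whose candidate has the smaller Q.
side = "left" if knot_quality(left, right) < knot_quality(right, left) else "right"
```

With this normalization the two candidates' Q values always sum to the number of parameters, so the comparison depends only on which side wins more of the relative weight.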
F. Additional type value points:
Because type value points generated only from feature points leave the curvature radius uneven, additional type value points are added based on the type value points computed from the feature points.
(1) Adjacent type value points extended in the same direction:
When the distance between the two type value points is greater than half the vehicle length, the midpoint between them is extended 100 mm perpendicular to the type value point vector and used as an additional type value point.
(2) Adjacent type value points not extended in the same direction:
The line connecting the previous basic type value point and the current basic type value point is called the chord.
If the projection of the chord onto the direction of the previous basic type value point is greater than half the vehicle length, the previous basic type value point is extended by the projection distance along the bisector of its own direction and the chord direction, and the result is used as an additional type value point.
If the projection of the chord onto the reverse direction of the current basic type value point is greater than half the vehicle length, the current basic type value point is extended by the projection distance along the bisector of its reverse direction and the chord direction, and the result is used as an additional type value point.
The coordinate sequence of the path the robot must travel is obtained from the above calculation.
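The same-direction rule in F(1) can be sketched as follows; the half-vehicle-length constant and the point format are assumptions for illustration, not values from the patent:

```python
import math

HALF_VEHICLE_LEN = 600.0  # mm; assumes a hypothetical 1200 mm vehicle length

def extra_knot_same_direction(p1, p2, angle):
    """If two same-direction type value points are more than half a vehicle
    length apart, push their midpoint 100 mm out along the normal of the
    type value point vector and return it as an additional point."""
    if math.hypot(p2[0] - p1[0], p2[1] - p1[1]) <= HALF_VEHICLE_LEN:
        return None  # close enough; no extra point needed
    mx, my = (p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0
    nx, ny = -math.sin(angle), math.cos(angle)  # unit normal of the shared direction
    return (mx + 100.0 * nx, my + 100.0 * ny)
```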
Step 108, the robot continuously detects whether a new obstacle appears while following the autonomously planned path; if so, it returns to step 104, compares the obstacle feature points against the newly planned path, and plans the path again. After autonomous detour planning starts, the robot runs in a segment-by-segment detect-and-plan mode; when a new obstacle appears during the detour, its feature points are re-segmented and put back into the map system for path calculation to obtain a new detour path.
The present invention has been described in detail with reference to the specific and preferred embodiments, but it should be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and any modifications, equivalents and the like, which are within the spirit and principle of the present invention, should be included in the scope of the present invention.
Claims (1)
1. A robot autonomous path planning method, characterized by comprising the following steps:
101, the robot runs along a preset walking path under a scheduling system, and its coordinate position in the map is acquired in real time;
102, during normal walking, the absolute coordinate position and heading angle of the robot in the map system are calculated by a guidance and positioning algorithm from the peripheral contour information fed back by a laser ranging sensor;
103, whether an obstacle exists is judged from the ranging signal fed back by the laser ranging sensor; if no obstacle signal is returned within a certain distance, step 101 continues and the robot keeps running forward; if an obstacle is found, go to step 104;
step 104, if an obstacle signal is judged to exist in front of the walking robot, the obstacle is segmented from the plane lattice data scanned by the laser ranging sensor to obtain the obstacle feature points; the segmentation method comprises the following steps:
(1) take the initial angle value as the values of the nearest point, the left boundary point and the right boundary point;
(2) take the next angle value (the angle step is determined by the sensor) and calculate the variation relative to the previous value;
(3) if the variation is less than -200 mm, treat the data up to the previous angle value as one obstacle; otherwise, if the variation is less than 200 mm, update the left boundary value and the nearest-point value; if the variation is greater than 200 mm, take the next 10 values and check whether their variations also exceed 200 mm: if so, treat the data up to the previous angle value as one obstacle, and if not, update the left boundary value and the nearest-point value;
(4) if the end angle is reached, treat the current data as one obstacle; otherwise return to step (2);
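A minimal sketch of the segmentation loop above, assuming the scan arrives as a list of ranges in mm (one per angle step); the index bookkeeping and the 10-value lookahead are simplified relative to the claimed method:

```python
JUMP = 200.0    # mm, the range-jump threshold from step (3)
LOOKAHEAD = 10  # number of values checked after a positive jump

def segment_scan(ranges):
    """Split one scan line into obstacles; each obstacle is returned as
    (left boundary index, right boundary index, nearest range)."""
    obstacles, start, nearest = [], 0, ranges[0]
    for i in range(1, len(ranges)):
        diff = ranges[i] - ranges[i - 1]
        if diff < -JUMP:
            # jumped closer: close the obstacle at the previous angle
            obstacles.append((start, i - 1, nearest))
            start, nearest = i, ranges[i]
        elif diff > JUMP:
            # jumped away: confirm the jump with the next LOOKAHEAD values
            if all(r - ranges[i - 1] > JUMP for r in ranges[i:i + LOOKAHEAD]):
                obstacles.append((start, i - 1, nearest))
                start, nearest = i, ranges[i]
            else:
                nearest = min(nearest, ranges[i])
        else:
            nearest = min(nearest, ranges[i])
    obstacles.append((start, len(ranges) - 1, nearest))  # step (4): end angle reached
    return obstacles
```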
105, after the obstacle feature points are segmented and extracted, the path the robot will travel is cut into segments to form an envelope; the cutting method comprises the following steps:
if the minimum curvature radius of the curve section is greater than or equal to 10000 mm, the whole section is treated as equivalent to a straight line;
if the minimum curvature radius of the curve section is less than 10000 mm, the curve section needs to be cut:
(1) compute the starting point u and the corresponding curvature radius as the initial values of the small segment;
(2) increase u by a step of 0.005 and compute the new curvature radius; if u is greater than or equal to 1.0, generate a small straight-line segment and finish cutting;
(3) if the initial curvature radius of the small segment is greater than 50000 mm and the new curvature radius is less than 50000 mm, generate a small straight-line segment, update the initial values of the small segment with the new u and curvature radius, and return to step (2);
(4) if the initial curvature radius of the small segment is less than 50000 mm and the new curvature radius is greater than 50000 mm, generate a small arc segment, update the initial values of the small segment with the new u and curvature radius, and return to step (2);
(5) if the difference between the new curvature radius and the initial curvature radius of the small segment is greater than 5 percent of the initial curvature radius, generate a small segment, update the initial values of the small segment with the new u and curvature radius, and return to step (2);
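Steps (1)-(5) can be sketched as a loop over the curve parameter u; `radius_at` is a stand-in for whatever curvature evaluation the curve representation provides, and the segment tuples are an assumed output format:

```python
def cut_curve(radius_at, u_step=0.005, line_r=50000.0, tol=0.05):
    """Cut a parametric curve (u in [0, 1]) into small line/arc segments
    by the curvature-radius rules; radius_at(u) returns the radius in mm."""
    segs = []
    u0, r0 = 0.0, radius_at(0.0)  # step (1): initial values of the small segment
    u = u0
    while True:
        u += u_step                          # step (2): advance u
        if u >= 1.0:
            segs.append(("line", u0, 1.0))   # the method ends with a straight segment
            return segs
        r = radius_at(u)
        if r0 > line_r and r < line_r:       # step (3): line -> arc transition
            segs.append(("line", u0, u))
            u0, r0 = u, r
        elif r0 < line_r and r > line_r:     # step (4): arc -> line transition
            segs.append(("arc", u0, u))
            u0, r0 = u, r
        elif abs(r - r0) > tol * r0:         # step (5): radius drifted by > 5 %
            segs.append(("line" if r0 > line_r else "arc", u0, u))
            u0, r0 = u, r
```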
after the robot path is cut, the envelope range of the robot while it runs over each cut segment is calculated segment by segment; the calculation method is as follows:
1. when the cut segment is a straight segment:
to absorb the motion deviation of the robot and the deviation caused by the equivalent cutting, the mechanical size of the robot is expanded; at present all four sides of the robot are expanded outward by 50 mm; from straight segment EF the corresponding rectangle with four vertices ABCD can be generated, and the coordinates of the four points A, B, C and D can be computed;
vehicle width: W = actual vehicle width + 50 × 2;
vehicle front length: L1 = actual front vehicle length + 50;
vehicle rear length: L2 = actual rear vehicle length + 50;
2. when the cut segment is a curved segment:
the curve obtained by cutting is not a circular arc, so for ease of calculation it is treated as an equivalent arc; the start and end points of the arc are those of the curve, the equivalent radius of the arc is the average of the curvature radii at the start and end points of the curve, the equivalent centre point O follows from the geometric properties of an arc, and a partial ring O-ABCD can be generated from the cut curve segment EF;
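As a small numeric illustration of the equivalent-arc envelope, the sketch below computes the ring radii; the function name, argument layout, and the composition of the half width are assumptions for the example, not from the patent:

```python
EXPAND = 50.0  # mm, the safety expansion applied to every side

def ring_envelope(r_start, r_end, half_width):
    """Equivalent-arc envelope of a curved segment: the arc radius is the
    mean of the start/end curvature radii, and the swept partial ring
    spans one expanded half vehicle width to each side of that arc."""
    r_eq = (r_start + r_end) / 2.0
    w = half_width + EXPAND
    return r_eq, r_eq - w, r_eq + w  # (equivalent, inner, outer) radii
```

For example, a curve whose end radii are 9000 mm and 11000 mm, swept by an 800 mm wide vehicle, gives an equivalent arc of 10000 mm with ring radii 9550 mm and 10450 mm.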
step 106, after the robot envelope is formed, the overall contour trajectory of the whole robot running along the curve is obtained; the obstacle feature points are then placed into the envelope trajectory to judge whether an inclusion relation exists;
the judgment method comprises the following steps:
the area swept by the vehicle motion is equivalent to several rectangles and partial rings, so judging whether a feature point is inside the envelope means judging whether it is inside one or more of those rectangles and partial rings;
(1) judging whether a point is in a rectangle:
assuming point P and rectangle ABCD, the sufficient condition for point P to be inside the rectangle is:
(2) judging whether a point is in a partial ring:
assuming point P and partial ring O-ABCD, the sufficient condition for point P to be inside the partial ring is:
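The patent gives the two membership conditions as figures; standard equivalents can be sketched as follows (a point lies inside a convex rectangle when it is on the same side of all four edges, and inside a partial ring when both its radius and polar angle fall within range):

```python
import math

def in_rectangle(p, a, b, c, d):
    """Point P is inside rectangle ABCD (vertices in order) iff the cross
    products of each edge with the vector to P all share one sign."""
    pts = (a, b, c, d)
    signs = []
    for i in range(4):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % 4]
        signs.append((x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1))
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

def in_partial_ring(p, center, r_in, r_out, ang_from, ang_to):
    """Point P is inside the partial ring iff its distance to the centre is
    between the inner and outer radii and its polar angle lies in the
    sector (angles in radians, ang_from < ang_to assumed)."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    return (r_in <= math.hypot(dx, dy) <= r_out
            and ang_from <= math.atan2(dy, dx) <= ang_to)
```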
step 107, when an obstacle feature point is judged to lie inside an envelope rectangle or a partial ring of the envelope, autonomous path planning begins, specifically comprising the following steps:
basic type value point selection:
angle selection: the direction of the vector from the previous obstacle to the projection of the current obstacle on the initial path is taken as the angle of the current type value point;
coordinate selection: starting from the obstacle feature point, extending outward by half the vehicle width (including the expansion width) perpendicular to the type value point angle to obtain the type value point coordinates;
expansion analysis:
(1) the type value point chosen by the basic selection is not always reachable, so the rectangle formed with the type value point must be further checked for obstacles; if the check fails, the point keeps extending outward until the check succeeds or the allowable distance is exceeded;
(2) detouring an obstacle necessarily passes along one of its sides, so edge detection is performed first when a new obstacle is analyzed: two type value points are computed from the left and right feature points of the object respectively, and the better one is selected, which determines the detour direction;
an evaluation function Q is defined to quantify type value point quality:
Q = beforeDistance / (beforeDistance_left + beforeDistance_right) + leftDistance / (leftDistance_left + leftDistance_right) + viaDiffAngle / (viaDiffAngle_left + viaDiffAngle_right) + originalDistance / (originalDistance_left + originalDistance_right)
wherein beforeDistance is the distance from the previous type value point to the current one; leftDistance is the distance from the current type value point to the end point of the initial path segment; viaDiffAngle is the angle difference between the current and previous type value points; originalDistance is the distance from the current point to its projection on the initial path; each denominator is the sum of the left and right candidates' values of the corresponding parameter; the smaller the Q value is, the higher the quality of the corresponding type value point is;
(3) tests show that densely spaced type value points easily cause sharply uneven changes of the curvature radius, so any type value point closer than half the vehicle length to the previous type value point or to an obstacle boundary type value point is discarded;
additional type value points:
because type value points generated only from feature points leave the curvature radius uneven, additional type value points need to be added based on the type value points computed from the feature points;
(1) adjacent type value points extended in the same direction:
when the distance between the two type value points is greater than half the vehicle length, the midpoint between them is extended 100 mm perpendicular to the type value point vector and used as an additional type value point;
(2) adjacent type value points not extended in the same direction:
the line connecting the previous basic type value point and the current basic type value point is called the chord;
if the projection of the chord onto the direction of the previous basic type value point is greater than half the vehicle length, the previous basic type value point is extended by the projection distance along the bisector of its own direction and the chord direction, and used as an additional type value point;
if the projection of the chord onto the reverse direction of the current basic type value point is greater than half the vehicle length, the current basic type value point is extended by the projection distance along the bisector of its reverse direction and the chord direction, and used as an additional type value point;
acquiring a coordinate sequence of a path required to be traveled by the robot according to the calculation;
step 108, the robot continuously detects whether a new obstacle appears while following the autonomously planned path; if so, it returns to step 104, compares the obstacle feature points against the newly planned path, and plans the path again;
and step 109, when the robot no longer detects an obstacle on the running path, it returns to the originally planned path.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010214045.4A CN111338352A (en) | 2020-03-24 | 2020-03-24 | Robot autonomous path planning method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111338352A true CN111338352A (en) | 2020-06-26 |
Family
ID=71186129
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010214045.4A Withdrawn CN111338352A (en) | 2020-03-24 | 2020-03-24 | Robot autonomous path planning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111338352A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112379679A (en) * | 2021-01-15 | 2021-02-19 | 北京理工大学 | Unmanned vehicle local path planning method |
CN112612273A (en) * | 2020-12-21 | 2021-04-06 | 南方电网电力科技股份有限公司 | Routing inspection robot obstacle avoidance path planning method, system, equipment and medium |
CN113435417A (en) * | 2021-08-26 | 2021-09-24 | 山东华力机电有限公司 | Double-wheel-driven AGV steering visual control method |
CN113771850A (en) * | 2021-09-08 | 2021-12-10 | 恒大新能源汽车投资控股集团有限公司 | Vehicle road running control method and device and computer readable storage medium |
WO2022000961A1 (en) * | 2020-06-30 | 2022-01-06 | 珠海一微半导体股份有限公司 | Edgewise path selection method for robot obstacle crossing, chip, and robot |
CN115129070A (en) * | 2022-08-31 | 2022-09-30 | 深圳市欧铠智能机器人股份有限公司 | Intelligent obstacle avoidance system and method for storage robot under Internet of things |
CN115338548A (en) * | 2022-10-14 | 2022-11-15 | 四川智龙激光科技有限公司 | Obstacle avoiding method and system for cutting head of plane cutting machine |
CN115718489A (en) * | 2022-11-10 | 2023-02-28 | 中元宇(北京)物联网科技有限公司 | Path planning method and system of epidemic prevention intelligent robot |
CN116188480A (en) * | 2023-04-23 | 2023-05-30 | 安徽同湃特机器人科技有限公司 | Calculation method of AGV traveling path point during ceiling operation of spraying robot |
CN117148811A (en) * | 2023-11-01 | 2023-12-01 | 宁波舜宇贝尔机器人有限公司 | AGV trolley carrying control method and system, intelligent terminal and lifting mechanism |
WO2024113261A1 (en) * | 2022-11-30 | 2024-06-06 | 汤恩智能科技(上海)有限公司 | Robot and obstacle avoidance method therefor, and robot system and storage medium |
2020-03-24: application CN202010214045.4A filed in China (publication CN111338352A); status: not active, withdrawn.
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WW01 | Invention patent application withdrawn after publication ||
Application publication date: 20200626 |