CN111522022B - Dynamic target detection method of robot based on laser radar - Google Patents


Info

Publication number
CN111522022B
Authority
CN
China
Prior art keywords
radar
point cloud
pose
robot
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010310030.8A
Other languages
Chinese (zh)
Other versions
CN111522022A (en)
Inventor
孙伟
杜川
林旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN202010310030.8A
Publication of CN111522022A
Application granted
Publication of CN111522022B
Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 — Lidar systems specially adapted for specific applications
    • G01S 7/00 — Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 — Details of systems according to group G01S 17/00
    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 — Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a dynamic target detection method, in particular to a dynamic target detection method for a robot based on a laser radar, proposed to reduce positioning errors and improve detection precision. The method corrects positioning by non-gradient optimization, reducing positioning errors and raising detection precision to centimeter level, and it detects by scanning the full plane with the radar, giving a wide detection range.

Description

Dynamic target detection method for robot based on laser radar
Technical Field
The invention relates to a dynamic target detection method, in particular to a dynamic target detection method for a robot based on a laser radar.
Background
In recent years, robots have become a representative strategic objective in the high-tech field. Robot technology has not only fundamentally changed traditional industrial production, but also exerts a profound influence on human social life. Because an autonomous mobile robot can move effectively and safely only if it accurately knows its own position, the positions of obstacles in its workspace, and the motion of those obstacles, target detection and positioning are particularly important problems for autonomous mobile robots. A great deal of research has been done in this field both at home and abroad. In current research, sensors such as GPS, cameras, and inertial navigation units are mainly used to acquire and process environmental information for autonomous positioning and target detection. Accurate and robust description of the sensors' properties and of the environment therefore underpins the correctness of subsequent decisions.
To perform accurate target detection, the robot must first be positioned. In the prior art, many uncertain factors affect the positioning process when a robot performs target detection: uncertainty of the robot itself, accumulation of odometry errors, sensor noise, and the complexity and unknown nature of the environment. These factors make robot positioning more complex, introduce errors into the environment description, and cause erroneous information to accumulate in the system; this accumulation further aggravates the deviation of the environment perception system, so the precision of the detection result is low.
Disclosure of Invention
The invention aims to provide a dynamic multi-target detection method for a robot based on a laser radar that reduces positioning errors and improves detection precision, addressing the defects of the prior art.
To achieve this aim, the laser-radar-based method for detecting a dynamic target by a robot comprises the following steps:
(1) A laser scanning radar at the robot center collects and generates a point cloud set $[\beta_n, l_{1n}]$, and the robot positioning unit gives the robot positioning pose; $\beta_n$ is the scanning angle of the laser radar, the angle range covering the radar's full scanning plane, and $l_{1n}$ is the distance of the detected target from the robot at the corresponding angle; with $\beta_n$ as abscissa and $l_{1n}$ as ordinate, an entity radar point cloud data map is established;
(2) Establishing a vector scene map according to the scene, representing obstacle blocks in the scene map and the peripheral edges of the scene by closed line segments, and obtaining the start and end coordinates of all the line segments;
(3) Establishing a simulated laser radar model, inputting the pose of the entity laser radar, and solving, by an intercept traversal method, the point cloud set $[\beta_n, l_{2n}]$ formed by the intersection points of all rays emitted by the simulated laser radar with the map edges and the obstacle blocks, obtaining a radar simulation graph;
(4) Correcting the robot positioning pose by non-gradient optimization, and using the corrected pose $p_k$ to replace the previously input entity radar pose in the simulation, obtaining a new simulated laser radar point cloud set and point cloud data map;
(5) Subtracting the points of the entity laser radar point cloud set and the new simulated laser radar point cloud set at the same scanning angle, and comparing the difference with a set threshold ζ:

if the distance difference l between n consecutive points in the entity laser radar point cloud data map and the corresponding points in the simulated laser radar point cloud data map is larger than the set threshold ζ, i.e. l > ζ, the entity laser radar has detected a target; the mean value of $\beta_n$ over the n adjacent points is recorded as

$$\bar{\beta} = \frac{1}{n}\sum_{i=1}^{n}\beta_i,$$

the mean distance of the entity laser radar over the n adjacent points is recorded as

$$\bar{l} = \frac{1}{n}\sum_{i=1}^{n}l_{1i},$$

and, taking the position of the entity laser radar as the origin of a polar coordinate system, the polar coordinate of the target is obtained as $(\bar{\beta},\ \bar{l})$;

otherwise, no target is detected.
Compared with the prior art, the invention has the following advantages:
1) The invention corrects positioning by non-gradient optimization, which reduces positioning errors and improves detection precision, reaching centimeter level;
2) The invention detects by scanning the full plane with the radar, and therefore has a wide detection range.
Drawings
FIG. 1 is a general flow chart of an implementation of the present invention;
FIG. 2 is a data diagram of a cloud of entity radar points in accordance with the present invention;
FIG. 3 is an actual scene map used in the present invention;
FIG. 4 is a scene vector diagram established according to an actual scene in the present invention;
FIG. 5 is a sub-flow diagram of the generation of simulated radar point cloud data in accordance with the present invention;
FIG. 6 is a simulation diagram of the intersection of rays emitted by the simulated laser radar with the scene edge and the obstacle block, respectively, in accordance with the present invention;
FIG. 7 is a comparison result diagram of point cloud data of the entity radar and the simulation radar at the robot positioning point in the invention;
FIG. 8 is a sub-flowchart of the present invention for correcting the robot positioning pose using non-gradient optimization;
FIG. 9 is a point cloud data diagram of a simulated radar in accordance with the present invention;
FIG. 10 is a diagram showing the comparison result of the point cloud data of the corrected entity radar and the simulated radar at the robot positioning point.
Detailed Description
The embodiments and effects of the present invention will be described in detail below with reference to the accompanying drawings:
Referring to fig. 1, the method for detecting a dynamic target by a robot based on a laser radar is implemented with the following steps:

Step one: a laser scanning radar at the robot center collects and generates a point cloud set $[\beta_n, l_{1n}]$, and the robot's positioning unit gives the robot positioning pose (x, y, α); $\beta_n$ is the scanning angle of the laser radar, the angle range covering the radar's full scanning plane, and $l_{1n}$ is the distance of the detected target from the robot at the corresponding angle; with $\beta_n$ as abscissa and $l_{1n}$ as ordinate, an entity radar point cloud data map is established.
The laser scanning radar is a Hokuyo UTM-30LX, with scanning angle θ = 270°, angular resolution λ = 0.25°, scanning range 0.1–30 m, and scanning frequency 40 Hz. Its point cloud data are $(\beta_n, l_n)$, where $\beta_n$ and $l_n$ are the angle and distance of the n-th emitted ray relative to the polar coordinate system of the laser radar body; the laser scanning radar point cloud data map established with $\beta_n$ as abscissa and $l_n$ as ordinate is shown in fig. 2.

The angle $\beta_n$ of the n-th ray emitted by the lidar with respect to the polar coordinate system of the lidar body lies in the range

$$\beta_n \in \left[-\frac{\theta}{2},\ \frac{\theta}{2}\right].$$
The robot pose is the pose of the laser scanning radar; it is generated by Monte Carlo localization or provided by a UWB or GPS system. Assume the pose is (x, y, α), where x and y are the position of the entity laser radar in the scene coordinate system and α is the angle between the center line of the entity laser radar and the x axis of the map coordinate system. In this embodiment the UWB system generates the robot positioning pose.
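The step-one quantities can be pictured with a minimal Python sketch (illustrative, not part of the patent text; the placeholder ranges and pose values are assumptions):

```python
import numpy as np

# Hokuyo UTM-30LX-like geometry from the embodiment: theta = 270 deg,
# lambda = 0.25 deg, hence theta / lambda = 1080 beams (matches N_t below).
THETA_DEG = 270.0
LAMBDA_DEG = 0.25
N_BEAMS = int(THETA_DEG / LAMBDA_DEG)  # 1080

# beta_n: beam angles in the radar body frame over [-theta/2, theta/2);
# l1_n: measured range of each beam (placeholder values here).
beta = np.deg2rad(-THETA_DEG / 2.0 + LAMBDA_DEG * np.arange(N_BEAMS))
l1 = np.full(N_BEAMS, 3000.0)  # assumed: no return -> max range, in cm

# Robot (= radar) pose (x, y, alpha) in the scene frame, e.g. from UWB.
pose = (100.0, 200.0, np.deg2rad(45.0))  # units: cm, cm, rad (assumed)
```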
Step two: establishing a vector scene map according to the scene, representing obstacle blocks in the scene map and the peripheral edges of the scene by closed line segments, and obtaining the start and end coordinates of all the line segments.
referring to fig. 3, the actual scene map used in the present invention is composed of obstacle blocks and scene edges surrounded by line segments, and the coordinates of the start and end points of all line segments are shown in table 1:
table 1: coordinates of line segment in map (Unit: CM)
Figure BDA0002457305550000052
Figure BDA0002457305550000061
Plotting the segment end-point coordinates from the table in an x-o-y rectangular coordinate system yields the vector map of the scene, as shown in fig. 4.
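As a sketch of the step-two data structure, the vector map can be held as a flat list of segments; the coordinates below are illustrative stand-ins, not the values of table 1:

```python
# Each closed contour (scene boundary or obstacle block) is stored as its
# constituent segments ((x_s, y_s), (x_e, y_e)); units in cm as in table 1.
scene_segments = [
    # scene boundary (example rectangle, not the table 1 values)
    ((0.0, 0.0), (500.0, 0.0)),
    ((500.0, 0.0), (500.0, 400.0)),
    ((500.0, 400.0), (0.0, 400.0)),
    ((0.0, 400.0), (0.0, 0.0)),
    # one obstacle block (example square)
    ((120.0, 100.0), (180.0, 100.0)),
    ((180.0, 100.0), (180.0, 160.0)),
    ((180.0, 160.0), (120.0, 160.0)),
    ((120.0, 160.0), (120.0, 100.0)),
]
```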
Step three: establishing a simulated laser radar model, inputting the pose of the entity laser radar, and solving, by an intercept traversal method, the point cloud set $[\beta_n, l_{2n}]$ formed by the intersection points of all rays emitted by the simulated laser radar with the map edges and the obstacle blocks, obtaining a radar simulation graph.
3.1) Establish polar coordinates with the coordinate origin of the vector map and the x axis;
3.2) Assume the entity radar coordinates are $(x_0, y_0)$ and the angle between the center line of the entity radar and the x-axis direction is α degrees; with a radar scanning range of θ degrees and a radar angular resolution of λ degrees, the simulated radar emits n rays in total:

$$n = \frac{\theta}{\lambda}.$$

These rays are emitted from the radar coordinate point at intervals of λ degrees, covering the range

$$\left[\alpha - \frac{\theta}{2},\ \alpha + \frac{\theta}{2}\right]$$

centered on α. Each ray finally intersects an obstacle block or a line segment around the scene, and the intersection points are the point cloud coordinates $(x_i, y_i)$ obtained by the simulated radar scan;
3.3) Referring to fig. 5, each intersection point is found by the vector method:

3.3.1) Treat the ray and each edge contour of the obstacle blocks or scene periphery as vectors; the angle γ between the i-th ray and the horizontal direction satisfies

$$\gamma = \alpha - \frac{\theta}{2} + (i-1)\lambda.$$
3.3.2) Solve for the intersection point $(x_i, y_i)$ of the ray and the edge contour line in the two-dimensional plane through the constraint that the two straight lines intersect:

from the result of 3.3.1), the direction vector of the i-th ray is $(\cos\gamma, \sin\gamma)$ and the ray starting point is $(x_0, y_0)$, so $(x_0 + \cos\gamma,\ y_0 + \sin\gamma)$ is also a point on the ray; the start and end points of the edge contour are $(x_s, y_s)$ and $(x_e, y_e)$, respectively.

The intersection point satisfies both line equations:

$$\frac{x_i - x_0}{\cos\gamma} = \frac{y_i - y_0}{\sin\gamma}, \qquad \frac{x_i - x_s}{x_e - x_s} = \frac{y_i - y_s}{y_e - y_s}.$$

Solving this system gives the intersection coordinates $(x_i, y_i)$ and the distance from the simulated radar coordinate point to the obstacle

$$l_\beta = \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2}.$$
3.4) Record γ and $l_\beta$ for all n rays to obtain the simulated laser radar point cloud set; its simulation graph is shown in fig. 6.
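A runnable Python sketch of the step-three simulated radar follows (an assumed implementation of the ray–segment intersection described in 3.3; the function and parameter names are ours, not the patent's):

```python
import math

def simulate_scan(pose, segments, theta_deg=270.0, lambda_deg=0.25,
                  max_range=3000.0):
    """Cast n = theta/lambda rays from pose = (x0, y0, alpha) and return
    ([beta_n], [l_2n]), beta_n in the radar body frame; each range is the
    nearest ray-segment intersection (max_range, in cm, if no hit)."""
    x0, y0, alpha = pose
    n = int(theta_deg / lambda_deg)
    betas, ranges = [], []
    for i in range(n):
        # gamma = alpha - theta/2 + i*lambda: absolute angle of ray i
        gamma = alpha + math.radians(-theta_deg / 2.0 + i * lambda_deg)
        dx, dy = math.cos(gamma), math.sin(gamma)
        best = max_range
        for (xs, ys), (xe, ye) in segments:
            ex, ey = xe - xs, ye - ys
            denom = dx * ey - dy * ex
            if abs(denom) < 1e-12:      # ray parallel to the segment
                continue
            # solve (x0,y0) + t*(dx,dy) = (xs,ys) + s*(ex,ey)
            t = ((xs - x0) * ey - (ys - y0) * ex) / denom
            s = ((xs - x0) * dy - (ys - y0) * dx) / denom
            if t > 0.0 and 0.0 <= s <= 1.0:
                best = min(best, t)     # keep the nearest obstacle
        betas.append(gamma - alpha)     # beta_n in the body frame
        ranges.append(best)
    return betas, ranges
```

Keeping only the nearest hit per ray is what makes the traversal return the visible contour rather than every intersection.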
Step four: correcting the robot positioning pose by non-gradient optimization, and using the corrected pose $p_k$ to replace the previously input entity radar pose in the simulation, obtaining a new simulated laser radar point cloud set and point cloud data map.

The preceding steps yield the point cloud data of the entity radar and of the simulated radar at the robot positioning point, but a certain deviation exists between the two, and this deviation can cause false detections, as shown in fig. 7.

In fig. 7, line A is the entity laser radar point cloud, line B is the simulated laser radar point cloud obtained in step three, and line C is the difference between the two. Line C shows that the simulated and entity laser radar point clouds do not match completely; the black arrow marks an obstacle detected by the entity laser radar, and the gray arrow marks the position of maximum deviation between the two. The deviation has two sources. One is the difference between the actual scene and the simulated scene map, which can be reduced by minimizing the error between the simulated map and the actual scene when the map is built. The other is an insufficiently accurate positioning result for the entity robot: the difference between the positioning pose used by the simulated radar and the actual pose of the laser radar shifts the whole point cloud, seen as the offset between lines B and A inside the box of fig. 7. This kind of offset appears throughout the point cloud and is especially large at inflection points of the curve, easily causing false target detections. This embodiment therefore eliminates the offset by non-gradient optimization and corrects the robot positioning pose.
referring to fig. 8, the specific implementation of this step is as follows:
4.1) Assume the true pose of the radar in the map coordinate system is $(x_0, y_0, \alpha_0)$; this pose is unknown but really exists. The pose obtained by the robot through the positioning unit or Monte Carlo self-localization is $(x_1, y_1, \alpha_1)$ and, because of sensor noise, deviates more or less from the true pose of the robot. Define the unbiased rate

$$C = f_C(L, L') = \frac{N_u}{N_t},$$

where L is the entity laser radar point cloud set, L' is the simulated laser radar point cloud set, $N_t$ is the total number of points in the laser radar point cloud ($N_t = 1080$ in this example), $N_u$ is the number of points whose distance difference l satisfies l ≤ ζ, and ζ is the set threshold;
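Under the reading C = N_u / N_t above, the unbiased rate is a one-line NumPy computation (a sketch; the function name is ours):

```python
import numpy as np

def unbiased_rate(l_real, l_sim, zeta):
    """C = N_u / N_t: fraction of beams whose real/simulated range
    difference l satisfies l <= zeta (N_t = 1080 in the embodiment)."""
    l_real, l_sim = np.asarray(l_real), np.asarray(l_sim)
    return np.count_nonzero(np.abs(l_real - l_sim) <= zeta) / l_real.size
```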
4.2) Initialize the unbiased rate C = −1. Let the pose obtained by the robot's positioning at some moment be $p_1 = (x_1, y_1, \alpha_1)$, where $x_1, y_1$ are the position of the entity laser radar in the scene coordinate system and $\alpha_1$ is the angle between the radar center line and the x-axis direction at that moment, and let the point cloud set scanned by the entity laser radar be $L_{1n} = [\beta_n, l_{1n}]$. Taking this pose as the initial value, compute the simulated radar point cloud based on $p_1$, $L_{2n} = [\beta_n, l_{2n}] = f(x_1, y_1, \alpha_1)$, and calculate the unbiased rate at this time according to the formula in 4.1): $C_1 = f_C(L_{1n}, L_{2n})$;
4.3) With a set step size step, offset $x_1, y_1, \alpha_1$ by one step each to obtain the simulated laser radar point cloud sets offset in the three directions,

$$L_x = f(x_1 + \mathrm{step},\ y_1,\ \alpha_1), \quad L_y = f(x_1,\ y_1 + \mathrm{step},\ \alpha_1), \quad L_\alpha = f(x_1,\ y_1,\ \alpha_1 + \mathrm{step}),$$

with corresponding unbiased rates in the three directions

$$C_x = f_C(L_{1n}, L_x), \quad C_y = f_C(L_{1n}, L_y), \quad C_\alpha = f_C(L_{1n}, L_\alpha);$$
4.4) Take as offsets $dx = C_x - C_1$, $dy = C_y - C_1$, $d\alpha = C_\alpha - C_1$, obtaining the new positioning pose $p_2 = (x_2, y_2, \alpha_2) = (x_1 + dx,\ y_1 + dy,\ \alpha_1 + d\alpha)$;
4.5) Repeat 4.3) and 4.4) a total of 200 times and record the maximum unbiased rate $C_{max}$ obtained and the corresponding pose $p_k = (x_k, y_k, \alpha_k)$, where $x_k, y_k$ are the position of the simulated radar in the scene coordinate system when $C_{max}$ is obtained and $\alpha_k$ is the angle between the radar center line and the x axis at that time; $p_k$ is the pose that best approximates the true pose of the robot.
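Putting 4.2)–4.5) together gives the following hedged sketch of the non-gradient correction loop; it reuses the simulate_scan and unbiased_rate sketches above, and the separate translation/rotation step sizes are an assumption (the patent text uses a single step):

```python
def correct_pose(p1, l_real, segments, zeta, step=1.0,
                 step_alpha=0.01, iters=200):
    """Return (p_k, C_max): the pose with the highest unbiased rate seen
    while repeating 4.3)-4.4) for iters = 200 rounds."""
    x, y, a = p1
    best_c, best_pose = -1.0, p1                 # 4.2) C initialized to -1
    for _ in range(iters):
        _, l0 = simulate_scan((x, y, a), segments)
        c0 = unbiased_rate(l_real, l0, zeta)     # C_1 at the current pose
        if c0 > best_c:
            best_c, best_pose = c0, (x, y, a)    # track C_max and p_k
        # 4.3) one-step offsets in the three pose directions
        _, lx = simulate_scan((x + step, y, a), segments)
        _, ly = simulate_scan((x, y + step, a), segments)
        _, la = simulate_scan((x, y, a + step_alpha), segments)
        cx = unbiased_rate(l_real, lx, zeta)
        cy = unbiased_rate(l_real, ly, zeta)
        ca = unbiased_rate(l_real, la, zeta)
        # 4.4) offsets dx = C_x - C_1 etc. applied directly to the pose,
        # as the text states (rate differences used as pose increments)
        x, y, a = x + (cx - c0), y + (cy - c0), a + (ca - c0)
    return best_pose, best_c
```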
step five: using the pose p after correction k And replacing the previously input simulated entity radar pose to obtain a new simulated laser radar point cloud set.
Positioning pose p of the most approximate robot obtained in the fourth step k Inputting a simulated laser radar model, generating simulated radar point cloud data based on the pose according to the third step, and generating new simulated radar point cloud data as shown in fig. 9.
At p k The comparison between the actual radar and the simulated radar point cloud data is shown in fig. 10, wherein a line a represents the actual laser radar point cloud, a second line B represents the new simulated radar point cloud, and a third line C represents the difference between the two, as can be seen from fig. 10, the offset between the two is greatly reduced, which indicates that the robot pose is corrected.
Step six: judging whether a target exists according to the difference between the points of the entity laser radar point cloud set and the new simulated laser radar point cloud set at the same scanning angle.
6.1) Subtract the points of the entity laser radar point cloud set and the new simulated laser radar point cloud set at the same scanning angle, as shown by line C in fig. 10, and compare the difference with the set threshold ζ:

if the distance difference l between n consecutive points in the entity laser radar point cloud data map and the corresponding points in the simulated laser radar point cloud data map is larger than the set threshold ζ, as at the arrow in fig. 10, the entity laser radar has detected a target; execute 6.2);

otherwise, no target is detected;
6.2) The mean value of $\beta_n$ over the n adjacent points is recorded as

$$\bar{\beta} = \frac{1}{n}\sum_{i=1}^{n}\beta_i,$$

the mean distance of the n adjacent points is recorded as

$$\bar{l} = \frac{1}{n}\sum_{i=1}^{n}l_{1i},$$

and, taking the position of the entity laser radar as the origin of a polar coordinate system, the polar coordinate of the target is obtained as $(\bar{\beta},\ \bar{l})$.
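A sketch of the step-six decision rule (the run-length parameter n_min and the function name are our assumptions; the patent only requires "n consecutive points"):

```python
import numpy as np

def detect_targets(beta, l_real, l_sim, zeta, n_min=3):
    """Report one target per run of >= n_min consecutive beams whose
    real/simulated range difference exceeds zeta; each target is the
    polar point (mean beta, mean real range) of its run."""
    beta, l_real, l_sim = map(np.asarray, (beta, l_real, l_sim))
    over = np.abs(l_real - l_sim) > zeta
    targets, i = [], 0
    while i < over.size:
        if not over[i]:
            i += 1
            continue
        j = i
        while j < over.size and over[j]:
            j += 1                      # extend the run of outlier beams
        if j - i >= n_min:              # n consecutive points above zeta
            targets.append((beta[i:j].mean(), l_real[i:j].mean()))
        i = j
    return targets  # [(beta_bar, l_bar)] in the radar polar frame
```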
The foregoing description is only an example of the present invention and is not intended to limit the invention; it will be apparent to those skilled in the art that various changes and modifications in form and detail may be made without departing from the spirit and scope of the invention.

Claims (7)

1. A method for detecting a dynamic target by a robot based on a laser radar, characterized by comprising the following implementation steps:

Step one: a laser scanning radar at the robot center collects and generates a point cloud set $[\beta_n, l_{1n}]$, and the robot's positioning unit gives the robot positioning pose (x, y, α); $\beta_n$ is the scanning angle of the laser radar, the angle range covering the radar's full scanning plane, and $l_{1n}$ is the distance of the detected target from the robot at the corresponding angle; with $\beta_n$ as abscissa and $l_{1n}$ as ordinate, an entity radar point cloud data map is established;

Step two: establishing a vector scene map according to the scene, representing obstacle blocks in the scene map and the peripheral edges of the scene by closed line segments, and obtaining the start and end coordinates of all the line segments;

Step three: establishing a simulated laser radar model, inputting the pose of the entity laser radar, and solving, by an intercept traversal method, the point cloud set $[\beta_n, l_{2n}]$ formed by the intersection points of all rays emitted by the simulated laser radar with the map edges and the obstacle blocks, obtaining a radar simulation graph;

Step four: correcting the robot positioning pose by non-gradient optimization, and using the corrected pose $p_k$ to replace the previously input entity radar pose in the simulation, obtaining a new simulated laser radar point cloud set and point cloud data map;

Step five: using the corrected pose $p_k$ to replace the previously input entity radar pose, obtaining a new simulated laser radar point cloud set;

Step six: judging whether a target exists according to the difference between the points of the entity laser radar point cloud set and the new simulated laser radar point cloud set at the same scanning angle.
2. The method for dynamic target detection by a laser-radar-based robot according to claim 1, wherein: the laser scanning radar used in step one is a Hokuyo UTM-30LX, with scanning angle θ = 270°, angular resolution λ = 0.25°, scanning range 0.1–30 m, and scanning frequency 40 Hz.
3. The method for dynamic target detection by a laser-radar-based robot according to claim 2, wherein: the point cloud data of the Hokuyo UTM-30LX are $(\beta_n, l_n)$, where $\beta_n$ and $l_n$ are the angle and distance of the n-th ray emitted by the entity laser radar relative to the polar coordinate system of the laser radar body; the angle $\beta_n$ lies in the range

$$\beta_n \in \left[-\frac{\theta}{2},\ \frac{\theta}{2}\right];$$

the radar is arranged at the center of the robot, so the robot pose is the pose of the laser scanning radar; it is generated by Monte Carlo localization or provided by a UWB or GPS system; assuming the pose is (x, y, α), x and y are the position of the entity laser radar in the scene coordinate system and α is the angle between the center line of the entity laser radar and the x axis of the map coordinate system.
4. The method for dynamic target detection by a laser-radar-based robot according to claim 1, wherein step three specifically comprises the following steps:

3.1) establish polar coordinates with the coordinate origin of the vector map and the x axis;

3.2) assume the entity radar coordinates are $(x_0, y_0)$ and the angle between the center line of the entity radar and the x-axis direction is α degrees; with a radar scanning range of θ degrees and a radar angular resolution of λ degrees, the simulated radar emits n rays in total:

$$n = \frac{\theta}{\lambda};$$

these rays are emitted from the radar coordinate point at intervals of λ degrees, covering the range $\left[\alpha - \frac{\theta}{2},\ \alpha + \frac{\theta}{2}\right]$ centered on α; each ray finally intersects an obstacle block or a line segment around the scene, and the intersection points are the point cloud coordinates $(x_i, y_i)$ obtained by the simulated radar scan;

3.3) each intersection point is found by the vector method:

3.3.1) treat the ray and each edge contour of the obstacle blocks or scene periphery as vectors; the angle γ between the i-th ray and the horizontal direction satisfies

$$\gamma = \alpha - \frac{\theta}{2} + (i-1)\lambda;$$

3.3.2) solve for the intersection point $(x_i, y_i)$ of the ray and the edge contour line in the two-dimensional plane through the constraint that the two straight lines intersect: from the result of 3.3.1), the direction vector of the i-th ray is $(\cos\gamma, \sin\gamma)$ and the ray starting point is $(x_0, y_0)$, so $(x_0 + \cos\gamma,\ y_0 + \sin\gamma)$ is also a point on the ray; with the start and end points of the edge contour denoted $(x_s, y_s)$ and $(x_e, y_e)$, the intersection point satisfies both line equations

$$\frac{x_i - x_0}{\cos\gamma} = \frac{y_i - y_0}{\sin\gamma}, \qquad \frac{x_i - x_s}{x_e - x_s} = \frac{y_i - y_s}{y_e - y_s},$$

from which the intersection coordinates $(x_i, y_i)$ and the distance from the simulated radar coordinate point to the obstacle,

$$l_\beta = \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2},$$

are calculated;

3.4) record γ and $l_\beta$ for all n rays to obtain the simulated laser radar point cloud set.
5. The method for dynamic target detection by a laser-radar-based robot according to claim 1, wherein step four specifically comprises the following steps:

4.1) assume the true pose of the radar in the map coordinate system is $(x_0, y_0, \alpha_0)$; this pose is unknown but really exists; the pose obtained by the robot through the positioning unit or Monte Carlo self-localization is $(x_1, y_1, \alpha_1)$ and, because of sensor noise, deviates more or less from the true pose of the robot; define the unbiased rate

$$C = f_C(L, L') = \frac{N_u}{N_t},$$

where L is the entity laser radar point cloud set, L' is the simulated laser radar point cloud set, $N_t$ is the total number of points in the laser radar point cloud ($N_t = 1080$ in this example), $N_u$ is the number of points whose distance difference l satisfies l ≤ ζ, and ζ is the set threshold;

4.2) initialize the unbiased rate C = −1; let the pose obtained by the robot's positioning at some moment be $p_1 = (x_1, y_1, \alpha_1)$, where $x_1, y_1$ are the position of the entity laser radar in the scene coordinate system and $\alpha_1$ is the angle between the radar center line and the x-axis direction at that moment, and let the point cloud set scanned by the entity laser radar be $L_{1n} = [\beta_n, l_{1n}]$; taking this pose as the initial value, compute the simulated radar point cloud based on $p_1$, $L_{2n} = [\beta_n, l_{2n}] = f(x_1, y_1, \alpha_1)$, and calculate the unbiased rate at this time according to the formula in 4.1): $C_1 = f_C(L_{1n}, L_{2n})$;

4.3) with a set step size step, offset $x_1, y_1, \alpha_1$ by one step each to obtain the simulated laser radar point cloud sets offset in the three directions,

$$L_x = f(x_1 + \mathrm{step},\ y_1,\ \alpha_1), \quad L_y = f(x_1,\ y_1 + \mathrm{step},\ \alpha_1), \quad L_\alpha = f(x_1,\ y_1,\ \alpha_1 + \mathrm{step}),$$

with corresponding unbiased rates in the three directions

$$C_x = f_C(L_{1n}, L_x), \quad C_y = f_C(L_{1n}, L_y), \quad C_\alpha = f_C(L_{1n}, L_\alpha);$$

4.4) take as offsets $dx = C_x - C_1$, $dy = C_y - C_1$, $d\alpha = C_\alpha - C_1$, obtaining the new positioning pose $p_2 = (x_2, y_2, \alpha_2) = (x_1 + dx,\ y_1 + dy,\ \alpha_1 + d\alpha)$;

4.5) repeat 4.3) and 4.4) a total of 200 times and record the maximum unbiased rate $C_{max}$ obtained and the corresponding pose $p_k = (x_k, y_k, \alpha_k)$, where $x_k, y_k$ are the position of the simulated radar in the scene coordinate system when $C_{max}$ is obtained and $\alpha_k$ is the angle between the radar center line and the x axis at that time; $p_k$ is the pose that best approximates the true pose of the robot.
6. The method for dynamic target detection by a laser-radar-based robot according to claim 1, wherein step five specifically comprises the following further steps:

the positioning pose $p_k$ that best approximates the robot, obtained in step four, is input into the simulated laser radar model, and simulated radar point cloud data based on this pose are generated according to step three, yielding new simulated radar point cloud data;

at $p_k$ the entity radar and simulated radar point cloud data are compared, with line A the entity laser radar point cloud, line B the new simulated radar point cloud, and line C the difference between the two; the robot pose is thereby corrected.
7. The method for dynamic target detection by a laser-radar-based robot according to claim 1, wherein step six specifically comprises the following further steps:

6.1) subtract the points of the entity laser radar point cloud set and the new simulated laser radar point cloud set at the same scanning angle, and compare the difference with the set threshold ζ:

if the distance difference l between n consecutive points in the laser scanning radar point cloud data map and the corresponding points in the simulated laser radar point cloud data map is larger than the set threshold ζ, the laser scanning radar has detected a target; execute 6.2);

otherwise, no target is detected;

6.2) the mean value of $\beta_n$ over the n adjacent points is recorded as

$$\bar{\beta} = \frac{1}{n}\sum_{i=1}^{n}\beta_i,$$

the mean distance of the n adjacent points is recorded as

$$\bar{l} = \frac{1}{n}\sum_{i=1}^{n}l_{1i},$$

and, taking the position of the entity laser radar as the origin of a polar coordinate system, the polar coordinate of the target is obtained as $(\bar{\beta},\ \bar{l})$.
CN202010310030.8A 2020-04-20 2020-04-20 Dynamic target detection method of robot based on laser radar Active CN111522022B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010310030.8A CN111522022B (en) 2020-04-20 2020-04-20 Dynamic target detection method of robot based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010310030.8A CN111522022B (en) 2020-04-20 2020-04-20 Dynamic target detection method of robot based on laser radar

Publications (2)

Publication Number Publication Date
CN111522022A CN111522022A (en) 2020-08-11
CN111522022B (en) 2023-03-28

Family

ID=71903379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010310030.8A Active CN111522022B (en) 2020-04-20 2020-04-20 Dynamic target detection method of robot based on laser radar

Country Status (1)

Country Link
CN (1) CN111522022B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112433211B (en) * 2020-11-27 2022-11-29 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
CN112732849B (en) * 2020-12-14 2022-09-27 北京航空航天大学 High-precision vector map compression method based on polar coordinate system
CN114743449A (en) * 2020-12-23 2022-07-12 北醒(北京)光子科技有限公司 Thing networking teaching aid based on laser radar
CN113359151B (en) * 2021-08-09 2021-11-23 浙江华睿科技股份有限公司 Robot task point positioning method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018182538A1 (en) * 2017-03-31 2018-10-04 Agency For Science, Technology And Research Systems and methods that improve alignment of a robotic arm to an object
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018182538A1 (en) * 2017-03-31 2018-10-04 Agency For Science, Technology And Research Systems and methods that improve alignment of a robotic arm to an object
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lidar point cloud fusion method based on an inertial measurement unit; Zhang Yanguo et al.; Journal of System Simulation (《系统仿真学报》); 2018-11-08 (No. 11); full text *

Also Published As

Publication number Publication date
CN111522022A (en) 2020-08-11

Similar Documents

Publication Publication Date Title
CN111522022B (en) Dynamic target detection method of robot based on laser radar
CN110645974B (en) Mobile robot indoor map construction method fusing multiple sensors
WO2021189468A1 (en) Attitude correction method, apparatus and system for laser radar
Nieto et al. Recursive scan-matching SLAM
Kümmerle et al. Large scale graph-based SLAM using aerial images as prior information
US8831778B2 (en) Method of accurate mapping with mobile robots
CN111693053B (en) Repositioning method and system based on mobile robot
Loevsky et al. Reliable and efficient landmark-based localization for mobile robots
CN108332752B (en) Indoor robot positioning method and device
CN112396656B (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
Zhao et al. Prediction-based geometric feature extraction for 2D laser scanner
JP6649743B2 (en) Matching evaluation device and matching evaluation method
Großmann et al. Robust mobile robot localisation from sparse and noisy proximity readings using Hough transform and probability grids
CN110736456A (en) Two-dimensional laser real-time positioning method based on feature extraction in sparse environment
CN113902828A (en) Construction method of indoor two-dimensional semantic map with corner as key feature
Donoso-Aguirre et al. Mobile robot localization using the Hausdorff distance
CN115661252A (en) Real-time pose estimation method and device, electronic equipment and storage medium
Jae-Bok Mobile robot localization using range sensors: Consecutive scanning and cooperative scanning
CN115453549A (en) Method for extracting environment right-angle point coordinate angle based on two-dimensional laser radar
CN111915632B (en) Machine learning-based method for constructing truth database of lean texture target object
Chang et al. Robust accurate LiDAR-GNSS/IMU self-calibration based on iterative refinement
Szaj et al. Vehicle localization using laser scanner
Gao et al. A rapid recognition of impassable terrain for mobile robots with low cost range finder based on hypotheses testing theory
Pyo et al. Development of radial layout underwater acoustic marker using forward scan sonar for AUV
Torres-Torriti et al. Scan-to-map matching using the Hausdorff distance for robust mobile robot localization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant