CN111678516A - Bounded region rapid global positioning method based on laser radar - Google Patents

Bounded region rapid global positioning method based on laser radar

Info

Publication number
CN111678516A
CN111678516A (application CN202010380244.2A)
Authority
CN
China
Prior art keywords
map
robot
global
positioning
distance
Prior art date
Legal status
Granted
Application number
CN202010380244.2A
Other languages
Chinese (zh)
Other versions
CN111678516B (en)
Inventor
张辉
吴海欣
林军记
文梓豪
熊弘俊
崔彦
Current Assignee
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN202010380244.2A priority Critical patent/CN111678516B/en
Publication of CN111678516A publication Critical patent/CN111678516A/en
Application granted granted Critical
Publication of CN111678516B publication Critical patent/CN111678516B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/46 - Indirect determination of position data
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a fast global positioning method for a lidar in a bounded region. Coarse positioning of the map region in which the robot is located is achieved by constructing and matching primary feature vectors of the global maps and of the local map. Secondary feature vectors of the identified global map and of the local map are then constructed to determine the robot's position and azimuth angle precisely. Because a convex-polygon geometric map is used as the map representation and localization is performed by matching map feature vectors, the method is unaffected by illumination, has low algorithmic complexity, and achieves fast global positioning of the robot within a bounded area.

Description

Bounded region rapid global positioning method based on laser radar
Technical Field
The invention relates to the technical field of indoor positioning of mobile robots, and in particular to a rapid global positioning method for a lidar in a bounded area.
Background
At present, indoor positioning at home and abroad mainly relies on technologies such as radio frequency identification (RFID), Wi-Fi, ZigBee, ultra-wideband (UWB), inertial navigation, ultrasound, laser, infrared and Bluetooth. These technologies differ in their sensors and data transmission modes, while the underlying positioning algorithm (proximity detection, multilateration, hyperbolic positioning, triangulation, fingerprint positioning, dead reckoning and the like) is the key factor determining positioning performance. At the current state of the art, each positioning technology has its own advantages and limitations, and no universal technology simultaneously satisfies the requirements of positioning accuracy, cost and power consumption. Since different positioning algorithms suit different technologies, researchers continue to seek reliable and stable combinations of positioning technology and algorithm that are optimal in indoor positioning accuracy, power consumption and other respects.
On the other hand, indoor positioning is an indispensable key technology for robots, and can be divided into two categories: position tracking and global positioning. Position tracking locates a moving robot given a known initial pose and an unknown map, as in laser SLAM and visual SLAM. SLAM-based positioning, however, inevitably suffers from accumulated error that grows with the distance travelled by the robot. Global positioning, in contrast, must recover the robot's position at the current moment on a known map without any prior knowledge of the initial pose. Global positioning on a known map differs from SLAM in that the environmental point cloud acquired at the current moment is matched against a pre-built map, so no error accumulates. Global positioning is nevertheless more difficult and challenging: camera-based global positioning is affected by illumination, algorithmically complex, and not highly accurate.
Disclosure of Invention
The invention provides a fast global positioning method for a lidar in a bounded area, aiming to overcome the technical defects of existing global positioning technology, namely susceptibility to ambient light, algorithmic complexity and low positioning accuracy.
In order to solve the technical problems, the technical scheme of the invention is as follows:
A fast global positioning method for a lidar in a bounded region comprises the following steps:
S1: acquiring known data of a plurality of convex-polygon maps and constructing the primary feature vector of each global map;
S2: scanning the boundary of the map in which the robot is located with a lidar mounted on the robot, and preprocessing the point cloud data acquired by the lidar;
S3: extracting features from the preprocessed point cloud data and constructing the primary feature vector of the local map;
S4: matching the primary feature vectors of the global maps against the primary feature vector of the local map, and coarsely determining the map region in which the robot is located;
S5: constructing, for the map in which the robot is located, the secondary feature vector of the global map and the secondary feature vector of the local map;
S6: matching the secondary feature vector of the global map against that of the local map, outputting the finely positioned position and direction angle of the robot, and completing the positioning of the robot.
In step S1, the primary feature vectors of the multiple global maps are constructed as follows: with the polygonal maps and the ground coordinate system known, features are first extracted from each global map in the ground coordinate system, and the global coordinates of the k vertices of the ith polygonal map are substituted into the barycentric coordinate formulas:

X_ig = [Σ_{j=1}^{k} D_ij · (X(ij) + X(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (1)

Y_ig = [Σ_{j=1}^{k} D_ij · (Y(ij) + Y(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (2)
where X(ij), Y(ij) denote the global coordinates of the jth vertex of the ith polygonal map, and D_ij denotes the Euclidean distance from the jth vertex to the next vertex. Next, after the barycentric coordinates (X_ig, Y_ig) of the ith polygonal map are extracted, the distance L_imax and direction angle α_imax from the barycenter to the farthest boundary point, the distance L_imin and direction angle α_imin from the barycenter to the nearest boundary point, and, for each jth direction angle α_ij between α_imax and α_imin, the distance L_ij from the barycenter to the boundary point in that direction are further extracted. Finally, the primary feature vector of the ith polygonal map is constructed:

V_i = [X_ig, Y_ig, L_imax, α_imax, L_imin, α_imin, L_i1, α_i1, L_i2, α_i2, L_i3, α_i3, ...]^T    (3)

The primary feature vectors of all global maps are stored in a database in advance for subsequent positioning.
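The construction of a primary feature vector can be sketched in code. The following Python/NumPy sketch assumes the perimeter-weighted barycenter reconstruction of formulas (1)-(2) above and stores the remaining barycenter-to-point distance/angle pairs in boundary order; all function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def primary_feature_vector(boundary):
    """Primary feature vector of one convex-polygon map, cf. formulas (1)-(3).

    boundary: (k, 2) array of boundary points (map vertices, or scan points
    for the local map), given in boundary order.
    """
    P = np.asarray(boundary, dtype=float)
    nxt = np.roll(P, -1, axis=0)                       # next point along the boundary
    D = np.linalg.norm(nxt - P, axis=1)                # edge lengths D_ij
    mid = 0.5 * (P + nxt)                              # edge midpoints
    Xg, Yg = (D[:, None] * mid).sum(axis=0) / D.sum()  # barycenter, formulas (1)-(2)

    rel = P - np.array([Xg, Yg])
    L = np.linalg.norm(rel, axis=1)                    # barycenter-to-point distances
    a = np.arctan2(rel[:, 1], rel[:, 0])               # direction angles
    jmax, jmin = int(L.argmax()), int(L.argmin())

    # Remaining (L_ij, a_ij) pairs in boundary order, after the two extremes.
    rest = [(L[j], a[j]) for j in range(len(P)) if j not in (jmax, jmin)]
    return np.array([Xg, Yg, L[jmax], a[jmax], L[jmin], a[jmin]]
                    + [x for pair in rest for x in pair])
```

The same routine serves both the global maps (step S1) and, later, the scanned local map (step S3), since formula (8) has the same structure as formula (3).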
In step S2, the lidar scans the map boundary and the point cloud data are preprocessed as follows:
While the lidar is inside a map region, it scans the boundary of the map in real time and acquires n groups of discrete point cloud data on the boundary, namely the distances and angles from n boundary points to the radar. Most lidars, however, have a bounded ranging interval; when the radar is too close to or too far from the map boundary, anomalies occur, such as the distance of some boundary points reading 0, so anomalous data are first identified and removed. The systematic and noise errors of time-of-flight ranging must also be considered and corrected. After error correction, because the lidar delivers the point cloud in polar coordinates while subsequent positioning requires rectangular coordinates, the collected distance and angle data of the boundary points are converted to rectangular coordinates in the robot coordinate system.
In step S2, the distances and angles from the n boundary points to the radar are denoted R = [r(1), r(2), ..., r(n)] and Ω = [ω(1), ω(2), ..., ω(n)], respectively. The coordinate transformation of the collected boundary-point distance and angle data is:

x(i) = r(i) cos ω(i),  y(i) = r(i) sin ω(i)    (4)

where x(i), y(i) are the abscissa and ordinate of the ith scanned boundary point in the robot coordinate system, r(i) is the distance from that point to the lidar, and ω(i) is its angle.
In step S2, the lidar is a two-dimensional single-line lidar.
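A minimal preprocessing sketch under the assumptions above: anomalous returns (zero or out-of-range distances) are dropped, a crude systematic correction is applied, and the surviving polar readings are converted with formula (4). The thresholds r_min, r_max and the range_bias parameter are illustrative assumptions, not values from the patent.

```python
import numpy as np

def preprocess_scan(r, omega, r_min=0.05, r_max=12.0, range_bias=0.0):
    """Step S2: reject anomalies, correct errors, convert to Cartesian.

    r, omega: (n,) distances and angles from the 2-D lidar (angles in rad).
    r_min, r_max, range_bias: illustrative sensor parameters.
    Returns an (m, 2) array of boundary points in the robot frame.
    """
    r = np.asarray(r, dtype=float)
    omega = np.asarray(omega, dtype=float)
    valid = np.isfinite(r) & (r > r_min) & (r < r_max)  # drop 0 / out-of-range hits
    r, omega = r[valid] - range_bias, omega[valid]      # crude systematic correction
    x = r * np.cos(omega)                               # formula (4)
    y = r * np.sin(omega)
    return np.column_stack([x, y])
```

A real implementation would replace the single bias term with a proper calibration of the systematic and noise errors the patent mentions.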
Step S3 is specifically as follows: the abscissa x(i) and ordinate y(i) of the boundary points computed in step S2 are substituted into formulas (5)-(7) to solve for the barycentric coordinates (x_g, y_g) of the local map in the robot coordinate system:

x_g = [Σ_{i=1}^{n} d_i · (x(i) + x(i+1))/2] / [Σ_{i=1}^{n} d_i]    (5)

y_g = [Σ_{i=1}^{n} d_i · (y(i) + y(i+1))/2] / [Σ_{i=1}^{n} d_i]    (6)

d_i = √[(x(i+1) − x(i))² + (y(i+1) − y(i))²]    (7)

where d_i denotes the distance between two adjacent boundary points of the local map, i = 1, 2, ..., n (indices cyclic, point n+1 being point 1). The distance l_max and direction angle β_max from the barycenter of the local map to the farthest point, the distance l_min and direction angle β_min from the barycenter to the nearest point, and, for each ith direction angle β_i between β_max and β_min, the distance l_i from the barycenter to the boundary point in that direction are then extracted. With the feature information of the local map extracted, the primary feature vector v of the local map in the robot coordinate system is constructed:

v = [x_g, y_g, l_max, β_max, l_min, β_min, l_1, β_1, l_2, β_2, l_3, β_3, ...]^T    (8)
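Continuing the two sketches above, the local primary feature vector of formula (8) follows from applying the same geometry to the preprocessed scan points. This is a hypothetical call sequence; note that for the matching in step S4 the global and local vectors must be brought to a common length, for example by sampling the same number of directions.

```python
points = preprocess_scan(r, omega)    # boundary points in the robot frame (S2)
v = primary_feature_vector(points)    # [x_g, y_g, l_max, b_max, l_min, b_min, ...] (S3)
```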
Step S4 is specifically as follows:
Since the map regions are known and the primary feature vector of each map region is unique, the map in which the robot is located can be determined preliminarily by matching the primary feature vectors of the global maps against that of the local map. First, the first-level matching difference between the local map and each global map is computed as Δ_i = |V_i − v|; the differences are compared and the map with the smallest difference, {m : |V_m − v| = min_i |V_i − v|}, is identified. The robot is thus preliminarily determined to be in the mth map, the localization search range is narrowed, and coarse positioning of the map region in which the robot is located is completed.
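Read literally, the first-level matching difference Δ_i = |V_i − v| is a norm of the vector difference; a sketch of the coarse search under that reading:

```python
import numpy as np

def coarse_localize(global_vectors, v):
    """Step S4: return the index m of the map whose primary feature
    vector V_m minimises |V_i - v|.

    global_vectors: primary feature vectors of the known maps, assumed
    already brought to the same length as the local vector v.
    """
    deltas = [np.linalg.norm(np.asarray(V) - np.asarray(v)) for V in global_vectors]
    m = int(np.argmin(deltas))
    return m, deltas[m]
```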
Step S5 is specifically as follows:
After step S4 has preliminarily identified, via primary feature vector matching, the mth map as the region in which the robot is located, the secondary feature vector V′ of that global map and the secondary feature vector v′ of the local map are constructed:

V′ = [X_mg, Y_mg, L_mmax, α_mmax, L_mmin, α_mmin]^T    (9)

v′ = [x_g, y_g, l_max, β_max, l_min, β_min]^T    (10)

where X_mg, Y_mg are the coordinates of the boundary barycenter of the mth map in the ground coordinate system; L_mmax, α_mmax are the distance and direction angle from the barycenter to the farthest point of the map; and L_mmin, α_mmin are the distance and direction angle from the barycenter to the nearest point of the map.
Step S6 is specifically as follows:
The secondary feature vector V′ of the global map and the secondary feature vector v′ of the local map constructed in step S5 are compared and matched directly, which realizes fast global positioning of the robot in the bounded region and reduces the computational load of the algorithm. The output pose of the robot is:

P = [X_r, Y_r, θ]^T    (11)

Further, the direction angle of the robot is calculated as:

θ = α_mmax − β_max    (12)

Then the barycentric coordinates (X_mg, Y_mg) in the secondary feature vector of the global map and the barycentric coordinates (x_g, y_g) of the local map are substituted into the robot position formula to obtain the robot's position (X_r, Y_r) in the ground coordinate system:

X_r = X_mg − (x_g cos θ − y_g sin θ)
Y_r = Y_mg − (x_g sin θ + y_g cos θ)    (13)

thereby completing the positioning of the robot.
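Formulas (11)-(13) appear only as images in the original document; the sketch below encodes their natural reading, namely that the heading is the difference between the global and local direction angles of the farthest boundary point, and the position follows by subtracting the heading-rotated barycenter offset from the global barycenter. Treat it as an assumption-laden reconstruction, not the patent's verbatim formulas.

```python
import numpy as np

def fine_localize(V2, v2):
    """Step S6: pose [X_r, Y_r, theta] from the secondary feature vectors.

    V2 = [X_mg, Y_mg, L_mmax, a_mmax, L_mmin, a_mmin]   (global, formula (9))
    v2 = [x_g,  y_g,  l_max,  b_max,  l_min,  b_min]    (local,  formula (10))
    """
    X_mg, Y_mg, _, a_max, _, _ = V2
    x_g, y_g, _, b_max, _, _ = v2
    theta = a_max - b_max                 # reading of formula (12)
    c, s = np.cos(theta), np.sin(theta)
    X_r = X_mg - (c * x_g - s * y_g)      # reading of formula (13):
    Y_r = Y_mg - (s * x_g + c * y_g)      # map barycenter minus rotated offset
    return X_r, Y_r, theta
```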
Compared with the prior art, the technical solution of the invention has the following beneficial effects:
In the fast global positioning method for a lidar in a bounded region provided by the invention, coarse positioning of the map region in which the robot is located is achieved by constructing and matching the primary feature vectors of the global maps and the local map; secondary feature vectors of the identified global map and the local map are then constructed to determine the robot's position and azimuth angle precisely. The method is unaffected by illumination, uses a convex-polygon geometric map as the map representation, yields a lidar localization algorithm based on map feature vector matching with reduced complexity, and achieves fast global positioning of the robot within a bounded area.
Drawings
FIG. 1 is a schematic flow diagram of the method of the present invention;
FIG. 2 illustrates the coordinate-system transformation relationships of the present invention: (a) the global map in the ground coordinate system; (b) the local map in the vehicle coordinate system;
FIG. 3 shows (a) a schematic view of a polygonal map according to the present invention and (b) the corresponding primary feature vector diagram.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
Example 1
As shown in fig. 1, a fast global positioning method for a lidar in a bounded region comprises the following steps:
S1: acquiring known data of a plurality of convex-polygon maps and constructing the primary feature vector of each global map;
S2: scanning the boundary of the map in which the robot is located with a lidar mounted on the robot, and preprocessing the point cloud data acquired by the lidar;
S3: extracting features from the preprocessed point cloud data and constructing the primary feature vector of the local map;
S4: matching the primary feature vectors of the global maps against the primary feature vector of the local map, and coarsely determining the map region in which the robot is located;
S5: constructing, for the map in which the robot is located, the secondary feature vector of the global map and the secondary feature vector of the local map;
S6: matching the secondary feature vector of the global map against that of the local map, outputting the finely positioned position and direction angle of the robot, and completing the positioning of the robot.
In a specific implementation, the method achieves coarse positioning of the map region in which the robot is located by constructing and matching the primary feature vectors of the global maps and the local map; it then constructs, for the identified map, the secondary feature vectors of the global map and the local map to determine the robot's position and azimuth angle precisely. The influence of illumination on positioning accuracy is thereby avoided, a convex-polygon geometric map serves as the map representation, a lidar localization algorithm based on map feature vector matching is obtained with reduced complexity, and the robot is quickly and globally positioned within the bounded area.
Example 2
More specifically, building on embodiment 1 and as shown in fig. 2, the transformation between coordinate systems and the vehicle direction angle are defined as follows: OX_G Y_G is the ground coordinate system and OX_V Y_V is the vehicle coordinate system. The 0° direction of the lidar is defined as the positive direction of the longitudinal axis of the vehicle coordinate system, and the vehicle direction angle is defined as the included angle between the lidar 0° direction and the positive direction of the transverse axis of the ground coordinate system. The maps of this embodiment are four-sided convex polygonal maps.
More specifically, step S1 is as follows: features are first extracted from the several known maps (i.e., the global maps) in the ground coordinate system, and the coordinates of the k = 4 vertices of each convex polygonal map are substituted into the barycentric coordinate formulas:

X_ig = [Σ_{j=1}^{k} D_ij · (X(ij) + X(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (1)

Y_ig = [Σ_{j=1}^{k} D_ij · (Y(ij) + Y(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (2)

where X(ij), Y(ij) (i = 1, 2, 3, ..., k) denote the global coordinates of the jth vertex of the ith polygonal map, and D_ij denotes the Euclidean distance from the jth vertex of the map to its next vertex.
After the barycentric coordinates (X_ig, Y_ig) of the ith polygonal map are extracted, the distance L_imax and direction angle α_imax from the barycenter to the farthest point, the distance L_imin and direction angle α_imin from the barycenter to the nearest point, and, for each jth direction angle α_ij between α_imax and α_imin, the distance L_ij from the barycenter to the boundary point in that direction are further extracted. Finally, the primary feature vector of each known polygonal map is constructed as:

V_i = [X_ig, Y_ig, L_imax, α_imax, L_imin, α_imin, L_i1, α_i1, L_i2, α_i2]^T    (3)

and stored in a database in advance for subsequent positioning. The primary feature vector of one of the convex polygonal maps is shown in fig. 3.
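For illustration, the S1 sketch given earlier can be applied to a hypothetical four-sided convex map; the coordinates below are invented for the example and do not come from the patent.

```python
# Hypothetical 4-vertex convex map in the ground frame (example data only).
quad = [(0.0, 0.0), (6.0, 0.0), (7.0, 4.0), (1.0, 5.0)]
V1 = primary_feature_vector(quad)   # from the sketch after step S1
print(V1[:2])                       # barycentric coordinates (X_1g, Y_1g)
print(V1[2:4])                      # L_1max and direction angle a_1max
```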
More specifically, in step S2, the lidar scans the map boundary and the point cloud data are preprocessed as follows:
When a two-dimensional lidar with an angular resolution of 0.5° and a scanning range of 360° is located inside one of the map regions, it scans the boundary of that map in real time and acquires n = 720 (i.e., 360°/0.5°) groups of discrete point cloud data on the boundary, comprising the distances r(1), r(2), ..., r(n) and angles ω(1), ω(2), ..., ω(n) from the n boundary points to the radar. Most lidars, however, have a bounded ranging interval; when the radar is too close to or too far from the map boundary, the distance data of some boundary points become anomalous (e.g., read 0), so anomalous data are identified and removed first. Note that time-of-flight ranging also carries errors, both systematic and noise-related, so besides removing anomalous data, the errors of the point cloud data must be corrected.
After error correction, because the lidar delivers the point cloud in polar coordinates while subsequent positioning requires rectangular coordinates, the collected distance and angle data of the boundary points are converted to rectangular coordinates in the vehicle coordinate system to construct the local map. The coordinate transformation is:

x(i) = r(i) cos ω(i),  y(i) = r(i) sin ω(i)    (4)

where x(i), y(i) are the abscissa and ordinate of the ith scanned boundary point in the robot coordinate system, r(i) is the distance from that point to the lidar, and ω(i) is its angle.
More specifically, in step S2, the lidar is a two-dimensional single-line lidar.
More specifically, step S3 is as follows: the abscissa x(i) and ordinate y(i) of the boundary points computed in step S2 are substituted into formulas (5)-(7) to solve for the barycentric coordinates (x_g, y_g) of the local map in the robot coordinate system:

x_g = [Σ_{i=1}^{n} d_i · (x(i) + x(i+1))/2] / [Σ_{i=1}^{n} d_i]    (5)

y_g = [Σ_{i=1}^{n} d_i · (y(i) + y(i+1))/2] / [Σ_{i=1}^{n} d_i]    (6)

d_i = √[(x(i+1) − x(i))² + (y(i+1) − y(i))²]    (7)

where d_i denotes the distance between two adjacent boundary points of the local map, i = 1, 2, ..., n (indices cyclic, point n+1 being point 1). The distance l_max and direction angle β_max from the barycenter of the local map to the farthest point, the distance l_min and direction angle β_min from the barycenter to the nearest point, and, for each ith direction angle β_i between β_max and β_min, the distance l_i from the barycenter to the boundary point in that direction are then extracted. With the feature information of the local map extracted, the primary feature vector v of the local map in the robot coordinate system is constructed:

v = [x_g, y_g, l_max, β_max, l_min, β_min, l_1, β_1, l_2, β_2, l_3, β_3, ...]^T    (8)
more specifically, the step S4 specifically includes:
under the condition that a plurality of map areas are known, the primary feature vector of each map area is unique, so that the map in which the robot is located can be preliminarily determined through the matching of the primary feature vectors of the global map and the local map; firstly, calculating the first-level matching difference value of the local map and each global mapi=|Vi-V |, comparing the matching difference values and identifying the map { m | | V) corresponding to the minimum matching difference valuem-v|=min|ViAnd-v | }, preliminarily determining that the robot is in the mth map, reducing the positioning identification range, and completing the rough positioning of the map area where the robot is located.
More specifically, step S5 is as follows:
After step S4 has preliminarily identified, via primary feature vector matching, the mth map as the region in which the robot is located, the secondary feature vector V′ of that global map and the secondary feature vector v′ of the local map are constructed:

V′ = [X_mg, Y_mg, L_mmax, α_mmax, L_mmin, α_mmin]^T    (9)

v′ = [x_g, y_g, l_max, β_max, l_min, β_min]^T    (10)

where X_mg, Y_mg are the coordinates of the boundary barycenter of the mth map in the ground coordinate system; L_mmax, α_mmax are the distance and direction angle from the barycenter to the farthest point of the map; and L_mmin, α_mmin are the distance and direction angle from the barycenter to the nearest point of the map. The reconstructed secondary feature vectors can be matched directly to output the precise pose of the vehicle, which reduces the complexity of the algorithm.
More specifically, step S6 is as follows:
The secondary feature vector V′ of the global map and the secondary feature vector v′ of the local map constructed in step S5 are compared and matched directly, which realizes fast global positioning of the robot in the bounded region and reduces the computational load of the algorithm. The output pose of the robot is:

P = [X_r, Y_r, θ]^T    (11)

Further, the direction angle of the robot is calculated as:

θ = α_mmax − β_max    (12)

Then the barycentric coordinates (X_mg, Y_mg) in the secondary feature vector of the global map and the barycentric coordinates (x_g, y_g) of the local map are substituted into the robot position formula to obtain the robot's position (X_r, Y_r) in the ground coordinate system:

X_r = X_mg − (x_g cos θ − y_g sin θ)
Y_r = Y_mg − (x_g sin θ + y_g cos θ)    (13)

thereby completing the positioning of the robot.
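Chaining the sketches above gives an end-to-end picture of S1-S6. The helpers primary_feature_vector, preprocess_scan, coarse_localize and fine_localize are the illustrative ones defined earlier, and the common-length truncation used for the S4 match is one possible choice among several.

```python
def locate_robot(map_vertices, r, omega):
    """End-to-end sketch of steps S1-S6 for known convex-polygon maps.

    map_vertices: list of (k, 2) vertex arrays of the global maps.
    r, omega: raw distance/angle scan of the onboard 2-D lidar.
    """
    globals_v = [primary_feature_vector(m) for m in map_vertices]   # S1
    points = preprocess_scan(r, omega)                              # S2
    v = primary_feature_vector(points)                              # S3
    n = min(len(v), min(len(V) for V in globals_v))                 # common length
    m, _ = coarse_localize([V[:n] for V in globals_v], v[:n])       # S4
    V2, v2 = globals_v[m][:6], v[:6]                                # S5
    X_r, Y_r, theta = fine_localize(V2, v2)                         # S6
    return m, (X_r, Y_r, theta)
```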
It should be understood that the above-described embodiments of the present invention are merely examples given to illustrate the invention clearly, and do not limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaust all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the claims of the present invention.

Claims (9)

1. A fast global positioning method for a lidar in a bounded region, characterized by comprising the following steps:
S1: acquiring known data of a plurality of convex-polygon maps and constructing the primary feature vector of each global map;
S2: scanning the boundary of the map in which the robot is located with a lidar mounted on the robot, and preprocessing the point cloud data acquired by the lidar;
S3: extracting features from the preprocessed point cloud data and constructing the primary feature vector of the local map;
S4: matching the primary feature vectors of the global maps against the primary feature vector of the local map, and coarsely determining the map region in which the robot is located;
S5: constructing, for the map in which the robot is located, the secondary feature vector of the global map and the secondary feature vector of the local map;
S6: matching the secondary feature vector of the global map against that of the local map, outputting the finely positioned position and direction angle of the robot, and completing the positioning of the robot.
2. The method for fast global positioning of a lidar in a bounded region according to claim 1, wherein in step S1 the primary feature vectors of the multiple global maps are constructed as follows: with the polygonal maps and the ground coordinate system known, features are first extracted from each global map in the ground coordinate system, and the global coordinates of the k vertices of the ith polygonal map are substituted into the barycentric coordinate formulas:

X_ig = [Σ_{j=1}^{k} D_ij · (X(ij) + X(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (1)

Y_ig = [Σ_{j=1}^{k} D_ij · (Y(ij) + Y(i,j+1))/2] / [Σ_{j=1}^{k} D_ij]    (2)

where X(ij), Y(ij) denote the global coordinates of the jth vertex of the ith polygonal map, and D_ij denotes the Euclidean distance from the jth vertex to the next vertex; next, after the barycentric coordinates (X_ig, Y_ig) of the ith polygonal map are extracted, the distance L_imax and direction angle α_imax from the barycenter to the farthest point, the distance L_imin and direction angle α_imin from the barycenter to the nearest point, and, for each jth direction angle α_ij between α_imax and α_imin, the distance L_ij from the barycenter to the boundary point in that direction are further extracted; finally, the primary feature vector of the ith polygonal map is constructed:

V_i = [X_ig, Y_ig, L_imax, α_imax, L_imin, α_imin, L_i1, α_i1, L_i2, α_i2, L_i3, α_i3, ...]^T    (3)

and the primary feature vectors of the global maps are stored in a database in advance for subsequent positioning.
3. The method for fast global positioning of a lidar in a bounded region according to claim 2, wherein in step S2 the lidar scans the map boundary and the point cloud data are preprocessed as follows:
while the lidar is inside a map region, it scans the boundary of the map in real time and acquires n groups of discrete point cloud data on the boundary, namely the distances and angles from n boundary points to the radar; since most lidars have a bounded ranging interval, anomalies such as boundary-point distances reading 0 occur when the radar is too close to or too far from the map boundary, so anomalous data are first identified and removed; the systematic and noise errors of time-of-flight ranging are likewise considered and corrected; after error correction, because the lidar delivers the point cloud in polar coordinates while subsequent positioning requires rectangular coordinates, the collected distance and angle data of the boundary points are converted to rectangular coordinates in the robot coordinate system.
4. The method for fast global positioning of a lidar in a bounded region according to claim 3, wherein in step S2 the distances and angles from the n boundary points to the radar are denoted R = [r(1), r(2), ..., r(n)] and Ω = [ω(1), ω(2), ..., ω(n)], respectively; the coordinate transformation of the collected boundary-point distance and angle data is:

x(i) = r(i) cos ω(i),  y(i) = r(i) sin ω(i)    (4)

where x(i), y(i) are the abscissa and ordinate of the ith scanned boundary point in the robot coordinate system, r(i) is the distance from that point to the lidar, and ω(i) is its angle.
5. The method for fast global positioning of a lidar in a bounded region according to claim 3, wherein in step S2 the lidar is a two-dimensional single-line lidar.
6. The method for fast global positioning of a lidar in a bounded region according to claim 4, wherein step S3 specifically comprises: substituting the abscissa x(i) and ordinate y(i) of the boundary points computed in step S2 into formulas (5)-(7) to solve for the barycentric coordinates (x_g, y_g) of the local map in the robot coordinate system:

x_g = [Σ_{i=1}^{n} d_i · (x(i) + x(i+1))/2] / [Σ_{i=1}^{n} d_i]    (5)

y_g = [Σ_{i=1}^{n} d_i · (y(i) + y(i+1))/2] / [Σ_{i=1}^{n} d_i]    (6)

d_i = √[(x(i+1) − x(i))² + (y(i+1) − y(i))²]    (7)

where d_i denotes the distance between two adjacent boundary points of the local map, i = 1, 2, ..., n (indices cyclic, point n+1 being point 1); further extracting the distance l_max and direction angle β_max from the barycenter of the local map to the farthest point, the distance l_min and direction angle β_min from the barycenter to the nearest point, and, for each ith direction angle β_i between β_max and β_min, the distance l_i from the barycenter to the boundary point in that direction; after the feature information of the local map is thus extracted, constructing the primary feature vector v of the local map in the robot coordinate system:

v = [x_g, y_g, l_max, β_max, l_min, β_min, l_1, β_1, l_2, β_2, l_3, β_3, ...]^T    (8).
7. The method for fast global positioning of a lidar in a bounded region according to claim 6, wherein step S4 specifically comprises:
since the map regions are known and the primary feature vector of each map region is unique, the map in which the robot is located can be preliminarily determined by matching the primary feature vectors of the global maps against that of the local map; first, the first-level matching difference between the local map and each global map is computed as Δ_i = |V_i − v|; the differences are compared and the map with the smallest difference, {m : |V_m − v| = min_i |V_i − v|}, is identified; the robot is thereby preliminarily determined to be in the mth map, the localization search range is narrowed, and coarse positioning of the map region in which the robot is located is completed.
8. The method for fast global positioning of a lidar in a bounded region according to claim 7, wherein step S5 specifically comprises:
after step S4 has preliminarily identified, via primary feature vector matching, the mth map as the map region in which the robot is located, constructing the secondary feature vector V′ of that global map and the secondary feature vector v′ of the local map:

V′ = [X_mg, Y_mg, L_mmax, α_mmax, L_mmin, α_mmin]^T    (9)

v′ = [x_g, y_g, l_max, β_max, l_min, β_min]^T    (10)

where X_mg, Y_mg are the coordinates of the boundary barycenter of the mth map in the ground coordinate system; L_mmax, α_mmax are the distance and direction angle from the barycenter to the farthest point of the map; and L_mmin, α_mmin are the distance and direction angle from the barycenter to the nearest point of the map.
9. The method for fast global positioning of a lidar in a bounded region according to claim 7, wherein step S6 specifically comprises:
directly comparing and matching the secondary feature vector V′ of the global map and the secondary feature vector v′ of the local map constructed in step S5, thereby realizing fast global positioning of the robot in the bounded region and reducing the computational load of the algorithm; the output pose of the robot is:

P = [X_r, Y_r, θ]^T    (11)

further, the direction angle of the robot is calculated as:

θ = α_mmax − β_max    (12)

then the barycentric coordinates (X_mg, Y_mg) in the secondary feature vector of the global map and the barycentric coordinates (x_g, y_g) of the local map are substituted into the robot position formula to obtain the robot's position (X_r, Y_r) in the ground coordinate system:

X_r = X_mg − (x_g cos θ − y_g sin θ)
Y_r = Y_mg − (x_g sin θ + y_g cos θ)    (13)

thereby completing the positioning of the robot.
CN202010380244.2A 2020-05-08 2020-05-08 Bounded region rapid global positioning method based on laser radar Active CN111678516B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010380244.2A CN111678516B (en) 2020-05-08 2020-05-08 Bounded region rapid global positioning method based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010380244.2A CN111678516B (en) 2020-05-08 2020-05-08 Bounded region rapid global positioning method based on laser radar

Publications (2)

Publication Number Publication Date
CN111678516A true CN111678516A (en) 2020-09-18
CN111678516B CN111678516B (en) 2021-11-23

Family

ID=72451777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010380244.2A Active CN111678516B (en) 2020-05-08 2020-05-08 Bounded region rapid global positioning method based on laser radar

Country Status (1)

Country Link
CN (1) CN111678516B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112711012A (en) * 2020-12-18 2021-04-27 上海蔚建科技有限公司 Global position initialization method and system of laser radar positioning system
CN112767476A (en) * 2020-12-08 2021-05-07 中国科学院深圳先进技术研究院 Rapid positioning system, method and application
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN114993316A (en) * 2022-05-24 2022-09-02 清华大学深圳国际研究生院 Ground robot autonomous navigation method based on plane fitting and robot

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107179086A (en) * 2017-05-24 2017-09-19 北京数字绿土科技有限公司 A kind of drafting method based on laser radar, apparatus and system
US20180075643A1 (en) * 2015-04-10 2018-03-15 The European Atomic Energy Community (Euratom), Represented By The European Commission Method and device for real-time mapping and localization
CN108445480A (en) * 2018-02-02 2018-08-24 重庆邮电大学 Mobile platform based on laser radar adaptively extends Target Tracking System and method
CN108732584A (en) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 Method and apparatus for updating map
CN108917759A (en) * 2018-04-19 2018-11-30 电子科技大学 Mobile robot pose correct algorithm based on multi-level map match
US20190066330A1 (en) * 2017-08-23 2019-02-28 TuSimple Feature extraction from 3d submap and global map system and method for centimeter precision localization using camera-based submap and lidar-based global map
US20190066344A1 (en) * 2017-08-23 2019-02-28 TuSimple 3d submap reconstruction system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN110187348A (en) * 2019-05-09 2019-08-30 盈科视控(北京)科技有限公司 A kind of method of laser radar positioning
CN110207710A (en) * 2019-06-26 2019-09-06 北京小狗智能机器人技术有限公司 Robot method for relocating and device
CN110221603A (en) * 2019-05-13 2019-09-10 浙江大学 A kind of long-distance barrier object detecting method based on the fusion of laser radar multiframe point cloud

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180075643A1 (en) * 2015-04-10 2018-03-15 The European Atomic Energy Community (Euratom), Represented By The European Commission Method and device for real-time mapping and localization
CN108732584A (en) * 2017-04-17 2018-11-02 百度在线网络技术(北京)有限公司 Method and apparatus for updating map
CN107179086A (en) * 2017-05-24 2017-09-19 北京数字绿土科技有限公司 A kind of drafting method based on laser radar, apparatus and system
US20190066330A1 (en) * 2017-08-23 2019-02-28 TuSimple Feature extraction from 3d submap and global map system and method for centimeter precision localization using camera-based submap and lidar-based global map
US20190066344A1 (en) * 2017-08-23 2019-02-28 TuSimple 3d submap reconstruction system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN108445480A (en) * 2018-02-02 2018-08-24 重庆邮电大学 Mobile platform based on laser radar adaptively extends Target Tracking System and method
CN108917759A (en) * 2018-04-19 2018-11-30 电子科技大学 Mobile robot pose correct algorithm based on multi-level map match
CN110187348A (en) * 2019-05-09 2019-08-30 盈科视控(北京)科技有限公司 A kind of method of laser radar positioning
CN110221603A (en) * 2019-05-13 2019-09-10 浙江大学 A kind of long-distance barrier object detecting method based on the fusion of laser radar multiframe point cloud
CN110207710A (en) * 2019-06-26 2019-09-06 北京小狗智能机器人技术有限公司 Robot method for relocating and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG Yuanhua et al., "Mobile robot localization and map building based on laser radar", Microcomputer Information *
TONG Yi, "Target reflection feature extraction based on lidar intensity values", Laser & Optoelectronics Progress *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767476A (en) * 2020-12-08 2021-05-07 中国科学院深圳先进技术研究院 Rapid positioning system, method and application
CN112767476B (en) * 2020-12-08 2024-04-26 中国科学院深圳先进技术研究院 Rapid positioning system, method and application
CN112711012A (en) * 2020-12-18 2021-04-27 上海蔚建科技有限公司 Global position initialization method and system of laser radar positioning system
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN114993316A (en) * 2022-05-24 2022-09-02 清华大学深圳国际研究生院 Ground robot autonomous navigation method based on plane fitting and robot
CN114993316B (en) * 2022-05-24 2024-05-31 清华大学深圳国际研究生院 Ground robot autonomous navigation method based on plane fitting and robot

Also Published As

Publication number Publication date
CN111678516B (en) 2021-11-23

Similar Documents

Publication Publication Date Title
CN111678516B (en) Bounded region rapid global positioning method based on laser radar
Wang et al. Intensity scan context: Coding intensity and geometry relations for loop closure detection
CN102426019B (en) Unmanned aerial vehicle scene matching auxiliary navigation method and system
Lenac et al. Fast planar surface 3D SLAM using LIDAR
CN110645974A (en) Mobile robot indoor map construction method fusing multiple sensors
CN103424112B (en) A kind of motion carrier vision navigation method auxiliary based on laser plane
Zhou et al. T-LOAM: Truncated least squares LiDAR-only odometry and mapping in real time
CN113936198A (en) Low-beam laser radar and camera fusion method, storage medium and device
Meng et al. Efficient and reliable LiDAR-based global localization of mobile robots using multiscale/resolution maps
CN110986956A (en) Autonomous learning global positioning method based on improved Monte Carlo algorithm
CN114088081B (en) Map construction method for accurate positioning based on multistage joint optimization
WO2021082380A1 (en) Laser radar-based pallet recognition method and system, and electronic device
CN110906924A (en) Positioning initialization method and device, positioning method and device and mobile device
Guo et al. E-LOAM: LiDAR odometry and mapping with expanded local structural information
Cao et al. Accurate localization of autonomous vehicles based on pattern matching and graph-based optimization in urban environments
CN113759928B (en) Mobile robot high-precision positioning method for complex large-scale indoor scene
CN116758153A (en) Multi-factor graph-based back-end optimization method for accurate pose acquisition of robot
Liao et al. SE-Calib: Semantic Edge-Based LiDAR–Camera Boresight Online Calibration in Urban Scenes
Nielsen et al. Survey on 2d lidar feature extraction for underground mine usage
CN113902828A (en) Construction method of indoor two-dimensional semantic map with corner as key feature
Yin et al. A failure detection method for 3D LiDAR based localization
US8798957B2 (en) Method for the computer-aided calculation of the movement of an object using sensor data
Meng et al. Accurate LiDAR-based localization in glass-walled environment
Kim et al. Automatic multiple LiDAR calibration based on the plane features of structured environments
Xue et al. Real-time 3D grid map building for autonomous driving in dynamic environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant