CN113033564B - High-precision and robust 2D laser point cloud matching method - Google Patents


Info

Publication number
CN113033564B
CN113033564B (application CN202110193224.9A)
Authority
CN
China
Prior art keywords
laser
point cloud
point
matching
advancing direction
Prior art date
Legal status
Active
Application number
CN202110193224.9A
Other languages
Chinese (zh)
Other versions
CN113033564A
Inventor
姜跃君
张启富
Current Assignee
Eoslift Logistics Technology Shanghai Co Ltd
Original Assignee
Eoslift Logistics Technology Shanghai Co Ltd
Application filed by Eoslift Logistics Technology Shanghai Co Ltd
Priority to CN202110193224.9A
Publication of CN113033564A
Application granted
Publication of CN113033564B

Links

Images

Classifications

    • G06V 10/267 Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06F 18/22 Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06T 3/60 Geometric image transformations in the plane of the image; rotation of whole images or parts thereof
    • G06T 7/74 Image analysis; determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The invention relates to a laser point cloud matching method, in particular to a high-precision and robust 2D laser point cloud matching method. The method converts the original laser point cloud into the polar coordinate system of the laser coordinate system and segments the point cloud into parts parallel to and perpendicular to the laser advancing direction; counts the number of points in each feature and evaluates point cloud weights according to each feature's degree of influence; coarsely matches the point cloud, estimates a preliminary angle rotation amount, and verifies whether that rotation amount is usable; if it is usable, performs fine matching and optimization of the point cloud orientation features, otherwise initializes the rotation amount; and finally evaluates the matching quality. The technical scheme provided by the invention effectively overcomes the defects of low point cloud matching precision, poor reliability and lack of effective reliability evaluation in the prior art.

Description

High-precision and robust 2D laser point cloud matching method
Technical Field
The invention relates to a laser point cloud matching method, in particular to a high-precision and robust 2D laser point cloud matching method.
Background
Point cloud matching is a key technique in simultaneous localization and mapping (SLAM). In theory, if matching were error-free, an accurate pose could be estimated from the change of the laser point cloud alone. The actual environment is complex, however; because of laser noise, algorithm inaccuracy and other factors, laser point cloud matching accumulates error, and an outright matching failure at some moment is quite possible, causing localization failure or pose jumps.
The corridor problem is the hardest to solve in 2D laser point cloud matching. In a strict corridor environment, laser matching alone is theoretically unsolvable, and other auxiliary sensors are needed for measurement and correction. There are also corridor-like environments that are not strict corridors, where the number of points in one direction is too small; if the other direction is close to symmetric, matching easily fails or converges to a wrong transformation. The precision and robustness of the matching algorithm therefore need to be optimized so that the 2D laser matching algorithm has general environmental adaptability and precision. At the same time, the reliability of a match must also be known, so a post-match reliability evaluation is required, and the error state of the currently estimated pose is recorded.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects in the prior art, the invention provides a high-precision and robust 2D laser point cloud matching method, which can effectively overcome the defects of low point cloud matching precision, poor reliability and lack of effective reliability evaluation in the prior art.
(II) technical scheme
In order to achieve the purpose, the invention is realized by the following technical scheme:
a high-precision and robust 2D laser point cloud matching method comprises the following steps:
s1, converting original laser point clouds to a polar coordinate system of a laser coordinate system, and dividing the point clouds parallel to the laser advancing direction and perpendicular to the laser advancing direction;
s2, counting the point cloud number of each feature, and carrying out point cloud weight evaluation according to the feature influence degree;
s3, carrying out rough matching on the point cloud, estimating the initial angle rotation amount, and verifying whether the initial angle rotation amount is available;
s4, if the initial angle rotation amount is available, performing point cloud orientation feature fine matching and optimization, otherwise initializing the initial angle rotation amount;
and S5, evaluating the matching quality.
Preferably, the segmentation of the point cloud perpendicular to the laser traveling direction and parallel to the laser traveling direction in S1 includes:
calculating the distances from the radar to the point clouds parallel to and perpendicular to the laser advancing direction, and searching the point clouds within a distance threshold; the qualifying point features in the direction parallel to the laser advancing direction are obtained as feature A1 and feature A2, the qualifying point features in the direction perpendicular to the laser advancing direction as feature B1 and feature B2, and the point features of the remaining point clouds are recorded as feature C.
Preferably, the Index values of the point clouds parallel to the laser traveling direction include Index1 and Index2, and the Index values of the point clouds perpendicular to the laser traveling direction include Index3 and Index4, which are calculated according to the following formula:
Index1=(θw-π/2)*180/π/(360/2400);
Index2=(θw+π/2)*180/π/(360/2400);
Index3=(θw)*180/π/(360/2400);
Index4=(θw+π)*180/π/(360/2400);
where Index1 is the index value of the point cloud on the right side parallel to the laser advancing direction, Index2 the index value of the point cloud on the left side parallel to the laser advancing direction, Index3 the index value of the point cloud on the upper side perpendicular to the laser advancing direction, and Index4 the index value of the point cloud on the lower side perpendicular to the laser advancing direction.
Preferably, the step S2 of counting the number of point clouds of each feature and performing point cloud weight evaluation according to the influence degree of the features includes:
the number of point clouds parallel to the laser advancing direction is numA (features A1 + A2), the number perpendicular to the laser advancing direction is numB (features B1 + B2), and the number of point clouds with unclassified features is numC; the point cloud weights are then set as:
Wa : Wb : Wc = 1 : (k*numA) : numB;
where Wa is the point cloud weight of features A1 and A2, Wb the point cloud weight of features B1 and B2, Wc the point cloud weight of feature C, and k is adjusted according to actual needs;
features A1 and A2 are point clouds parallel to the laser advancing direction, so their translation weight should be lowered; features B1 and B2 are point clouds perpendicular to the laser advancing direction, so their translation weight should be raised; feature C holds the unclassified points and needs no additional weighting.
Preferably, the step of performing rough matching on the point cloud in S3, estimating a preliminary angle rotation amount, and verifying whether the preliminary angle rotation amount is usable includes the following steps:
s31, preliminarily extracting reference frame laser and matching frame laser according to a distance threshold, and reserving three clustering point clouds with the largest number of points;
s32, extracting and cutting a straight corner point of the clustered point cloud, and then performing RANSAC fitting after cutting to obtain three straight line characteristics line1, line2 and line3 and corresponding straight line slopes k1, k2 and k3 respectively;
s33, carrying out rotation rough estimation on the preliminary angle rotation amount;
s34, point clouds in the linear features line1, line2 and line3 of the matched frame laser are rotated to the position under the reference frame laser by utilizing the initial angle rotation amount, and adjacent points are searched in the point clouds of the linear features line1, line2 and line3 of the reference frame laser;
and S35, if the number of adjacent points is approximately equal to the total number of points in the straight-line features, the preliminary angle rotation amount is available; otherwise it is unavailable.
Preferably, the fine matching and optimization of the point cloud orientation features in S4 includes:
the matching between laser frames is solved by adopting a ceres mode, the fast searching mode under a polar coordinate system is selected for searching adjacent points, the distance between a point and a straight line is adopted for minimum distance calculation, and an optimized error formula is as follows:
Figure BDA0002945974070000041
bringing all the corresponding point data into cerees for solving;
where J is the cost function, W i Is a corresponding weight, p i Is p j1 And p j2 ,n i Is corresponding toPoint p j1 And p j2 Normal vector of connecting line, t k+1 Is the translation vector, R (θ) k+1 ) Is the rotation matrix and θ is the rotation angle.
Preferably, the initial estimate of the rotation angle θ is the preliminary angle rotation amount, and if there is no initial estimate, θ defaults to 0;
the initially estimated rotation amount is corrected, the point cloud of the matching frame is transformed into the reference frame to obtain the corresponding point clouds in the polar and Cartesian coordinate systems, adjacent points are then searched to obtain the set corr of corresponding points, the point cloud feature that each corresponding point's index value falls in is determined, and the weight W_i of that point cloud feature is configured.
Preferably, the evaluating of the matching quality in S5 includes:
calculating the pose transformation from the Ceres result and the coordinates, in the reference-frame coordinate system, of the corresponding point sets of the matching frame; the distance between a pair of corresponding points is:
dis = sqrt( (x_r − x_m)^2 + (y_r − y_m)^2 );
calculating the Jacobian matrix J of this error with respect to the corresponding point coordinates;
the general calculation formula is:
cov = Σ J * cov(Z) * J^T;
circularly traversing all the corresponding points and accumulating yields the final covariance matrix cov;
where Z is the coordinate value (jx, jy) of the corresponding point, and cov(Z) is configured according to the noise characteristics of the actual laser and may be set as a diagonal matrix of variances.
(III) advantageous effects
Compared with the prior art, the high-precision and robust 2D laser point cloud matching method can classify symmetric or sparse point features in the laser point cloud, match using the same feature classes, and guarantee directional compatibility of the error optimization by adding weighting rules. It finally outputs the pose transformation together with its reliability, ensuring the precision and robustness of point cloud matching, and outputs the laser point cloud pose transformation more accurately in corridor-like environments or environments where the point cloud is sparse in some direction.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the prior art descriptions will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
FIG. 1 is a schematic diagram of a laser point cloud matching process according to the present invention;
fig. 2 is a schematic view of point cloud orientation segmentation in a corridor-like environment according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A high-precision and robust 2D laser point cloud matching method is disclosed. As shown in fig. 1 and fig. 2, the original laser point cloud is converted into the polar coordinate system of the laser coordinate system, and the point cloud is segmented into parts parallel to and perpendicular to the laser advancing direction.
The pose of the laser in the world coordinate system is denoted (Xw, Yw, θw), and the original laser point cloud is converted into the polar coordinate system of the laser coordinate system; the point cloud is denoted Cloudw. The laser used by the method scans 360° at a scanning frequency of 35 Hz with 2400 points per scan and a maximum scanning distance of 30 m.
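As a concrete illustration of this sensor model (360° scan, 2400 points per revolution, 30 m maximum range), one scan can be converted into polar and Cartesian points in the laser frame as sketched below; the function name and the assumption that beam 0 lies at angle 0 are illustrative, not from the patent.

```python
import math

NUM_BEAMS = 2400                      # points per 360-degree scan
ANGLE_STEP = 2 * math.pi / NUM_BEAMS  # angular resolution in radians
MAX_RANGE = 30.0                      # maximum scanning distance in metres

def scan_to_points(ranges):
    """Turn a raw scan (one range reading per beam) into polar (r, theta)
    and Cartesian (x, y) points in the laser frame, dropping invalid
    returns and returns beyond the maximum range."""
    polar, cart = [], []
    for i, r in enumerate(ranges):
        if r <= 0.0 or r > MAX_RANGE:
            continue
        theta = i * ANGLE_STEP
        polar.append((r, theta))
        cart.append((r * math.cos(theta), r * math.sin(theta)))
    return polar, cart
```

Keeping both the polar and the Cartesian form matches the method's later use of a fast polar-coordinate neighbour search alongside Cartesian line fitting.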
Wherein, cut apart perpendicular to laser advancing direction and the point cloud that is on a parallel with laser advancing direction, include:
calculating the distances from the radar to the point clouds parallel to and perpendicular to the laser advancing direction, and searching the point clouds within a distance threshold; the qualifying point features in the direction parallel to the laser advancing direction are obtained as feature A1 and feature A2, the qualifying point features in the direction perpendicular to the laser advancing direction as feature B1 and feature B2, and the point features of the remaining point clouds are recorded as feature C.
Index values of point clouds parallel to the laser advancing direction comprise Index1 and Index2, index values of point clouds perpendicular to the laser advancing direction comprise Index3 and Index4, and the Index values are calculated according to the following formula:
Index1=(θw-π/2)*180/π/(360/2400);
Index2=(θw+π/2)*180/π/(360/2400);
Index3=(θw)*180/π/(360/2400);
Index4=(θw+π)*180/π/(360/2400);
as shown in fig. 2, the forward direction of the y-axis is the laser traveling direction, index1 is the Index value of the point cloud on the right side parallel to the laser traveling direction, index2 is the Index value of the point cloud on the left side parallel to the laser traveling direction, index3 is the Index value of the point cloud on the upper side perpendicular to the laser traveling direction, and Index4 is the Index value of the point cloud on the lower side perpendicular to the laser traveling direction.
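The four index formulas above amount to dividing an angle by the angular resolution 360/2400 = 0.15° per beam. A minimal sketch, assuming indices wrap into [0, 2400) (the patent leaves the wrap-around implicit):

```python
import math

BEAMS = 2400
DEG_PER_BEAM = 360.0 / BEAMS   # 0.15 degrees of arc per beam

def direction_indices(theta_w):
    """Beam indices of the four cardinal directions relative to the
    laser heading theta_w (radians), per the patent's Index formulas."""
    def to_index(angle):
        return int(round(math.degrees(angle) / DEG_PER_BEAM)) % BEAMS
    return {
        "index1": to_index(theta_w - math.pi / 2),  # right, parallel to heading
        "index2": to_index(theta_w + math.pi / 2),  # left, parallel to heading
        "index3": to_index(theta_w),                # upper, perpendicular
        "index4": to_index(theta_w + math.pi),      # lower, perpendicular
    }
```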
The distances between the radar and the point clouds at Index1 and Index2 are calculated by looking up the range values in Cloudw to obtain r1 and r2. Starting from Index1 and Index2, Cloudw is searched outward to both sides in sequence, and the deviation distance Δr between each point and its seed range is calculated (e.g. Δr = ri − r1 for the first seed); the distance threshold is set to 0.3 m. Once the deviation exceeds the threshold, the search stops. The qualifying points obtained by searching from Index1 and Index2 form feature A1 and feature A2 respectively; in the same way, the qualifying points computed from Index3 and Index4 form feature B1 and feature B2, and all remaining point clouds are recorded as feature C.
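The outward search with the 0.3 m deviation threshold could look like the following; `grow_feature` is a hypothetical helper name and the wrap-around handling over the 360° scan is an assumption, as the patent gives no code.

```python
def grow_feature(ranges, seed, thresh=0.3):
    """Grow one directional feature outward from a seed beam: walk to
    both sides of the seed index (wrapping around the 360-degree scan)
    and keep beams whose range deviates from the seed range by at most
    `thresh` metres (0.3 m in the patent)."""
    n = len(ranges)
    r0 = ranges[seed]
    picked = {seed}
    for step in (1, -1):
        i = (seed + step) % n
        while i != seed and abs(ranges[i] - r0) <= thresh:
            picked.add(i)
            i = (i + step) % n
    return sorted(picked)
```

Running this once per seed index (Index1 through Index4) yields features A1, A2, B1 and B2; every beam not picked by any seed falls into feature C.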
Counting the point cloud number of each feature, and performing point cloud weight evaluation according to the feature influence degree, wherein the method specifically comprises the following steps:
The number of point clouds parallel to the laser advancing direction is numA (features A1 + A2), the number perpendicular to the laser advancing direction is numB (features B1 + B2), and the number of point clouds with unclassified features is numC; the point cloud weights are then set as:
Wa : Wb : Wc = 1 : (k*numA) : numB;
where Wa is the point cloud weight of features A1 and A2, Wb the point cloud weight of features B1 and B2, Wc the point cloud weight of feature C, and k is adjusted according to actual needs;
features A1 and A2 are point clouds parallel to the laser advancing direction, so their translation weight should be lowered; features B1 and B2 are point clouds perpendicular to the laser advancing direction, so their translation weight should be raised; feature C holds the unclassified points and needs no additional weighting.
In inter-frame laser matching, the rotation estimate is not very sensitive to a corridor-like environment; the factor that most degrades it is an excessively large rotation, so for rotation it is sufficient to use the same weight as point cloud feature C.
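The weight ratio above can be transcribed directly as a small helper; the default gain k = 1.0 is a placeholder, since the patent only says k is tuned to actual needs.

```python
def point_weights(num_a, num_b, k=1.0):
    """Return the patent's weight ratio Wa : Wb : Wc = 1 : k*numA : numB
    as a tuple (Wa, Wb, Wc). num_a counts the points of features A1+A2
    (parallel to travel), num_b those of B1+B2 (perpendicular)."""
    return 1.0, k * num_a, float(num_b)
```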
The method comprises the following steps of performing rough matching on point cloud, estimating the initial angle rotation amount, and verifying whether the initial angle rotation amount is available, wherein the method comprises the following steps:
s31, preliminarily extracting reference frame laser and matching frame laser according to a distance threshold, and reserving three clustering point clouds with the largest number of points;
s32, extracting and cutting the straight corner points of the clustered point clouds, then performing RANSAC fitting after cutting to obtain three straight-line features line1, line2 and line3 with corresponding straight-line slopes k1, k2 and k3 respectively;
s33, carrying out rotation rough estimation on the preliminary angle rotation amount;
s34, rotating the point clouds in the linear features line1, line2 and line3 of the matched frame laser to the reference frame laser by utilizing the initial angle rotation amount, and searching adjacent points in the point clouds of the linear features line1, line2 and line3 of the reference frame laser;
and S35, if the number of adjacent points is approximately equal to the total number of points in the straight-line features, the preliminary angle rotation amount is available; otherwise it is unavailable.
It is noted that the rough matching of the point cloud adopts a clustering algorithm different from the above point cloud feature extraction, and can be performed synchronously with the directional segmentation of the point cloud.
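The coarse rotation estimate of steps S32 and S33 reduces to comparing the fitted line directions of the two frames. A minimal sketch, assuming same-index pairing of the slopes k1..k3 (the patent does not state the pairing rule):

```python
import math

def coarse_rotation(slopes_ref, slopes_match):
    """Rough initial rotation between two frames: the median angular
    difference between the fitted line directions of the reference
    frame and the matching frame."""
    diffs = []
    for k_ref, k_match in zip(slopes_ref, slopes_match):
        d = math.atan(k_ref) - math.atan(k_match)
        # a line's direction is ambiguous by pi, so wrap into (-pi/2, pi/2]
        while d <= -math.pi / 2:
            d += math.pi
        while d > math.pi / 2:
            d -= math.pi
        diffs.append(d)
    diffs.sort()
    return diffs[len(diffs) // 2]
```

Taking the median over the three line pairs keeps one badly fitted line from corrupting the estimate, which step S35's neighbour-count check then verifies.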
And if the initial angle rotation amount is available, performing point cloud orientation feature fine matching and optimization, otherwise initializing the initial angle rotation amount.
Wherein, carry out point cloud orientation characteristic fine matching and optimization, include:
the matching between laser frames is solved by adopting a ceres mode, the fast searching mode under a polar coordinate system is selected for searching adjacent points, the distance between a point and a straight line is adopted for minimum distance calculation, and an optimized error formula is as follows:
Figure BDA0002945974070000081
bringing all the corresponding point data into cerees for solving;
where J is a cost function, W i Is a corresponding weight, p i Is p j1 And p j2 ,n i Is the corresponding point p j1 And p j2 Normal vector of connecting line, t k+1 Is the translation vector, R (θ) k+1 ) Is the rotation matrix and θ is the rotation angle.
When all the corresponding point data are brought into cerees for solving, the iteration times are set to be 20 times, and the solver is set to be an LM solving mode.
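The patent solves this cost with Ceres (LM solver, 20 iterations). As a library-free sketch of the same weighted point-to-line residual n_i · (R(θ)·p_i + t − p_j1), a Gauss-Newton stand-in might look like this; the function names and the convergence tolerance are assumptions.

```python
import math

def _solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    out = []
    for k in range(3):
        Ak = [row[:] for row in A]
        for i in range(3):
            Ak[i][k] = b[i]
        out.append(det(Ak) / d)
    return out

def solve_pose(corr, weights, iters=20):
    """Weighted point-to-line pose estimation.
    corr: list of (p, q, n) where p is a matching-frame point, q a point
    on the matched reference line, and n the line's unit normal.
    Minimises sum w * (n . (R(theta) p + t - q))^2 over (x, y, theta)."""
    x = y = th = 0.0
    for _ in range(iters):
        H = [[0.0] * 3 for _ in range(3)]
        g = [0.0, 0.0, 0.0]
        c, s = math.cos(th), math.sin(th)
        for ((px, py), (qx, qy), (nx, ny)), w in zip(corr, weights):
            # residual of the rotated + translated point against the line
            r = nx * (c * px - s * py + x - qx) + ny * (s * px + c * py + y - qy)
            J = (nx, ny, nx * (-s * px - c * py) + ny * (c * px - s * py))
            for a in range(3):
                g[a] += w * J[a] * r
                for b in range(3):
                    H[a][b] += w * J[a] * J[b]
        step = _solve3(H, [-g[0], -g[1], -g[2]])
        x, y, th = x + step[0], y + step[1], th + step[2]
        if max(abs(d) for d in step) < 1e-10:
            break
    return x, y, th
```

With three or more non-degenerate correspondences the 3x3 normal matrix is invertible, and for the small rotations guaranteed by the coarse initial estimate the update converges in a few iterations.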
The initial estimate of the rotation angle θ is the preliminary angle rotation amount; if there is no initial estimate, θ defaults to 0.
After correcting the initially estimated rotation amount, the point cloud of the matching frame is transformed into the reference frame to obtain the corresponding point clouds in the polar and Cartesian coordinate systems; adjacent points are then searched to obtain the set corr of corresponding points. For each corresponding point found in corr, the point cloud feature that its index value falls in is determined, and the weight W_i of that point cloud feature is configured.
Evaluating the matching quality specifically comprises the following steps:
the pose transformation (Δx, Δy, Δθ) is calculated from the Ceres result, together with the coordinates, in the reference-frame coordinate system, of the corresponding point sets of the matching frame; the distance between a pair of corresponding points is:
dis = sqrt( (x_r − x_m)^2 + (y_r − y_m)^2 );
the Jacobian matrix J of this error is calculated with respect to the corresponding point coordinates;
the overall calculation formula is:
cov = Σ J * cov(Z) * J^T;
circularly traversing all the corresponding points and accumulating yields the final covariance matrix cov;
where Z is the coordinate value (jx, jy) of the corresponding point, and cov(Z) is configured according to the noise characteristics of the actual laser and may be set as a diagonal matrix of variances.
The diagonal of cov gives the variances of the pose components x, y and θ. If these values are small, the point cloud matching between the two frames is good; if a value is large, the match carries a large error and the current matched pose is recorded as unreliable, for use in subsequent optimization.
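The covariance formulas appear only as images in the source; under the common first-order-propagation reading cov = Σ J·cov(Z)·Jᵀ, with cov(Z) a diagonal matrix of laser noise variances, a sketch might be as follows (all variance values and thresholds are placeholders, not from the patent):

```python
def pose_covariance(jacobians, sigma2=(0.0004, 0.0004)):
    """First-order propagation cov = sum_i J_i cov(Z_i) J_i^T, where each
    J_i is the 3x2 Jacobian of the pose error w.r.t. that corresponding
    point's coordinates Z_i = (jx, jy) and cov(Z_i) = diag(sigma2)
    models the laser noise."""
    cov = [[0.0] * 3 for _ in range(3)]
    for J in jacobians:  # J has 3 rows (x, y, theta) and 2 columns (jx, jy)
        for a in range(3):
            for b in range(3):
                cov[a][b] += sum(J[a][k] * sigma2[k] * J[b][k] for k in range(2))
    return cov

def match_reliable(cov, max_var=(0.01, 0.01, 0.003)):
    """Flag a match as reliable when the pose variances on the diagonal
    of cov are all below site-tuned thresholds."""
    return all(cov[i][i] <= max_var[i] for i in range(3))
```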
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. A high-precision and robust 2D laser point cloud matching method, characterized by comprising the following steps:
s1, converting laser original point cloud into a polar coordinate system of a laser coordinate system, and dividing the point cloud parallel to the laser advancing direction and perpendicular to the laser advancing direction;
s2, counting the point cloud number of each feature, and carrying out point cloud weight evaluation according to the feature influence degree;
s3, carrying out rough matching on the point cloud, estimating the initial angle rotation amount, and verifying whether the initial angle rotation amount is available;
s4, if the initial angle rotation amount is available, performing point cloud orientation feature fine matching and optimization, otherwise initializing the initial angle rotation amount;
and S5, evaluating the matching quality.
2. The high-precision and robust 2D laser point cloud matching method according to claim 1, characterized in that the segmentation, in S1, of the point clouds perpendicular to and parallel to the laser advancing direction comprises:
calculating the distances from the radar to the point clouds parallel to and perpendicular to the laser advancing direction, searching the point clouds within a distance threshold, recording the qualifying point features in the direction parallel to the laser advancing direction as feature A1 and feature A2, recording the qualifying point features in the direction perpendicular to the laser advancing direction as feature B1 and feature B2, and recording the point features of the remaining point clouds as feature C.
3. The high-precision and robust 2D laser point cloud matching method according to claim 2, characterized in that: the index values of the point clouds parallel to the laser advancing direction comprise Index1 and Index2, the index values of the point clouds perpendicular to the laser advancing direction comprise Index3 and Index4, and the index values are calculated according to the following formula:
Index1=(θw-π/2)*180/π/(360/2400);
Index2=(θw+π/2)*180/π/(360/2400);
Index3=(θw)*180/π/(360/2400);
Index4=(θw+π)*180/π/(360/2400);
where Index1 is the index value of the point cloud on the right side parallel to the laser advancing direction, Index2 the index value of the point cloud on the left side parallel to the laser advancing direction, Index3 the index value of the point cloud on the upper side perpendicular to the laser advancing direction, and Index4 the index value of the point cloud on the lower side perpendicular to the laser advancing direction.
4. The high-precision and robust 2D laser point cloud matching method according to claim 2, characterized in that counting, in S2, the number of point clouds of each feature and performing point cloud weight evaluation according to the feature influence degree comprises:
the number of point clouds parallel to the laser advancing direction is numA (features A1 + A2), the number perpendicular to the laser advancing direction is numB (features B1 + B2), and the number of point clouds with unclassified features is numC; the point cloud weights are then set as:
Wa : Wb : Wc = 1 : (k*numA) : numB;
where Wa is the point cloud weight of features A1 and A2, Wb the point cloud weight of features B1 and B2, Wc the point cloud weight of feature C, and k is adjusted according to actual needs;
features A1 and A2 are point clouds parallel to the laser advancing direction, so their translation weight should be lowered; features B1 and B2 are point clouds perpendicular to the laser advancing direction, so their translation weight should be raised; feature C holds the unclassified points and needs no additional weighting.
5. The high-precision and robust 2D laser point cloud matching method according to claim 4, characterized in that roughly matching the point cloud in S3, estimating the preliminary angle rotation amount and verifying whether it is available comprises the following steps:
s31, preliminarily extracting reference frame laser and matching frame laser according to a distance threshold, and reserving three clustering point clouds with the largest number of points;
s32, extracting and cutting the straight corner points of the clustered point clouds, then performing RANSAC fitting after cutting to obtain three straight-line features line1, line2 and line3 with corresponding straight-line slopes k1, k2 and k3 respectively;
s33, carrying out rotation rough estimation on the preliminary angle rotation amount;
s34, point clouds in the linear features line1, line2 and line3 of the matched frame laser are rotated to the position under the reference frame laser by utilizing the initial angle rotation amount, and adjacent points are searched in the point clouds of the linear features line1, line2 and line3 of the reference frame laser;
and S35, if the number of the adjacent points is approximately equal to the number of the point clouds in the total straight line feature, using the preliminary angle rotation amount, otherwise, not using the preliminary angle rotation amount.
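Steps S32 to S35 can be sketched as follows. The slope-based rotation estimate and the neighbor-count check use hypothetical function names and thresholds (tol, min_ratio) that the patent does not specify:

```python
import math

# Hedged sketch of S32-S35: derive a coarse rotation from the slopes of a
# fitted line pair, apply it, and accept it only if the rotated points land
# near the reference line points. Thresholds are illustrative.
def rough_rotation(k_ref, k_match):
    """Coarse rotation (rad) aligning a matched-frame line of slope k_match
    with the reference-frame line of slope k_ref."""
    return math.atan(k_ref) - math.atan(k_match)

def rotate(points, theta):
    """Rotate 2D points by theta about the origin."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def rotation_usable(rot_pts, ref_pts, tol=0.1, min_ratio=0.8):
    """S34/S35: keep the estimate only if most rotated points have a
    neighbor (within tol) among the reference line points."""
    hits = sum(1 for p in rot_pts if any(math.dist(p, q) < tol for q in ref_pts))
    return hits >= min_ratio * len(rot_pts)
```

A production version would repeat this for all three line pairs (line1/line2/line3) and compare the total neighbor count against the total line-feature point count, as S35 states.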
6. The high-precision and robust 2D laser point cloud matching method according to claim 5, wherein step S4, fine matching and optimization of the point cloud orientation features, comprises:
solving the matching between laser frames with Ceres, searching for adjacent points with a fast search in the polar coordinate system, computing the minimum distance as a point-to-line distance, and optimizing the error formula:
J = Σ_i W_i [ n_i^T ( R(θ_{k+1}) p_i + t_{k+1} − p_{j1} ) ]^2
all the corresponding point data are brought into Ceres for solving;
where J is the cost function, W_i is the corresponding weight, p_i is the point corresponding to p_{j1} and p_{j2}, n_i is the normal vector of the line connecting the corresponding points p_{j1} and p_{j2}, t_{k+1} is the translation vector, R(θ_{k+1}) is the rotation matrix, and θ is the rotation angle.
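A minimal sketch of evaluating this point-to-line cost for a candidate pose, assuming each correspondence stores a weight, a matched-frame point, and the two reference points p_j1, p_j2 defining the target line. The patent solves the minimization with Ceres; this only evaluates J:

```python
import math

# Hedged sketch of the claim-6 point-to-line cost (evaluation only; the
# patent minimizes it with Ceres). Names are illustrative.
def point_to_line_cost(pose, corr):
    """pose = (tx, ty, theta); corr = list of (w, p, pj1, pj2) with w the
    point weight, p the matched-frame point, pj1/pj2 the two reference-frame
    points defining the target line."""
    tx, ty, th = pose
    c, s = math.cos(th), math.sin(th)
    J = 0.0
    for w, (px, py), (ax, ay), (bx, by) in corr:
        # n_i: unit normal of the line through pj1 and pj2
        dx, dy = bx - ax, by - ay
        norm = math.hypot(dx, dy)
        nx, ny = -dy / norm, dx / norm
        # residual: normal component of R(theta)*p + t - pj1
        rx = c * px - s * py + tx - ax
        ry = s * px + c * py + ty - ay
        J += w * (nx * rx + ny * ry) ** 2
    return J
```

A point lying exactly on its target line contributes zero cost regardless of where it falls along the line, which is what makes this point-to-line form tolerant of sliding along parallel features.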
7. The high-precision and robust 2D laser point cloud matching method according to claim 6, wherein the initial estimate of the rotation angle θ is the preliminary angle rotation amount; if no initial estimate is made, θ defaults to 0;
correcting the initially estimated rotation amount comprises: transforming the point cloud of the matched frame into the reference frame to obtain the point clouds in the corresponding polar and Cartesian coordinate systems, then searching for adjacent points to obtain a set corr of corresponding points, looking up the corresponding points in corr, determining which point cloud feature the index value of each corresponding point belongs to, and configuring its point cloud weight W_i.
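The correspondence search in this claim can be sketched with a brute-force nearest-neighbor lookup. The patent uses a fast polar-coordinate search instead; the function name and return structure here are illustrative:

```python
import math

# Hedged sketch of building the corr set of claim 7: transform the matched
# frame by the initial pose estimate, then pair each point with its nearest
# reference point. Brute force stands in for the patent's fast polar search.
def build_correspondences(match_pts, ref_pts, pose):
    """Returns corr as (match_index, ref_index) pairs."""
    tx, ty, th = pose
    c, s = math.cos(th), math.sin(th)
    corr = []
    for i, (px, py) in enumerate(match_pts):
        qx, qy = c * px - s * py + tx, s * px + c * py + ty
        j = min(range(len(ref_pts)), key=lambda k: math.dist((qx, qy), ref_pts[k]))
        corr.append((i, j))
    return corr
```

The ref_index of each pair is what the claim calls the "index value", used to look up the point's feature class and hence its weight W_i.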
8. The high-precision and robust 2D laser point cloud matching method according to claim 6, wherein step S5, evaluating the matching quality, comprises:
calculating the pose transformation amount from the Ceres result, and calculating the coordinates of the corresponding point sets of the matched frame in the reference frame coordinate system; the distance between corresponding points is:
d_i = R(θ) p_i + t − Z_i
calculating the Jacobian matrix of the errors:
J_i = ∂d_i/∂(t_x, t_y, θ) = [ 1  0  −sinθ·p_{i,x} − cosθ·p_{i,y} ; 0  1  cosθ·p_{i,x} − sinθ·p_{i,y} ]
the general calculation formula is:
cov = ( Σ_i J_i^T cov(Z)^{−1} J_i )^{−1}
traversing all corresponding points in a loop and calculating the final covariance matrix cov;
where Z is the coordinate value (jx, jy) of the corresponding point, and cov(Z) is configured according to the noise characteristics of the actual laser and can be set as a diagonal matrix of squared errors.
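The covariance-evaluation loop can be sketched as follows, assuming the standard first-order propagation in which the information matrix Σ_i J_i^T cov(Z)^{-1} J_i is accumulated over all correspondences and then inverted; the isotropic noise cov(Z) = σ²I and all names are illustrative:

```python
import math

# Hedged sketch of the claim-8 quality evaluation: accumulate the 3x3
# information matrix sum_i Ji^T cov(Z)^-1 Ji for a 2D pose (tx, ty, theta).
# Inverting the returned matrix yields the pose covariance cov.
def pose_information(points, theta, sigma=0.01):
    """points: matched-frame points entering the Jacobian;
    sigma: per-axis laser noise std, giving cov(Z) = sigma^2 * I."""
    c, s = math.cos(theta), math.sin(theta)
    info = [[0.0] * 3 for _ in range(3)]
    w = 1.0 / sigma**2  # scalar weight from the diagonal cov(Z)^-1
    for px, py in points:
        # Jacobian of the residual R(theta)*p + t - Z w.r.t. (tx, ty, theta)
        J = [[1.0, 0.0, -s * px - c * py],
             [0.0, 1.0,  c * px - s * py]]
        for a in range(3):
            for b in range(3):
                info[a][b] += w * (J[0][a] * J[0][b] + J[1][a] * J[1][b])
    return info
```

A large information matrix (small cov after inversion) in the translation direction perpendicular to the dominant walls indicates a well-constrained match, which is what the matching-quality evaluation is after.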
CN202110193224.9A 2021-02-20 2021-02-20 High-precision and robust 2D laser point cloud matching method Active CN113033564B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110193224.9A CN113033564B (en) 2021-02-20 2021-02-20 High-precision and robust 2D laser point cloud matching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110193224.9A CN113033564B (en) 2021-02-20 2021-02-20 High-precision and robust 2D laser point cloud matching method

Publications (2)

Publication Number Publication Date
CN113033564A CN113033564A (en) 2021-06-25
CN113033564B true CN113033564B (en) 2022-10-14

Family

ID=76460857

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110193224.9A Active CN113033564B (en) 2021-02-20 2021-02-20 High-precision and robust 2D laser point cloud matching method

Country Status (1)

Country Link
CN (1) CN113033564B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463894A (en) * 2014-12-26 2015-03-25 Shandong University of Technology Overall registering method for global optimization of multi-view three-dimensional laser point clouds
WO2016185637A1 (en) * 2015-05-20 2016-11-24 Mitsubishi Electric Corporation Point-cloud-image generation device and display system
CN109035207A (en) * 2018-07-03 2018-12-18 University of Electronic Science and Technology of China Scale-adaptive laser point cloud feature detection method
CN109492656A (en) * 2017-09-11 2019-03-19 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for outputting information
CN109781119A (en) * 2017-11-15 2019-05-21 Baidu Online Network Technology (Beijing) Co., Ltd. Laser point cloud localization method and system
CN110208771A (en) * 2019-07-01 2019-09-06 Nanjing Forestry University Point cloud intensity correction method for a mobile two-dimensional laser radar
CN111709981A (en) * 2020-06-22 2020-09-25 Gao Xiaoling Registration method of laser point cloud and simulated image with feature line fusion

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Directionally constrained fully convolutional neural network for airborne LiDAR point cloud classification; Congcong Wen et al.; Elsevier; 2020-02-08; full text *
Plane segmentation and fitting method of point clouds based on improved density clustering algorithm for laser radar; Xiaobin Xu et al.; Elsevier; 2018-11-22; full text *
Vehicle width and orientation estimation based on laser point cloud localization and mapping for a mobile robot; Xue Lianjie et al.; Machinery & Electronics; June 2018; Vol. 36, No. 6; full text *

Also Published As

Publication number Publication date
CN113033564A (en) 2021-06-25

Similar Documents

Publication Publication Date Title
CN106650701B (en) Binocular vision-based obstacle detection method and device in indoor shadow environment
CN112396656B (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
CN112509044A (en) Binocular vision SLAM method based on dotted line feature fusion
CN112484746B (en) Monocular vision auxiliary laser radar odometer method based on ground plane
CN110274598B (en) Robot monocular vision robust positioning estimation method
WO2021082380A1 (en) Laser radar-based pallet recognition method and system, and electronic device
Li et al. Road extraction algorithm based on intrinsic image and vanishing point for unstructured road image
CN111915517A (en) Global positioning method for RGB-D camera in indoor illumination adverse environment
CN105354816B (en) A kind of electronic units fix method and device
CN110942077B (en) Feature line extraction method based on weight local change degree and L1 median optimization
CN114120149A (en) Oblique photogrammetry building feature point extraction method and device, electronic equipment and medium
CN114549549B (en) Dynamic target modeling tracking method based on instance segmentation in dynamic environment
CN116309026A (en) Point cloud registration method and system based on statistical local feature description and matching
WO2018131163A1 (en) Information processing device, database generation device, method, and program, and storage medium
CN113033564B (en) High-precision and robust 2D laser point cloud matching method
CN113947636A (en) Laser SLAM positioning system and method based on deep learning
CN116844124A (en) Three-dimensional object detection frame labeling method, three-dimensional object detection frame labeling device, electronic equipment and storage medium
CN110766728A (en) Combined image feature accurate matching algorithm based on deep learning
CN114608522B (en) Obstacle recognition and distance measurement method based on vision
CN116091603A (en) Box workpiece pose measurement method based on point characteristics
CN114283199B (en) Dynamic scene-oriented dotted line fusion semantic SLAM method
CN113793355B (en) Automatic matching method for top surface center line of unmanned aerial vehicle image railway steel rail
CN116309817A (en) Tray detection and positioning method based on RGB-D camera
CN115457130A (en) Electric vehicle charging port detection and positioning method based on depth key point regression
Kovacs et al. Edge detection in discretized range images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant