CN110111374B - Laser point cloud matching method based on grouped stepped threshold judgment - Google Patents

Laser point cloud matching method based on grouped stepped threshold judgment Download PDF

Info

Publication number
CN110111374B
CN110111374B · Application CN201910355885.XA · Application publication CN201910355885A
Authority
CN
China
Prior art keywords
cloud data
point cloud
point
matching
subgroup
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910355885.XA
Other languages
Chinese (zh)
Other versions
CN110111374A (en)
Inventor
章弘凯
范光宇
周圣杰
陈年生
徐圣佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Dianji University
Original Assignee
Shanghai Dianji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Dianji University filed Critical Shanghai Dianji University
Priority to CN201910355885.XA priority Critical patent/CN110111374B/en
Publication of CN110111374A publication Critical patent/CN110111374A/en
Application granted granted Critical
Publication of CN110111374B publication Critical patent/CN110111374B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems; G01S17/02 using reflection of electromagnetic waves other than radio waves; G01S17/06 systems determining position data of a target
    • G01S7/48 Details of systems according to group G01S17/00
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/60 Analysis of geometric attributes; G06T7/66 of image moments or centre of gravity
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Abstract

The invention provides a laser point cloud matching method based on grouped stepped threshold judgment, which comprises the following steps. S1: acquire two consecutive frames of point cloud data, M and N, from the laser radar at the current moment. S2: divide M and N respectively into a fixed number of first cloud data subgroups and second cloud data subgroups. S3: perform iterative closest point matching on each point cloud. S4: judge whether the matching rate of each first cloud data subgroup and the corresponding second cloud data subgroup is greater than a first preset threshold; if so, that subgroup pair is matched successfully and the subsequent steps continue, otherwise the matching fails. S5: judge whether the matching group rate of the successfully matched subgroups of M and N is greater than a second preset threshold; if so, the matching is successful, otherwise the subsequent steps continue. S6: acquire the two frames of point cloud data of the laser radar at the next moment as the new M and N, and return to step S2. The laser point cloud matching method based on grouped stepped threshold judgment can reduce the computation of the algorithm without reducing positioning accuracy.

Description

Laser point cloud matching method based on grouped stepped threshold judgment
Technical Field
The invention relates to the field of robot navigation, and in particular to a laser point cloud matching method based on grouped stepped threshold judgment.
Background
At present, robot positioning technology is widely applied in fields such as park inspection, warehousing, and transportation. Autonomous positioning and navigation allows robots to take over part of these operations from human workers, which makes robot positioning and navigation a current research hotspot.
During navigation, the robot scans its surroundings with a laser radar and localizes itself from the scan. The difficulty of robot positioning lies in identifying and successfully matching surrounding obstacles. For example, the laser radar scans the same obstacle at different times and from different positions, so two or more groups of point clouds must be matched before an obstacle in the environment is successfully matched. In the prior art, the Iterative Closest Point (ICP) algorithm, a matching method for point sets, is commonly used to scan and match obstacles in the surrounding environment. The iterative closest point method rotates and translates two groups of point clouds scanned from the same obstacle until they maximally overlap, thereby completing the match.
In iterative closest point matching, a closest-neighbour search must be performed for every point of a group of point clouds, which is computationally expensive and may run into local optima. Moreover, when finding corresponding points, the conventional ICP algorithm takes the point with the smallest Euclidean distance as the corresponding point, and this assumption produces a certain number of wrong correspondences. Because of this computational cost, real-time obstacle matching by the robot is slow, the positioning effect is poor, and the obstacle avoidance function cannot be performed well.
Disclosure of Invention
Aiming at the defects in the prior art, the invention provides a laser point cloud matching method based on grouped stepped threshold judgment, which can reduce the computation of the algorithm without reducing positioning accuracy.
To achieve the above object, the present invention provides a laser point cloud matching method based on grouped stepped threshold judgment, comprising the steps of:
S1: acquiring first point cloud data M of a previous frame and second point cloud data N of a following frame from the laser radar at the current moment;
S2: dividing the current first point cloud data M and second point cloud data N respectively into a fixed number of first cloud data subgroups and second cloud data subgroups, each subgroup containing a plurality of point clouds;
S3: performing iterative closest point matching on each point cloud of the current first point cloud data M and second point cloud data N;
S4: judging whether the matching rate of each first cloud data subgroup and the corresponding second cloud data subgroup is greater than a first preset threshold value; if so, the first cloud data subgroup and the second cloud data subgroup are matched successfully and the subsequent steps continue; otherwise the matching fails and the current first cloud data subgroup and second cloud data subgroup are discarded;
S5: judging whether the matching group rate of the successfully matched first point cloud data M and second point cloud data N is greater than a second preset threshold value; if so, the matching is successful and the procedure ends; otherwise the subsequent steps continue;
S6: acquiring the two frames of point cloud data of the laser radar at the next moment as the new first point cloud data M and second point cloud data N, and returning to step S2.
Preferably, the step of S3 further comprises the steps of:
s31: calculating the centroids of the ith subgroups of the current first point cloud data M and the second point cloud data N by using formula (1):

μ_m = (1/D) Σ_{j=1}^{D} m_ij ,   μ_n = (1/D) Σ_{j=1}^{D} n_ij        (1)

wherein μ_m represents the centroid of the ith group of point clouds in the first point cloud data M; μ_n represents the centroid of the ith group of point clouds in the second point cloud data N; D represents the actual number of points contained in each of the first and second cloud data subgroups; m_ij represents the jth point cloud of the ith group in the first point cloud data M; n_ij represents the jth point cloud of the ith group in the second point cloud data N; i and j represent natural numbers greater than zero;
s32: subtracting the corresponding centroid from each point cloud to obtain the updated first point cloud data M′ and second point cloud data N′;
s33: calculating a first transformation matrix U and a second transformation matrix V from the updated first point cloud data M′ and second point cloud data N′ by using formula (2):

W = Σ_{j=1}^{D} m′_ij · n′_ij^T ,   W = U · diag(σ_1, σ_2, σ_3) · V^T        (2)

wherein W represents the matrix to be decomposed by singular value decomposition; m′_ij represents the jth point cloud of the ith group of the updated first point cloud data M′; n′_ij represents the jth point cloud of the ith group of the updated second point cloud data N′; T represents transposition; σ_1, σ_2 and σ_3 represent the singular values of the decomposed matrix W;
s34: when rank(W) = 3, the solution for the first transformation matrix U and the second transformation matrix V is unique;
s35: calculating the rotation transformation matrix R and the translation matrix T′ by using formula (3):

R = V · U^T ,   T′ = μ_n − R · μ_m        (3)

s36: calculating N″_ij by applying the transformation matrix R and the translation matrix T′ to the corresponding point cloud of M′, where N″_ij represents the theoretical value of the jth point cloud of the ith group of the updated second point cloud data N′.
Preferably, the step of S4 further comprises the steps of:
s41: calculating the distance d(i) between the point cloud M′_ij of the updated first point cloud data M′ and the theoretical value N″_ij;
s42: judging, by using formula (4), whether the point clouds at corresponding positions of the first point cloud data subgroup and the second point cloud data N are matched successfully:

p(i) = 1 if d(i) < E, and p(i) = 0 otherwise        (4)

wherein p(i) represents the matching coefficient; d(i) represents the distance between M′_ij and N″_ij; E represents a preset distance threshold; p(i) = 0 indicates that the match fails; p(i) = 1 indicates that the pair of point clouds at the current corresponding position is matched successfully, and the number of successful point cloud matches is recorded;
s43: calculating the matching rate of the current first cloud data subgroup against the second point cloud data N, the matching rate being equal to the number of successful point cloud matches in the current first cloud data subgroup divided by the total number of point clouds in that subgroup;
s44: judging whether the matching rate is greater than the first preset threshold value; if so, outputting the current first point cloud data M and second point cloud data N and continuing the subsequent steps, otherwise discarding them.
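Steps s41 to s44 reduce to thresholding per-point distances and then thresholding the resulting rate. A minimal sketch in plain Python (function names and the example numbers are illustrative, not from the patent):

```python
def matching_coefficient(d, E):
    """Formula (4): p(i) = 1 when the distance d(i) is below threshold E."""
    return 1 if d < E else 0

def subgroup_matched(distances, E, zeta):
    """s43/s44: the matching rate is the count of qualified point matches
    divided by the total number of point clouds in the subgroup; the
    subgroup pair passes when the rate exceeds the first threshold zeta."""
    rate = sum(matching_coefficient(d, E) for d in distances) / len(distances)
    return rate > zeta
```

For example, with E = 0.005 m (the 5 mm used in the patent's experiments) and distances [0.001, 0.002, 0.010, 0.003], the matching rate is 3/4.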
Preferably, the matching group rate is equal to the number of subgroup pairs for which the first cloud data subgroup and the second cloud data subgroup are matched successfully, divided by the fixed number of subgroups.
Preferably, when a first cloud data subgroup and a second cloud data subgroup are matched successfully, the method further comprises: outputting the current first cloud data subgroup and second cloud data subgroup, and recording the number of successfully matched subgroup pairs.
Owing to the above technical scheme, the invention has the following beneficial effects:
The method divides the first point cloud data M and the second point cloud data N evenly into k groups, head to tail, in scanning-time order, performs nearest-neighbour matching between each corresponding pair of groups of M and N, and matches every point cloud within each group. By matching the laser radar point clouds group against group, the computation of the algorithm is reduced without reducing positioning accuracy, and when the method is applied to robot path planning it can effectively improve the robot's obstacle avoidance.
Drawings
Fig. 1 is a flowchart of a laser point cloud matching method based on a grouped stepwise threshold determination according to an embodiment of the present invention.
Detailed Description
A preferred embodiment of the present invention is described below with reference to FIG. 1 to enable a better understanding of its functions and features.
Referring to fig. 1, a laser point cloud matching method based on a grouped stepwise threshold determination according to an embodiment of the present invention includes:
s1: and acquiring first point cloud data M of a previous frame and second point cloud data N of a next frame of the laser radar at the current moment.
S2: respectively dividing the current first point cloud data M and the current second point cloud data N into a fixed number of first cloud data subgroups and second cloud data subgroups; each subgroup contains a plurality of point clouds.
S3: and performing iterative closest point matching on each point cloud of the current first point cloud data M and the current second point cloud data N.
Wherein the step of S3 further comprises the steps of:
s31: calculating the centroids of the ith subgroups of the current first point cloud data M and the current second point cloud data N by using formula (1):

μ_m = (1/D) Σ_{j=1}^{D} m_ij ,   μ_n = (1/D) Σ_{j=1}^{D} n_ij        (1)

wherein μ_m represents the centroid of the ith group of point clouds in the first point cloud data M; μ_n represents the centroid of the ith group of point clouds in the second point cloud data N; D represents the actual number of points contained in each of the first and second cloud data subgroups; m_ij represents the jth point cloud of the ith group in the first point cloud data M; n_ij represents the jth point cloud of the ith group in the second point cloud data N; i and j represent natural numbers greater than zero;
s32: subtracting the corresponding centroid from each point cloud to obtain updated first point cloud data M′ and updated second point cloud data N′;
s33: calculating a first transformation matrix U and a second transformation matrix V from the updated first point cloud data M′ and second point cloud data N′ by using formula (2):

W = Σ_{j=1}^{D} m′_ij · n′_ij^T ,   W = U · diag(σ_1, σ_2, σ_3) · V^T        (2)

wherein W represents the matrix to be decomposed by singular value decomposition; m′_ij represents the jth point cloud of the ith group of the updated first point cloud data M′; n′_ij represents the jth point cloud of the ith group of the updated second point cloud data N′; T represents transposition; σ_1, σ_2 and σ_3 represent the singular values of the decomposed matrix W;
s34: when rank(W) = 3, the solution for the first transformation matrix U and the second transformation matrix V is unique;
s35: calculating the rotation transformation matrix R and the translation matrix T′ by using formula (3):

R = V · U^T ,   T′ = μ_n − R · μ_m        (3)

s36: calculating N″_ij by applying the transformation matrix R and the translation matrix T′ to the corresponding point cloud of M′, where N″_ij represents the theoretical value of the jth point cloud of the ith group of the updated second point cloud data N′.
S4: judging whether the matching rate of the first cloud data subgroup and the second cloud data subgroup is greater than a first preset threshold value; if so, the first cloud data subgroup and the second cloud data subgroup are matched successfully and the subsequent steps continue; otherwise the matching fails and the current first cloud data subgroup and second cloud data subgroup are discarded.
Wherein the step of S4 further comprises the steps of:
s41: calculating the distance d(i) between the point cloud M′_ij of the updated first point cloud data M′ and the theoretical value N″_ij;
s42: judging, by using formula (4), whether the point clouds at corresponding positions of the current first point cloud data subgroup and the second point cloud data N are matched successfully:

p(i) = 1 if d(i) < E, and p(i) = 0 otherwise        (4)

wherein p(i) represents the matching coefficient; d(i) represents the distance between M′_ij and N″_ij; E represents a preset distance threshold; p(i) = 0 indicates that the match fails; p(i) = 1 indicates that the pair of point clouds at the current corresponding position is matched successfully, and the number of successful point cloud matches is recorded;
s43: calculating the matching rate of the current first cloud data subgroup against the second point cloud data N, the matching rate being equal to the number of successful point cloud matches in the current first cloud data subgroup divided by the total number of point clouds in that subgroup;
s44: judging whether the matching rate is greater than the first preset threshold value; if so, outputting the current first point cloud data M and second point cloud data N and continuing the subsequent steps, otherwise discarding them.
Preferably, the matching group rate is equal to the number of subgroup pairs for which the first cloud data subgroup and the second cloud data subgroup are matched successfully, divided by the fixed number of subgroups.
Preferably, when a first cloud data subgroup and a second cloud data subgroup are matched successfully, the method further comprises: outputting the current first cloud data subgroup and second cloud data subgroup, and recording the number of successfully matched subgroup pairs.
S5: judging whether the matching group rate of the first point cloud data M and the second point cloud data N which are successfully matched is greater than a second preset threshold value or not; if so, the matching is successful, the step is ended, otherwise, the subsequent step is continued;
s6: and acquiring two groups of point cloud data of two frames before and after the laser radar at the next moment, respectively serving as the new first point cloud data M and the new second point cloud data N, and returning to the step S2.
The invention provides a laser point cloud matching method based on grouped stepped threshold judgment. During point cloud matching, the point clouds are matched in real time by a grouped stepped threshold judgment: the two consecutive frames of laser radar point cloud data are input, grouped, and matched group against group. If the matching rate of a group of point clouds meets the requirement, that group is judged to be matched successfully; if the matching group rate of the point clouds meets the requirement, the two point sets are judged to be matched successfully.
For example:
I. Input the two consecutive frames of laser radar point cloud data and group them.
(1.1) First, take the two consecutive frames scanned by the laser radar as point cloud data, denoted as point set M and point set N respectively.
(1.2) Divide the M point set and the N point set into k groups each, denoted M_1, M_2, M_3, …, M_k and N_1, N_2, N_3, …, N_k, with D points in each group.
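Step (1.2) can be sketched as follows (a plain-Python illustration; how a remainder is handled when the scan size is not divisible by k is an assumption, since the text only says the split is equal, head to tail):

```python
def split_into_groups(points, k):
    """Divide one frame of scan points into k groups in scan-time order,
    M_1 ... M_k, each nominally holding D = len(points) // k points; the
    last group absorbs any remainder (an assumption for this sketch)."""
    D = len(points) // k
    groups = [points[i * D:(i + 1) * D] for i in range(k - 1)]
    groups.append(points[(k - 1) * D:])
    return groups
```

Applying the same split to both frames keeps the groups of M and N index-aligned for the group-against-group matching of section II.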
II. Perform ICP matching on each group of point clouds.
(2.1) Perform ICP matching on M_ij and N_ij (M_ij denotes the jth point cloud of the ith group in the M point set). Calculate the centroids of the two point clouds M_ij and N_ij using formula (1):

μ_m = (1/D) Σ_{j=1}^{D} m_ij ,   μ_n = (1/D) Σ_{j=1}^{D} n_ij        (1)

Subtract the corresponding centroid from each of the two point sets to obtain the new point sets M′_ij, N′_ij.
(2.2) Obtain the transformation matrices by singular value decomposition (SVD); U and V are obtained from formula (2):

W = Σ_{j=1}^{D} m′_ij · n′_ij^T ,   W = U · diag(σ_1, σ_2, σ_3) · V^T        (2)

If rank(W) = 3 (the rank of the matrix), the solution is unique, and the rotation transformation matrix R and the translation matrix T′ are determined by formula (3):

R = V · U^T ,   T′ = μ_n − R · μ_m        (3)
(2.3) Obtain N″_ij from R and T′, and evaluate the match between M′_ij and N″_ij by their distance d using formula (4):

p(i) = 1 if d(i) < E, and p(i) = 0 otherwise        (4)

where p(i) judges whether a point match is qualified, and E is a distance threshold determined experimentally; in the experiments E = 5 mm.
(2.4) If a point match does not satisfy the distance threshold, rotate and translate the point cloud and repeat (2.1) to (2.3) iteratively. When the distance between the two point clouds is smaller than the threshold E, the pair is judged to be matched successfully.
III. Apply threshold judgments to the matching rate of each group of point clouds and to the number of matched point cloud groups to decide whether the requirements are met.
(3.1) When the matching rate of the corresponding point cloud groups M_i, N_i reaches the threshold ζ (the matching rate is the number of successfully matched point clouds divided by the total number of point clouds in the group), the iteration stops and the two corresponding groups of point clouds are matched successfully.
(3.2) When the matching group rate of M and N (the number of successfully matched groups divided by the total number of groups) reaches the threshold β, the iteration stops, indicating that the M and N point sets are matched successfully.
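The two stepped judgments of section III reduce to simple rate checks; a minimal plain-Python sketch with illustrative names:

```python
def matching_group_rate(group_results):
    """(3.2): number of successfully matched groups divided by the total
    number of groups."""
    return sum(1 for ok in group_results if ok) / len(group_results)

def frames_matched(group_results, beta):
    """The M and N point sets match when the matching group rate reaches
    the threshold beta."""
    return matching_group_rate(group_results) >= beta
```

For instance, with k = 4 groups of which 3 matched, the matching group rate is 0.75, so the frames match for any β up to 0.75.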
While the present invention has been described in detail and with reference to the embodiments thereof as illustrated in the accompanying drawings, it will be apparent to one skilled in the art that various changes and modifications can be made therein. Therefore, certain details of the embodiments are not to be interpreted as limiting, and the scope of the invention is to be determined by the appended claims.

Claims (4)

1. A laser point cloud matching method based on grouping stepped threshold judgment comprises the following steps:
s1: acquiring first point cloud data M of a previous frame and second point cloud data N of a next frame of a laser radar at the current moment;
s2: respectively dividing the current first point cloud data M and the second point cloud data N into a fixed number of first cloud data subgroups and second cloud data subgroups; the first cloud data subgroups and the second cloud data subgroups each comprise a plurality of point clouds;
s3: performing iterative closest point matching on each point cloud of the current first point cloud data M and the second point cloud data N;
s4: judging whether the matching rate of the first cloud data subgroup and the second cloud data subgroup is greater than a first preset threshold value or not; if so, the first cloud data subgroup and the second cloud data subgroup are successfully matched, the subsequent steps are continued, otherwise, the first cloud data subgroup and the second cloud data subgroup are unsuccessfully matched, and the current first cloud data subgroup and the current second cloud data subgroup are abandoned;
s5: judging whether the matching group rate of the first point cloud data M and the second point cloud data N which are successfully matched is greater than a second preset threshold value or not; if so, the matching is successful, the step is ended, otherwise, the subsequent step is continued;
s6: acquiring two groups of point cloud data of two frames before and after the laser radar at the next moment, respectively serving as the new first point cloud data M and the second point cloud data N, and returning to the step S2;
the step of S3 further includes the steps of:
s31: calculating the centroids of the ith subgroups of the current first point cloud data M and the second point cloud data N by using formula (1):

μ_m = (1/D) Σ_{j=1}^{D} m_ij ,   μ_n = (1/D) Σ_{j=1}^{D} n_ij        (1)

wherein μ_m represents the centroid of the ith group of point clouds in the first point cloud data M; μ_n represents the centroid of the ith group of point clouds in the second point cloud data N; D represents the actual number of points contained in each of the first and second cloud data subgroups; m_ij represents the jth point cloud of the ith group in the first point cloud data M; n_ij represents the jth point cloud of the ith group in the second point cloud data N; i and j represent natural numbers greater than zero;
s32: subtracting the corresponding centroid from each point cloud to obtain the updated first point cloud data M′ and second point cloud data N′;
s33: calculating a first transformation matrix U and a second transformation matrix V from the updated first point cloud data M′ and second point cloud data N′ by using formula (2):

W = Σ_{j=1}^{D} m′_ij · n′_ij^T ,   W = U · diag(σ_1, σ_2, σ_3) · V^T        (2)

wherein W represents the matrix to be decomposed by singular value decomposition; m′_ij represents the jth point cloud of the ith group of the updated first point cloud data M′; n′_ij represents the jth point cloud of the ith group of the updated second point cloud data N′; T represents transposition; σ_1, σ_2 and σ_3 represent the singular values of the decomposed matrix W;
s34: when rank(W) = 3, the solution for the first transformation matrix U and the second transformation matrix V is unique;
s35: calculating the rotation transformation matrix R and the translation matrix T′ by using formula (3):

R = V · U^T ,   T′ = μ_n − R · μ_m        (3)

s36: calculating N″_ij by applying the transformation matrix R and the translation matrix T′ to the corresponding point cloud of M′, where N″_ij represents the theoretical value of the jth point cloud of the ith group of the updated second point cloud data N′.
2. The laser point cloud matching method based on grouped stepwise threshold determination according to claim 1, wherein the step of S4 further comprises the steps of:
s41: calculating the distance d(i) between the point cloud M′_ij of the updated first point cloud data M′ and the theoretical value N″_ij;
s42: judging, by using formula (4), whether the point clouds at corresponding positions of the first point cloud data subgroup and the second point cloud data N are matched successfully:

p(i) = 1 if d(i) < E, and p(i) = 0 otherwise        (4)

wherein p(i) represents the matching coefficient; d(i) represents the distance between M′_ij and N″_ij; E represents a preset distance threshold; p(i) = 0 indicates that the match fails; p(i) = 1 indicates that the pair of point clouds at the current corresponding position is matched successfully, and the number of successful point cloud matches is recorded;
s43: calculating the matching rate of the current first cloud data subgroup against the second point cloud data N, the matching rate being equal to the number of successful point cloud matches in the current first cloud data subgroup divided by the total number of point clouds in that subgroup;
s44: judging whether the matching rate is greater than the first preset threshold value; if so, outputting the current first point cloud data M and second point cloud data N and continuing the subsequent steps, otherwise discarding them.
3. The laser point cloud matching method based on grouped stepped threshold judgment of claim 2, wherein the matching group rate is equal to the number of subgroup pairs for which the first cloud data subgroup and the second cloud data subgroup are matched successfully, divided by the fixed number of subgroups.
4. The laser point cloud matching method based on grouped stepped threshold judgment of claim 3, wherein, when the first cloud data subgroup and the second cloud data subgroup are matched successfully, the method further comprises: outputting the current first cloud data subgroup and second cloud data subgroup, and recording the number of successfully matched subgroup pairs.
CN201910355885.XA 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment Active CN110111374B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910355885.XA CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910355885.XA CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Publications (2)

Publication Number Publication Date
CN110111374A CN110111374A (en) 2019-08-09
CN110111374B true CN110111374B (en) 2020-11-17

Family

ID=67487504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910355885.XA Active CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Country Status (1)

Country Link
CN (1) CN110111374B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111929694B (en) * 2020-10-12 2021-01-26 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN113204030A (en) * 2021-04-13 2021-08-03 珠海市一微半导体有限公司 Multipoint zone constraint repositioning method, chip and robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109459759A (en) * 2018-11-13 2019-03-12 中国科学院合肥物质科学研究院 City Terrain three-dimensional rebuilding method based on quadrotor drone laser radar system
CN109633688A (en) * 2018-12-14 2019-04-16 北京百度网讯科技有限公司 A kind of laser radar obstacle recognition method and device

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011186780A (en) * 2010-03-09 2011-09-22 Sony Corp Information processing apparatus, information processing method, and program
EP2884879B1 (en) * 2012-08-14 2020-01-08 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems
TWI486906B (en) * 2012-12-14 2015-06-01 Univ Nat Central Using Image Classification to Strengthen Image Matching
CN104715469A (en) * 2013-12-13 2015-06-17 联想(北京)有限公司 Data processing method and electronic device
CN104932001B (en) * 2015-07-08 2018-01-30 四川德马克机器人科技有限公司 A kind of 3D nuclear radiation environments in real time rebuild monitoring system
CN105180890B (en) * 2015-07-28 2017-07-21 南京工业大学 Merge the ORIENTATION OF DISCONTINUITY IN ROCK MASS measuring method of laser point cloud and digitized video
US10404962B2 (en) * 2015-09-24 2019-09-03 Intel Corporation Drift correction for camera tracking
CN105678318B (en) * 2015-12-31 2019-03-08 百度在线网络技术(北京)有限公司 The matching process and device of traffic sign
CN105701820B (en) * 2016-01-14 2018-10-30 上海大学 A kind of point cloud registration method based on matching area
CN105913489B (en) * 2016-04-19 2019-04-23 东北大学 A kind of indoor three-dimensional scenic reconstructing method using plane characteristic
CN106981081A (en) * 2017-03-06 2017-07-25 电子科技大学 A kind of degree of plainness for wall surface detection method based on extraction of depth information
KR20180106417A (en) * 2017-03-20 2018-10-01 현대자동차주식회사 System and Method for recognizing location of vehicle
CN107491071B (en) * 2017-09-04 2020-10-30 中山大学 Intelligent multi-robot cooperative mapping system and method thereof
CN107861920B (en) * 2017-11-27 2021-11-30 西安电子科技大学 Point cloud data registration method
CN108152831B (en) * 2017-12-06 2020-02-07 中国农业大学 Laser radar obstacle identification method and system
CN108776474B (en) * 2018-05-24 2022-03-15 中山赛伯坦智能科技有限公司 Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN108765487B (en) * 2018-06-04 2022-07-22 百度在线网络技术(北京)有限公司 Method, device, equipment and computer readable storage medium for reconstructing three-dimensional scene
CN108986149A (en) * 2018-07-16 2018-12-11 武汉惟景三维科技有限公司 A kind of point cloud Precision Registration based on adaptive threshold
CN109345620B (en) * 2018-08-13 2022-06-24 浙江大学 Improved object point cloud splicing method for ICP (inductively coupled plasma) to-be-measured object by fusing fast point feature histogram


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A method for object recognition and 3D pose estimation in unstructured environments; Ren Bingyin et al.; Journal of Harbin Institute of Technology; 2019-01-31; Vol. 51, No. 1; full text *

Also Published As

Publication number Publication date
CN110111374A (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN109978955B (en) Efficient marking method combining laser point cloud and image
CN108320329B (en) 3D map creation method based on 3D laser
CN108388896B (en) License plate identification method based on dynamic time sequence convolution neural network
CN104484648B (en) Robot variable visual angle obstacle detection method based on outline identification
CN110866934B (en) Normative coding-based complex point cloud segmentation method and system
CN110111374B (en) Laser point cloud matching method based on grouped stepped threshold judgment
CN110480075B (en) Workpiece curved surface contour compensation system and method based on point cloud data and medium
CN110533726B (en) Laser radar scene three-dimensional attitude point normal vector estimation correction method
CN104166989B (en) Rapid ICP method for two-dimensional laser radar point cloud matching
CN110136177B (en) Image registration method, device and storage medium
CN114862932B (en) BIM global positioning-based pose correction method and motion distortion correction method
CN110570449A (en) positioning and mapping method based on millimeter wave radar and visual SLAM
CN111368766A (en) Cattle face detection and identification method based on deep learning
CN112762937B (en) 2D laser sequence point cloud registration method based on occupied grids
CN108845303B (en) Nonlinear robust subspace true and false target feature extraction method
CN110738687A (en) Object tracking method, device, equipment and storage medium
Wang et al. Accurate mix-norm-based scan matching
CN110930444B (en) Point cloud matching method, medium, terminal and device based on bilateral optimization
CN104091148A (en) Facial feature point positioning method and device
CN107123138A (en) Based on vanilla R points to rejecting tactful point cloud registration algorithm
CN112697158A (en) Man-made loop-back instant positioning and picture building method and system for indoor and outdoor scenes
CN108629371B (en) Data dimension reduction method for two-dimensional time-frequency data
CN113406658A (en) Mobile robot positioning method based on point-line characteristic scanning matching
CN111275748A (en) Point cloud registration method based on laser radar in dynamic environment
CN112381952B (en) Face contour point cloud model reconstruction method and device based on multiple cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant