CN111145232A - Three-dimensional point cloud automatic registration method based on characteristic information change degree - Google Patents

Three-dimensional point cloud automatic registration method based on characteristic information change degree Download PDF

Info

Publication number
CN111145232A
CN111145232A (application CN201911299529.7A)
Authority
CN
China
Prior art keywords
point
points
point cloud
feature
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911299529.7A
Other languages
Chinese (zh)
Inventor
达飞鹏 (Da Feipeng)
黄源 (Huang Yuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN201911299529.7A
Publication of CN111145232A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/35 Image registration using statistical methods
    • G06T 7/33 Image registration using feature-based methods
    • G06T 7/344 Image registration using feature-based methods involving models
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a three-dimensional point cloud automatic registration method based on the degree of change of feature information. The processing objects are two or more pieces of three-dimensional point cloud data with overlapping parts, and the processing steps are as follows: (1) select feature points according to the degree of change of the local normal vectors of the point cloud; (2) design three feature metrics to describe each obtained feature point; (3) compare the feature descriptions of the feature points under threshold constraints to obtain initial matching point pairs; (4) obtain accurate matching point pairs using a rigid distance constraint, and calculate the initial registration parameters using the quaternion (four-element) method; (5) precisely register the point cloud using a modified ICP (iterative closest point) algorithm. The point cloud can be registered automatically according to these steps; the feature description provided by the invention is simple, highly distinctive and robust, and the registration precision and speed are improved to a certain extent.

Description

Three-dimensional point cloud automatic registration method based on characteristic information change degree
Technical Field
The invention relates to a three-dimensional point cloud automatic registration method based on characteristic information change degree, and belongs to the field of three-dimensional information reconstruction.
Background
Three-dimensional reconstruction of object surfaces has long been an important research topic in the field of machine vision. Point cloud data of a measured object's surface can be acquired quickly by an optical scanner, but because light propagates in straight lines, complete surface data must be acquired through multiple measurements from multiple viewing angles, so the acquired data do not lie in the same coordinate system. To obtain a complete model of the object, the data acquired from each viewing angle must be coordinate-transformed and finally merged into a unified coordinate system; this process is point cloud registration. Point cloud registration technology is widely applied in fields such as robot navigation, reverse engineering, surface shape inspection and virtual reality.
According to the input and output of the depth maps to be registered, existing depth map registration methods can be roughly divided into two categories: coarse registration and fine registration. Coarse registration refers to finding an approximate coordinate transformation without any prior knowledge, so that the depth maps from two viewpoints are unified into the same coordinate system. Since no information about the relative position of the depth maps is available before registration, coarse registration algorithms are usually built around establishing correspondences between elements of the two depth maps that lie at the same position on the model, and many scholars have proposed different methods for this. By the type of corresponding element, these methods can be divided into point-to-point, line-to-line, plane-to-plane and body-to-body correspondences; by matching criterion, into relations based on Euclidean distance and relations based on descriptors; and by search range, into non-feature matching and feature matching. However, due to the diversity and complexity of measurement models, coarse registration algorithms often depend on specific applications, and further research aims to improve their accuracy, efficiency, robustness and applicability. The result of coarse registration typically provides only an approximation of the exact coordinate transformation between the depth maps, so that the overlapping regions of the two depth maps fit to a certain degree. Because the position transformation is only approximate, however, the overlapping regions are difficult to fit accurately; interleaving and layering phenomena often remain, which is unfavorable for the subsequent fusion of depth map data. The position of the depth maps therefore needs to be adjusted further to improve registration accuracy; this process is called fine registration. Fine registration starts from the approximate coordinate transformation obtained by coarse registration and achieves a more accurate registration transformation by iteratively minimizing the distance between corresponding points on the two depth maps; the representative algorithm is the classic ICP algorithm.
Disclosure of Invention
Aiming at the problems of the distinctiveness of point feature descriptions, registration precision and robustness in three-dimensional point cloud automatic registration, the invention provides an automatic registration method based on the degree of change of feature information. In this method, on the basis of solving normal vectors, the point cloud data is simplified by introducing the Sine of Normal (SiN) to extract a feature point set, which reduces the subsequent computation; corresponding point pairs are sought through three geometric features; effective accurate matching point pairs are obtained by combining a rigid distance constraint with the RANSAC algorithm and are used to calculate the initial registration parameters; and finally a second, fine registration is performed with an improved ICP algorithm. The method improves the precision and speed of registration and has high robustness.
The invention adopts the following technical scheme for solving the technical problems:
the invention provides a three-dimensional point cloud automatic registration method based on characteristic information change degree, which comprises the following steps:
step 1, acquiring feature point sets Pt and Qt of two point cloud data P and Q to be registered based on normal vector information of different neighborhood radii, wherein Q is reference point cloud;
step 2, establishing the feature description of each feature point in Pt and Qt;
step 3, based on the characteristic description of Pt and Qt obtained in step 2, finding the most similar point for each point in Pt in Qt as a matching point to obtain an initial matching point pair set;
step 4, combining the initial matching point pair set obtained in the step 3 with the distance constraint condition between points, and using a self-adaptive RANSAC algorithm to obtain an accurate matching point pair set;
step 5, calculating an initial registration parameter by using a four-element method;
and step 6, performing point cloud registration by adopting an improved iterative closest point (ICP) algorithm.
As a further technical solution of the present invention, step 1 specifically comprises:
step 1.1, the feature degree of a point p in the point cloud data is
f(p) = |sin θ|,
where θ is the angle between the normal vectors n_r1(p) and n_r2(p) of p computed with different neighborhood radii r1 and r2, r1 ≠ r2;
Step 1.2, extracting characteristic degree of point cloud data greater than set threshold epsilon1The points of (2) are used as characteristic points of the point cloud data;
step 1.3, feature point extraction is performed on P and Q by the methods of step 1.1 and step 1.2, respectively, and the set of feature points of P is obtained as Pt ═ { Pt ═ Pt1,pt2,pt3,…ptm'Q has a feature point set of Qt ═ Qt1,qt2,qt3,…qtn'Where m 'and n' are the numbers of characteristic points of P and Q, respectively.
As a further technical solution of the present invention, step 2 specifically is:
step 2.1, for the i'-th point pt_i' in Pt, the feature degree of the point is taken as the first feature quantity f1(pt_i');
step 2.2, for pt_i', the points of P lying in the spherical region centered at pt_i' with radius γ are taken as the neighborhood of pt_i'; the neighborhood center of gravity O(pt_i') is computed, and the distance from pt_i' to O(pt_i') is taken as the second feature quantity f2(pt_i') = ||O(pt_i') - pt_i'||;
step 2.3, for pt_i', the inverse cosine of the angle between the line from pt_i' to O(pt_i') and the normal vector of pt_i' is taken as the third feature quantity
f3(pt_i') = arccos( (O(pt_i') - pt_i') · n(pt_i') / ||O(pt_i') - pt_i'|| ),
where n(pt_i') is the unit normal vector at pt_i';
step 2.4, for pt_i', if there is a j'-th point qt_j' in Qt that meets the following three conditions, then qt_j' is the corresponding point of pt_i':
|f1(pt_i') - f1(qt_j')| / (f1(pt_i') + f1(qt_j')) ≤ ε2
|f2(pt_i') - f2(qt_j')| / (f2(pt_i') + f2(qt_j')) ≤ ε3
|f3(pt_i') - f3(qt_j')| / (f3(pt_i') + f3(qt_j')) ≤ ε4
where ε2, ε3 and ε4 are all set thresholds.
As a further technical solution of the present invention, step 3 specifically is:
based on the information obtained in step 2, a corresponding point is found in Qt for each point in Pt as its initial matching point, and the set of initial matching point pairs is:
Matchdots = {(pt_h, qt_h) | pt_h ∈ Pt, qt_h ∈ Qt, h = 1, 2, 3 … num(Matchdots)},
where num(Matchdots) is the number of initial matching point pairs.
As a further technical solution of the present invention, step 4 specifically is:
step 4.1: for point pairs (pt_h, qt_h) and (pt_g, qt_g) in Matchdots, if they satisfy
|dist(pt_h, pt_g) - dist(qt_h, qt_g)| ≤ ε5,
then (pt_h, qt_h) and (pt_g, qt_g) meet the inter-point distance constraint; for each pair (pt_h, qt_h), the number num_g of matching point pairs in Matchdots that meet the inter-point distance constraint with (pt_h, qt_h) is counted, where h = 1, 2, 3 … num(Matchdots), dist(·,·) represents the distance between two points, and ε5 is a set threshold;
step 4.2: selecting the matching point pair corresponding to the larger N values in the calculation result of the step 4.1, and checking whether the N matching point pairs are correct matching point pairs by using a RANSAC algorithm, wherein the specific checking method comprises the following steps:
1) randomly selecting 3 matching point pairs from the N matching point pairs as a sample;
2) setting 3 selected matching point pairs in the step 1) as correct matching point pairs, and calculating a rigid body transformation matrix U;
3) judging whether the remaining N-3 matching point pairs are correctly matched under the rigid body transformation matrix U of step 2): for any pair (s1, s2) among the remaining N-3 matching point pairs, if ||U(s1) - s2|| is less than the predefined threshold value, (s1, s2) is considered a correct matching point pair and marked as an inner point; otherwise (s1, s2) is a wrong matching point pair and marked as an outer point, and all inner points form the consistent set of this sampling; wherein U(s1) denotes the new coordinates of s1 computed through the rigid body transformation matrix;
4) repeating the steps 1) to 3) until the iteration times reaches the upper limit of the sampling times, finally, taking the rigid body transformation matrix corresponding to the consistent set with the maximum number of the internal points obtained by iteration as a correct rigid body transformation matrix, and obtaining the internal points in Matchdots as final accurate matching point pairs according to the correct rigid body transformation matrix, wherein the accurate matching point pair set is as follows:
Matchdots1={(me,m'e)|me∈P,m'e∈Q,e=1,2,3…t}
wherein (m_e, m'_e) is an exactly matched point pair and t is the number of exactly matched point pairs.
As a further technical solution of the present invention, the calculation of the initial registration parameter in step 5 specifically includes:
1): compute the centroids of the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t}:
μ = (1/t) Σ_{e=1}^{t} m_e,  μ' = (1/t) Σ_{e=1}^{t} m'_e,
where μ and μ' are the centroids of the points m_e and m'_e, respectively;
2): translate the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t} relative to their centroids:
s_e = m_e - μ,  s'_e = m'_e - μ',
where s_e and s'_e are the new feature points obtained from m_e and m'_e after translation relative to the centroids;
3): compute the correlation matrix K from the point sets {s_e} and {s'_e}:
K = (1/t) Σ_{e=1}^{t} s_e · s'_e^T;
4): construct the four-dimensional symmetric matrix K_a, whose entries are indexed by b, c = 1, 2, 3, 4, from the elements of the correlation matrix K:
K_a = [ tr(K)      Δ^T
        Δ          K + K^T - tr(K)·I3 ],
where Δ = [K_23 - K_32, K_31 - K_13, K_12 - K_21]^T and I3 is the 3×3 identity matrix;
5): compute the unit eigenvector q of K_a corresponding to its largest eigenvalue:
q = [q0 q1 q2 q3]^T;
6): compute the rotation matrix R0 from q:
R0 = [ q0^2+q1^2-q2^2-q3^2   2(q1q2-q0q3)          2(q1q3+q0q2)
       2(q1q2+q0q3)          q0^2-q1^2+q2^2-q3^2   2(q2q3-q0q1)
       2(q1q3-q0q2)          2(q2q3+q0q1)          q0^2-q1^2-q2^2+q3^2 ];
7): solve for the translation vector T0:
T0 = μ' - R0·μ.
As a further technical solution of the present invention, step 6 specifically is:
step 6.1, using the initial registration parameters R0 and T0 obtained in step 5, each point in the target point cloud P is transformed into the coordinate system of the reference point cloud Q, and the initial target point cloud set in the ICP algorithm after transformation is P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m;
step 6.2, for any point in P0, the center of gravity of the tetrahedron formed by its four Euclidean-nearest points on Q is obtained through the polyhedron center-of-gravity formula; the i-th point of P0 and its corresponding center of gravity constitute the initial closest point pair (p0i, q0i) in the ICP algorithm;
step 6.3, iterative calculation is performed according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s - d_{s+1} < E is met; the final registration parameters obtained are used to register the target point cloud P to obtain an accurate registration result; wherein E is a set threshold,
d_s = (1/m) Σ_{i=1}^{m} ||p_si - q_si||,
(p_(s-1)i, q_(s-1)i) and (p_si, q_si) are the nearest-neighbor point pairs obtained by the method of steps 6.1 and 6.2 from the registration parameters after the (s-1)-th and s-th iterations, respectively; R_s and R_{s+1} are the rotation matrices after the s-th and (s+1)-th iterations; T_s and T_{s+1} are the translation vectors after the s-th and (s+1)-th iterations; P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations; and d_s and d_{s+1} are the average distances between P_s and Q and between P_{s+1} and Q, respectively.
Beneficial effects:
compared with other feature-based registration algorithms which only use a single feature value (such as curvature, gravity center distance and the like) to describe the point cloud, the invention adopts various geometric feature features to describe the point cloud, so that the point identification degree is improved, the anti-noise interference capability of the method is improved, the influence of the point cloud topological structure is small, and the registration precision and speed are improved to a certain extent. Compared with other feature-based registration techniques, the present invention has the following advantages:
firstly, the method selects the characteristic point set based on the change degree of the normal vectors of different neighborhood radiuses of the point cloud instead of using all the points for point pair matching, saves the processing time, removes the points with unobvious relative characteristics, improves the accuracy of point pair matching, and avoids the occurrence of wrong matched point pairs caused by too many similar point matches.
Secondly, various geometric characteristics are designed to carry out characteristic point pair acquisition, the identification degree of characteristic description is improved, the anti-noise interference capability of the characteristics is strong, and the influence of a point cloud topological structure is small, so that the accuracy of point pair matching is effectively improved.
And finally, combining rigid distance constraint and RANSAC algorithm to obtain accurate matching point pairs, and further iteratively solving an accurate registration result by using an improved ICP algorithm, wherein the algorithm is quick and effective.
In conclusion, the method can realize automatic registration of partially overlapped point cloud data, the point cloud does not need any predicted information, manual intervention is not needed in the registration process, the initial registration effect is good, the secondary registration result is more accurate, and the registration requirement of multi-view point cloud data is met.
Drawings
FIG. 1 is a flow chart of the overall process of the present invention.
FIG. 2 is a schematic diagram of the bunny point cloud with normal information in whole and in part.
FIG. 3 is a schematic diagram of bunny point cloud feature point extraction.
FIG. 4 is a schematic diagram of two pairwise view-angle initial matching point-to-connecting lines of the bunny point cloud.
FIG. 5 is a schematic diagram of a connecting line of matching point pairs of two pairwise viewing angles of the bunny point cloud after removing error point pairs.
Fig. 6 is the initial registration result of two view angles of bunny point cloud.
Fig. 7 is a comparison diagram of the initial registration and the accurate registration result of two viewing angles of the bunny point cloud.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. Visual Studio 2017 is selected as the programming tool under a Windows operating system, and registration processing is performed on multi-view point cloud data acquired by three-dimensional measuring equipment. In this example, the bunny point cloud data from Stanford University is used, and a relatively accurate registration result is finally obtained.
FIG. 1 is a flow chart of the overall process of the present invention.
Aiming at the problems of existing feature-based automatic registration algorithms, such as low distinctiveness of feature descriptions, sensitivity to noise and high requirements on the point cloud topological structure, the invention provides a point cloud automatic registration algorithm based on the degree of change of feature information. On the basis of solving normal vectors, the point cloud data is simplified by introducing the Sine of Normal (SiN) to extract a feature point set, which reduces the subsequent computation; corresponding point pairs are sought through three geometric features; accurate matching point pairs are then obtained by applying the RANSAC algorithm under a rigid distance constraint; the initial registration parameters are calculated with the quaternion (four-element) method; and finally the point cloud is accurately registered with an improved ICP algorithm.
The method comprises the following specific implementation steps:
Step 1: feature point sets are obtained in the two point clouds to be matched based on normal vector information at different neighborhood radii, as follows:
Step 1.1: multi-view point cloud data with normal vector information are obtained with an existing three-dimensional scanner, and the point clouds obtained from adjacent views have overlapping parts. For a point p in the point cloud, the normal vectors n_r1(p) and n_r2(p) are computed with different neighborhood radii r1 and r2 (r1 ≠ r2). The feature degree f of the point p is defined as the absolute value of the sine of the angle θ between these normal vectors, |sin θ|:
f(p) = |sin θ|.
Step 1.2: a suitable threshold ε1 is selected, the relatively flat parts of the point cloud are removed, and the points with f > ε1 are retained.
Step 1.3: let the two point cloud data to be registered be P and Q, respectively, where Q is the reference point cloud. The feature point extraction method of steps 1.1 and 1.2 is applied to each point cloud, giving the feature point set of P as Pt = {pt1, pt2, pt3, …, pt_m'} and the feature point set of Q as Qt = {qt1, qt2, qt3, …, qt_n'}, where m' and n' are the numbers of feature points of P and Q, respectively.
FIG. 2 is a schematic diagram of the bunny point cloud with normal information in whole and in part.
FIG. 3 is a schematic diagram of bunny point cloud feature point extraction.
Step 2: establishing the feature description of the feature point set, wherein the method comprises the following steps:
Step 2.1: for the i'-th point pt_i' in the point set Pt, the feature degree of the point is taken as the first feature quantity f1(pt_i') at that point.
Step 2.2: for the i'-th point pt_i' in the point set Pt, the points of P lying in the spherical region centered at pt_i' with radius γ are taken as the neighborhood of pt_i'; the neighborhood center of gravity O(pt_i') is computed, and the distance from pt_i' to O(pt_i') is taken as the second feature quantity of the point, denoted as:
f2(pt_i') = ||O(pt_i') - pt_i'||
Step 2.3: for the i'-th point pt_i' in the point set Pt, the inverse cosine of the angle between the line from pt_i' to O(pt_i') and the normal vector n(pt_i') is taken as the third feature quantity of the point, denoted as:
f3(pt_i') = arccos( (O(pt_i') - pt_i') · n(pt_i') / ||O(pt_i') - pt_i'|| )
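A minimal sketch of steps 2.1 to 2.3 follows, assuming scanner-supplied unit normals and a precomputed feature degree; the helper name, the value of γ and the use of scipy's cKDTree are illustrative assumptions, and Pt_normals and Pt_f1 in the usage comment are assumed arrays of the feature points' normals and feature degrees.

import numpy as np
from scipy.spatial import cKDTree

def describe_feature_point(pt, n_pt, f1_pt, cloud, tree, gamma):
    """Return (f1, f2, f3) for one feature point pt of P.
    pt: the feature point, n_pt: its unit normal, f1_pt: its feature degree,
    cloud: the full point cloud P, tree: a cKDTree built on cloud, gamma: sphere radius."""
    idx = tree.query_ball_point(pt, gamma)       # gamma-ball neighborhood of pt in P
    O = cloud[idx].mean(axis=0)                  # neighborhood center of gravity O(pt)
    d = O - pt
    f2 = np.linalg.norm(d)                       # f2 = ||O(pt) - pt||
    cos_a = np.dot(d / (f2 + 1e-12), n_pt)       # cosine of the angle between (O - pt) and the normal
    f3 = np.arccos(np.clip(cos_a, -1.0, 1.0))    # f3 = inverse cosine of that cosine
    return f1_pt, f2, f3

# Hypothetical usage for all feature points of P (the gamma value is an example):
# treeP = cKDTree(P)
# descP = np.array([describe_feature_point(pt, nrm, f1, P, treeP, gamma=0.01)
#                   for pt, nrm, f1 in zip(Pt, Pt_normals, Pt_f1)])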
Step 2.4: for the i'-th point pt_i' in the point set Pt, a corresponding point is searched for in the point set Qt. If qt_j' is the corresponding point of pt_i', the three geometric feature quantities proposed above should be the same or approximately the same for the two points. Therefore, pt_i' in Pt and qt_j' in Qt are considered to correspond as long as the following three conditions are satisfied:
|f1(pti′)-f1(qtj′)|/(f1(pti′)+f1(qtj′))≤ε2
|f2(pti′)-f2(qtj′)|/(f2(pti′)+f2(qtj′))≤ε3
|f3(pti′)-f3(qtj′)|/(f3(pti′)+f3(qtj′))≤ε4
where ε2 = ε3 = ε4 = 0.01 is taken; the three relations are used for filtering and for initially establishing the point correspondences.
Step 3: based on the information obtained in step 2, a corresponding point is found in Qt for each point in Pt as its initial matching point, and the set of initial matching point pairs is:
Matchdots = {(pt_h, qt_h) | pt_h ∈ Pt, qt_h ∈ Qt, h = 1, 2, 3 … num(Matchdots)},
where num(Matchdots) is the number of initial matching point pairs.
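A brute-force sketch of the threshold-constrained initial matching (steps 2.4 and 3) is given below, assuming the descriptors of Pt and Qt are stored as (m', 3) and (n', 3) arrays; choosing the smallest summed relative difference among the admissible candidates as "most similar" is an illustrative tie-breaking choice, not stated in the patent.

import numpy as np

def relative_diff(a, b):
    return abs(a - b) / (a + b + 1e-12)

def initial_matches(descP, descQ, eps=(0.01, 0.01, 0.01)):
    """Build Matchdots as index pairs (i, j): qt_j passes the three relative
    threshold tests against pt_i and is the most similar such candidate."""
    matchdots = []
    for i, dp in enumerate(descP):
        best_j, best_score = None, np.inf
        for j, dq in enumerate(descQ):
            diffs = [relative_diff(dp[k], dq[k]) for k in range(3)]
            if all(d <= e for d, e in zip(diffs, eps)) and sum(diffs) < best_score:
                best_j, best_score = j, sum(diffs)
        if best_j is not None:
            matchdots.append((i, best_j))
    return matchdots

# Hypothetical usage: matchdots = initial_matches(descP, descQ)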
FIG. 4 is a schematic diagram of two pairwise view-angle initial matching point-to-connecting lines of the bunny point cloud.
Step 4: combining the inter-point distance constraint, an adaptive RANSAC algorithm is used to obtain accurate matching point pairs, specifically as follows:
Step 4.1: if any two point pairs (pt_h, qt_h) and (pt_g, qt_g) in Matchdots are both correct matching point pairs, then according to the inter-point distance constraint it is easy to obtain:
dist(pt_h, pt_g) = dist(qt_h, qt_g).
Since it is difficult to find exactly corresponding point pairs between two discrete point clouds, a correct matching point pair here is only an approximately corresponding point pair, i.e. it only needs to satisfy:
|dist(pt_h, pt_g) - dist(qt_h, qt_g)| ≤ ε5.
An appropriate ε5 > 0 is selected (ε5 is taken as 0.02); point pairs (pt_h, qt_h) and (pt_g, qt_g) in Matchdots that satisfy this inequality are said to meet the inter-point distance constraint.
For each pair (pt_h, qt_h), the number num_g of point pairs in Matchdots, other than (pt_h, qt_h) itself, that meet the inter-point distance constraint with it is counted.
Step 4.2: since a certain number of correct matching point pairs exist among the matching point pairs obtained after initial matching, the value of num_g obtained for a correct matching point pair is, in general, relatively large. The num_g value defined above is therefore calculated for each point pair, Matchdots is sorted by num_g from large to small, and the first N point pairs are selected; the calculation of a rigid body transformation matrix requires at least 3 matching point pairs that are not collinear in three-dimensional space.
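The inter-point distance vote of step 4.1 can be computed in vectorized form as sketched below; treating ε5 as an absolute difference threshold and the array layout are assumptions made for illustration.

import numpy as np

def distance_votes(Pm, Qm, eps5=0.02):
    """Pm, Qm: (num, 3) arrays with row h holding the matched points pt_h and qt_h.
    Returns num_g for every pair: how many other pairs preserve the distance within eps5."""
    dP = np.linalg.norm(Pm[:, None, :] - Pm[None, :, :], axis=2)   # dist(pt_h, pt_g)
    dQ = np.linalg.norm(Qm[:, None, :] - Qm[None, :, :], axis=2)   # dist(qt_h, qt_g)
    ok = np.abs(dP - dQ) <= eps5
    return ok.sum(axis=1) - 1        # subtract 1 so a pair does not vote for itself

# Hypothetical usage: the N pairs with the largest votes are kept for the RANSAC check.
# votes = distance_votes(Pm, Qm)
# keep = np.argsort(votes)[::-1][:50]          # N = 50 as in the experiments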
The RANSAC algorithm is used herein to check whether the N point pairs are correct matching point pairs, where N is generally 50 in the experiment, and the specific checking method is as follows:
1) randomly selecting 3 point pairs from the N point pairs as a sample;
2) considering the 3 point pairs as correct matching point pairs, and calculating a rigid body transformation matrix U;
3) judging whether the remaining N-3 point pairs are correctly matched under the rigid body transformation matrix U: for any pair (s1, s2) among the remaining N-3 matching point pairs, if ||U(s1) - s2|| is less than a predefined threshold (this threshold is typically set to 2 times the average point spacing of the point cloud), (s1, s2) is considered a correct matching point pair and marked as an inner point; otherwise (s1, s2) is a wrong matching point pair and marked as an outer point, and all inner points form the consistent set of this sampling, where U(s1) denotes the new coordinates of s1 computed through the rigid body transformation matrix.
4) repeating steps 1) to 3) until the number of iterations reaches the upper limit of the sampling times; finally, the rigid body transformation matrix corresponding to the consistent set with the largest number of inner points is taken as the correct rigid body transformation matrix. The inner points in Matchdots obtained according to this transformation matrix are the final accurate matching point pairs, recorded as:
Matchdots1={(me,m'e)|me∈P,m'e∈Q,e=1,2,3…t}
wherein t is the number of precisely matched point pairs;
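The RANSAC check of step 4.2 is sketched below. The rigid transform for each 3-pair sample is estimated here with the SVD-based Kabsch solution rather than the quaternion method; that substitution, the function names and the default parameters are illustrative assumptions. The inlier threshold should be set to roughly twice the average point spacing, as stated above.

import numpy as np

def rigid_transform(A, B):
    """Least-squares rigid transform (R, T) with B ≈ R A + T (Kabsch/SVD)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cb - R @ ca

def ransac_exact_matches(Pm, Qm, inlier_tol, iters=500, seed=0):
    """Pm, Qm: (N, 3) arrays of the pre-selected candidate pairs.
    Returns the indices of the largest consistent set (the inner points)."""
    rng = np.random.default_rng(seed)
    best = np.array([], dtype=int)
    for _ in range(iters):
        sample = rng.choice(len(Pm), size=3, replace=False)
        R, T = rigid_transform(Pm[sample], Qm[sample])
        residual = np.linalg.norm(Pm @ R.T + T - Qm, axis=1)      # ||U(s1) - s2||
        inliers = np.flatnonzero(residual < inlier_tol)
        if len(inliers) > len(best):
            best = inliers
    return best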
fig. 5 is a schematic diagram of the effect of the matching point pairs obtained after distance constraint of two pairwise viewing angle matching point pairs of bunny.
Step 5: the initial registration parameters are calculated with the quaternion (four-element) method from the accurate matching point pairs obtained in step 4, specifically as follows:
1): compute the centroids of the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t}:
μ = (1/t) Σ_{e=1}^{t} m_e,  μ' = (1/t) Σ_{e=1}^{t} m'_e,
where μ and μ' are the centroids of the points m_e and m'_e, respectively;
2): translate the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t} relative to their centroids:
s_e = m_e - μ,  s'_e = m'_e - μ',
where s_e and s'_e are the new feature points obtained from m_e and m'_e after translation relative to the centroids;
3): compute the correlation matrix K from the point sets {s_e} and {s'_e}:
K = (1/t) Σ_{e=1}^{t} s_e · s'_e^T;
4): construct the four-dimensional symmetric matrix K_a, whose entries are indexed by b, c = 1, 2, 3, 4, from the elements of the correlation matrix K:
K_a = [ tr(K)      Δ^T
        Δ          K + K^T - tr(K)·I3 ],
where Δ = [K_23 - K_32, K_31 - K_13, K_12 - K_21]^T and I3 is the 3×3 identity matrix;
5): compute the unit eigenvector q of K_a corresponding to its largest eigenvalue:
q = [q0 q1 q2 q3]^T;
6): compute the rotation matrix R0 from q:
R0 = [ q0^2+q1^2-q2^2-q3^2   2(q1q2-q0q3)          2(q1q3+q0q2)
       2(q1q2+q0q3)          q0^2-q1^2+q2^2-q3^2   2(q2q3-q0q1)
       2(q1q3-q0q2)          2(q2q3+q0q1)          q0^2-q1^2-q2^2+q3^2 ];
7): solve for the translation vector T0:
T0 = μ' - R0·μ.
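A sketch of the four-element (quaternion) solution of step 5 follows, implementing the standard closed-form construction that the reconstructed formulas above correspond to; the function name and the normalization of K by t are illustrative choices.

import numpy as np

def quaternion_registration(M, Mp):
    """M, Mp: (t, 3) arrays of exactly matched points m_e (from P) and m'_e (from Q).
    Returns the rotation R0 and translation T0 with m'_e ≈ R0 m_e + T0."""
    mu, mup = M.mean(axis=0), Mp.mean(axis=0)            # centroids mu, mu'
    S, Sp = M - mu, Mp - mup                             # s_e, s'_e
    K = S.T @ Sp / len(M)                                # correlation matrix K
    delta = np.array([K[1, 2] - K[2, 1], K[2, 0] - K[0, 2], K[0, 1] - K[1, 0]])
    Ka = np.zeros((4, 4))                                # four-dimensional symmetric matrix
    Ka[0, 0] = np.trace(K)
    Ka[0, 1:] = Ka[1:, 0] = delta
    Ka[1:, 1:] = K + K.T - np.trace(K) * np.eye(3)
    _, eigvecs = np.linalg.eigh(Ka)
    q0, q1, q2, q3 = eigvecs[:, -1]                      # unit eigenvector of the largest eigenvalue
    R0 = np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3]])
    T0 = mup - R0 @ mu                                   # T0 = mu' - R0 * mu
    return R0, T0

# Hypothetical usage on the inner-point pairs: R0, T0 = quaternion_registration(Pm[inner], Qm[inner])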
Fig. 6 is a result diagram obtained after initial registration of the bunny four-view point cloud image, and it can be seen from the result diagram that a satisfactory result can be obtained by the initial registration of the method.
Step 6: the method adopts an improved ICP algorithm to accurately register the point cloud, and comprises the following specific steps:
Step 6.1: using the initial registration parameters R0 and T0 obtained in step 5, each point in the target point cloud P is transformed into the coordinate system of the reference point cloud Q, and the initial target point cloud set in the ICP algorithm after transformation is P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m.
Step 6.2: for any point in P0, the center of gravity of the tetrahedron formed by its four Euclidean-nearest points on Q is obtained through the polyhedron center-of-gravity formula; the i-th point of P0 and its corresponding center of gravity constitute the initial closest point pair (p0i, q0i) in the ICP algorithm.
Step 6.3: iterative calculation is performed according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s - d_{s+1} < E is met; the final registration parameters obtained are used to register the target point cloud P to obtain an accurate registration result, where E is a set threshold (E is taken as 0.001 in the invention),
d_s = (1/m) Σ_{i=1}^{m} ||p_si - q_si||,
(p_(s-1)i, q_(s-1)i) and (p_si, q_si) are the nearest-neighbor point pairs obtained by the method of steps 6.1 and 6.2 from the registration parameters after the (s-1)-th and s-th iterations, respectively; R_s and R_{s+1} are the rotation matrices after the s-th and (s+1)-th iterations; T_s and T_{s+1} are the translation vectors after the s-th and (s+1)-th iterations; P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations; and d_s and d_{s+1} are the average distances between P_s and Q and between P_{s+1} and Q, respectively.
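A compact sketch of the improved ICP of step 6 is given below: correspondences are the centers of gravity of the tetrahedra formed by the four nearest neighbours in Q (step 6.2), and iteration stops once d_s - d_{s+1} < E (step 6.3). It reuses quaternion_registration from the sketch above; the loop cap and the way the total transform is accumulated are illustrative choices.

import numpy as np
from scipy.spatial import cKDTree

def modified_icp(P, Q, R0, T0, E=0.001, max_iter=100):
    """Refine the initial parameters (R0, T0) so that P aligns with Q."""
    tree = cKDTree(Q)
    Ps = P @ R0.T + T0                       # P0: target cloud after the initial transform
    R_tot, T_tot = R0, T0
    d_prev = np.inf
    for _ in range(max_iter):
        _, idx = tree.query(Ps, k=4)         # four Euclidean-nearest points of Q
        Qs = Q[idx].mean(axis=1)             # tetrahedron centers of gravity q_si
        d = np.linalg.norm(Ps - Qs, axis=1).mean()   # d_s: mean distance of the pairs
        if d_prev - d < E:                   # convergence: d_s - d_{s+1} < E
            break
        d_prev = d
        R, T = quaternion_registration(Ps, Qs)
        Ps = Ps @ R.T + T                    # P_{s+1} = R_{s+1} P_s + T_{s+1}
        R_tot, T_tot = R @ R_tot, R @ T_tot + T
    return R_tot, T_tot, Ps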
Fig. 7 is a comparison graph of results obtained after initial and accurate registration of bunny two-view point cloud images, and the effectiveness and good registration effect of the method can be seen from the comparison graph.
Based on the normal vector information of the point cloud, the invention selects points with larger local normal vector change as feature points, utilizes three kinds of geometric information as feature description, sets and compares the similarity degree of the feature points through a threshold value to obtain initial matching point pairs, obtains accurate matching point pairs by using a method of combining rigid distance constraint and RANSAC algorithm, obtains initial registration parameters by using a four-element method, and further obtains accurate registration results by using improved ICP algorithm iteration.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can understand that the modifications or substitutions within the technical scope of the present invention are included in the scope of the present invention, and therefore, the scope of the present invention should be subject to the protection scope of the claims.

Claims (7)

1. A three-dimensional point cloud automatic registration method based on feature information change degree, wherein the point cloud data to be registered have an overlapping part, is characterized by comprising the following steps:
step 1, acquiring feature point sets Pt and Qt of two point cloud data P and Q to be registered based on normal vector information of different neighborhood radii, wherein Q is reference point cloud;
step 2, establishing the feature description of each feature point in Pt and Qt;
step 3, based on the characteristic description of Pt and Qt obtained in step 2, finding the most similar point for each point in Pt in Qt as a matching point to obtain an initial matching point pair set;
step 4, combining the initial matching point pair set obtained in the step 3 with the distance constraint condition between points, and using a self-adaptive RANSAC algorithm to obtain an accurate matching point pair set;
step 5, calculating an initial registration parameter by using a four-element method;
and step 6, performing point cloud registration by adopting an improved iterative closest point (ICP) algorithm.
2. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 1, wherein the step 1 specifically comprises:
step 1.1, the feature degree of a point p in the point cloud data is
f(p) = |sin θ|,
where θ is the angle between the normal vectors n_r1(p) and n_r2(p) of p computed with different neighborhood radii r1 and r2, r1 ≠ r2;
Step 1.2, extracting the characteristic degree in the point cloud dataGreater than a set threshold epsilon1The points of (2) are used as characteristic points of the point cloud data;
step 1.3, feature point extraction is performed on P and Q by the methods of step 1.1 and step 1.2, respectively, and the set of feature points of P is obtained as Pt ═ { Pt ═ Pt1,pt2,pt3,…ptm'Q has a feature point set of Qt ═ Qt1,qt2,qt3,…qtn'Where m 'and n' are the numbers of characteristic points of P and Q, respectively.
3. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 2, wherein the step 2 specifically comprises:
step 2.1, for the i-th point pt_i in Pt, the feature degree of the point is taken as the first feature quantity f1(pt_i);
step 2.2, for pt_i, the points of P lying in the spherical region centered at pt_i with radius γ are taken as the neighborhood of pt_i; the neighborhood center of gravity O(pt_i) is computed, and the distance from pt_i to O(pt_i) is taken as the second feature quantity f2(pt_i) = ||O(pt_i) - pt_i||;
step 2.3, for pt_i, the inverse cosine of the angle between the line from pt_i to O(pt_i) and the normal vector of pt_i is taken as the third feature quantity
f3(pt_i) = arccos( (O(pt_i) - pt_i) · n(pt_i) / ||O(pt_i) - pt_i|| ),
where n(pt_i) is the unit normal vector at pt_i;
step 2.4, for pt_i, if there is a j-th point qt_j in Qt that meets the following three conditions, then qt_j is the corresponding point of pt_i:
|f1(pt_i) - f1(qt_j)| / (f1(pt_i) + f1(qt_j)) ≤ ε2
|f2(pt_i) - f2(qt_j)| / (f2(pt_i) + f2(qt_j)) ≤ ε3
|f3(pt_i) - f3(qt_j)| / (f3(pt_i) + f3(qt_j)) ≤ ε4
where ε2, ε3 and ε4 are all set thresholds.
4. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 3, wherein the step 3 is specifically as follows:
based on the information obtained in step 2, a corresponding point is found in Qt for each point in Pt as its initial matching point, and the set of initial matching point pairs is:
Matchdots = {(pt_h, qt_h) | pt_h ∈ Pt, qt_h ∈ Qt, h = 1, 2, 3 … num(Matchdots)},
where num(Matchdots) is the number of initial matching point pairs.
5. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 4, wherein the step 4 specifically comprises:
step 4.1: for point pairs (pt_h, qt_h) and (pt_g, qt_g) in Matchdots, if they satisfy
|dist(pt_h, pt_g) - dist(qt_h, qt_g)| ≤ ε5,
then (pt_h, qt_h) and (pt_g, qt_g) meet the inter-point distance constraint; for each pair (pt_h, qt_h), the number num_g of matching point pairs in Matchdots that meet the inter-point distance constraint with (pt_h, qt_h) is counted, where h = 1, 2, 3 … num(Matchdots), dist(·,·) represents the distance between two points, and ε5 is a set threshold;
step 4.2: selecting the matching point pair corresponding to the larger N values in the calculation result of the step 4.1, and checking whether the N matching point pairs are correct matching point pairs by using a RANSAC algorithm, wherein the specific checking method comprises the following steps:
1) randomly selecting 3 matching point pairs from the N matching point pairs as a sample;
2) setting 3 selected matching point pairs in the step 1) as correct matching point pairs, and calculating a rigid body transformation matrix U;
3) judging whether the remaining N-3 matching point pairs are correctly matched under the rigid body transformation matrix U of step 2): for any pair (s1, s2) among the remaining N-3 matching point pairs, if ||U(s1) - s2|| is less than the predefined threshold value, (s1, s2) is considered a correct matching point pair and marked as an inner point; otherwise (s1, s2) is a wrong matching point pair and marked as an outer point, and all inner points form the consistent set of this sampling; wherein U(s1) denotes the new coordinates of s1 computed through the rigid body transformation matrix;
4) repeating the steps 1) to 3) until the iteration times reaches the upper limit of the sampling times, finally, taking the rigid body transformation matrix corresponding to the consistent set with the maximum number of the internal points obtained by iteration as a correct rigid body transformation matrix, and obtaining the internal points in Matchdots as final accurate matching point pairs according to the correct rigid body transformation matrix, wherein the accurate matching point pair set is as follows:
Matchdots1={(me,m'e)|me∈P,m'e∈Q,e=1,2,3…t}
wherein (m_e, m'_e) is an exactly matched point pair and t is the number of exactly matched point pairs.
6. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 5, wherein the calculation of the initial registration parameter in the step 5 is specifically as follows:
1): compute the centroids of the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t}:
μ = (1/t) Σ_{e=1}^{t} m_e,  μ' = (1/t) Σ_{e=1}^{t} m'_e,
where μ and μ' are the centroids of the points m_e and m'_e, respectively;
2): translate the point sets {m_e | m_e ∈ P, e = 1, 2, …, t} and {m'_e | m'_e ∈ Q, e = 1, 2, …, t} relative to their centroids:
s_e = m_e - μ,  s'_e = m'_e - μ',
where s_e and s'_e are the new feature points obtained from m_e and m'_e after translation relative to the centroids;
3): compute the correlation matrix K from the point sets {s_e} and {s'_e}:
K = (1/t) Σ_{e=1}^{t} s_e · s'_e^T;
4): construct the four-dimensional symmetric matrix K_a, whose entries are indexed by b, c = 1, 2, 3, 4, from the elements of the correlation matrix K:
K_a = [ tr(K)      Δ^T
        Δ          K + K^T - tr(K)·I3 ],
where Δ = [K_23 - K_32, K_31 - K_13, K_12 - K_21]^T and I3 is the 3×3 identity matrix;
5): compute the unit eigenvector q of K_a corresponding to its largest eigenvalue:
q = [q0 q1 q2 q3]^T;
6): compute the rotation matrix R0 from q:
R0 = [ q0^2+q1^2-q2^2-q3^2   2(q1q2-q0q3)          2(q1q3+q0q2)
       2(q1q2+q0q3)          q0^2-q1^2+q2^2-q3^2   2(q2q3-q0q1)
       2(q1q3-q0q2)          2(q2q3+q0q1)          q0^2-q1^2-q2^2+q3^2 ];
7): solve for the translation vector T0:
T0 = μ' - R0·μ.
7. The method for automatically registering the three-dimensional point cloud based on the change degree of the feature information as claimed in claim 6, wherein the step 6 specifically comprises:
step 6.1, using the initial registration parameters R0 and T0 obtained in step 5, each point in the target point cloud P is transformed into the coordinate system of the reference point cloud Q, and the initial target point cloud set in the ICP algorithm after transformation is P0 = {p01, p02, …, p0m}, where p0i = R0·pi + T0, pi is the i-th point in the target point cloud P, and i = 1, 2, …, m;
step 6.2, for any point in P0, the center of gravity of the tetrahedron formed by its four Euclidean-nearest points on Q is obtained through the polyhedron center-of-gravity formula; the i-th point of P0 and its corresponding center of gravity constitute the initial closest point pair (p0i, q0i) in the ICP algorithm;
step 6.3, iterative calculation is performed according to the iterative formula P_{s+1} = R_{s+1}·P_s + T_{s+1} until the set convergence condition d_s - d_{s+1} < E is met; the final registration parameters obtained are used to register the target point cloud P to obtain an accurate registration result; wherein E is a set threshold,
d_s = (1/m) Σ_{i=1}^{m} ||p_si - q_si||,
(p_(s-1)i, q_(s-1)i) and (p_si, q_si) are the nearest-neighbor point pairs obtained by the method of steps 6.1 and 6.2 from the registration parameters after the (s-1)-th and s-th iterations, respectively; R_s and R_{s+1} are the rotation matrices after the s-th and (s+1)-th iterations; T_s and T_{s+1} are the translation vectors after the s-th and (s+1)-th iterations; P_s and P_{s+1} are the point cloud sets of the target point cloud P after s and s+1 iterations; and d_s and d_{s+1} are the average distances between P_s and Q and between P_{s+1} and Q, respectively.
CN201911299529.7A 2019-12-17 2019-12-17 Three-dimensional point cloud automatic registration method based on characteristic information change degree Pending CN111145232A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911299529.7A CN111145232A (en) 2019-12-17 2019-12-17 Three-dimensional point cloud automatic registration method based on characteristic information change degree


Publications (1)

Publication Number Publication Date
CN111145232A true CN111145232A (en) 2020-05-12

Family

ID=70518551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911299529.7A Pending CN111145232A (en) 2019-12-17 2019-12-17 Three-dimensional point cloud automatic registration method based on characteristic information change degree

Country Status (1)

Country Link
CN (1) CN111145232A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Huang Yuan (黄源) et al.: "A point cloud automatic registration algorithm based on feature extraction" *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111768490B (en) * 2020-05-14 2023-06-27 华南农业大学 Plant three-dimensional modeling method and system based on iteration closest point and manual intervention
CN111681282A (en) * 2020-06-18 2020-09-18 浙江大华技术股份有限公司 Pallet identification processing method and device
CN111862173A (en) * 2020-07-03 2020-10-30 广州大学 On-line fitting and wearing method based on point cloud registration
CN112017220A (en) * 2020-08-27 2020-12-01 南京工业大学 Point cloud accurate registration method based on robust constraint least square algorithm
CN112017220B (en) * 2020-08-27 2023-07-28 南京工业大学 Point cloud accurate registration method based on robust constraint least square algorithm
CN112085793B (en) * 2020-09-04 2022-07-05 上海理工大学 Three-dimensional imaging scanning system based on combined lens group and point cloud registration method
CN112116638A (en) * 2020-09-04 2020-12-22 季华实验室 Three-dimensional point cloud matching method and device, electronic equipment and storage medium
CN112085793A (en) * 2020-09-04 2020-12-15 上海理工大学 Three-dimensional imaging scanning system based on combined lens group and point cloud registration method
CN113421290A (en) * 2021-07-05 2021-09-21 山西大学 Power plant boiler interior three-dimensional reconstruction method based on unmanned aerial vehicle image acquisition
CN113643270A (en) * 2021-08-24 2021-11-12 凌云光技术股份有限公司 Image registration method and device based on point cloud data
CN113643270B (en) * 2021-08-24 2024-04-26 凌云光技术股份有限公司 Image registration method and device based on point cloud data
CN114037745A (en) * 2021-11-17 2022-02-11 北京容积视觉科技有限公司 Multi-view three-dimensional point cloud data rough registration method based on branch and bound
CN114648445A (en) * 2022-03-03 2022-06-21 电子科技大学 Multi-view high-resolution point cloud splicing method based on feature point extraction and fine registration optimization
CN114648445B (en) * 2022-03-03 2023-05-30 电子科技大学 Multi-view high-resolution point cloud splicing method based on feature point extraction and fine registration optimization


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination