CN111127667B - Point cloud initial registration method based on region curvature binary descriptor - Google Patents

Point cloud initial registration method based on region curvature binary descriptor

Info

Publication number
CN111127667B
CN111127667B (application CN201911133616.5A)
Authority
CN
China
Prior art keywords
point
points
module
feature
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911133616.5A
Other languages
Chinese (zh)
Other versions
CN111127667A (en)
Inventor
恒一陟
耿国华
张雨禾
陆正杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwest University
Original Assignee
Northwest University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwest University
Priority to CN201911133616.5A
Publication of CN111127667A
Application granted
Publication of CN111127667B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 - Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 - Indexing scheme for editing of 3D models
    • G06T2219/2008 - Assembling, disassembling

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a point cloud initial registration method based on a region curvature binary descriptor. The method has the advantages of high calculation speed, high registration accuracy and strong anti-interference capability; it is robust to normal errors and varying point cloud density, and in practical application it can meet the requirements of processing cultural relic fragments with different shapes, severe surface damage and high noise.

Description

Point cloud initial registration method based on region curvature binary descriptor
Technical Field
The invention belongs to the technical field of three-dimensional point cloud registration, and particularly relates to a point cloud initial registration method based on a region curvature binary descriptor.
Background
Three-dimensional point cloud registration is currently a popular research direction in computer graphics and has a profound influence on fields such as modern medicine, cultural relic protection and aerospace. Three-dimensional point cloud registration means transforming the model point cloud, through a rigid transformation, into the same coordinate system as the scene point cloud so as to obtain complete three-dimensional point cloud data. The correspondence of points between two point clouds is the core problem that registration needs to solve. Existing registration methods are generally divided into two steps: initial registration and fine registration. The initial registration starts from the global situation, finds an approximate rotation-translation matrix between the two point clouds, improves the optimization efficiency, and prevents the fine registration from falling into a local optimum. Fine registration means that, after the initial registration, a more accurate registration matrix is obtained through further calculation.
The initial registration of three-dimensional point clouds can currently be divided into three categories: methods based on global feature description, algorithms based on local feature description, and methods based on the random sample consensus (RANSAC) framework. Algorithms based on global feature description convert point cloud registration into a problem of searching point cloud features under different viewing angles; algorithms based on local feature description perform registration by establishing, searching and matching correspondences between key feature points of the point clouds; algorithms based on random sample consensus determine corresponding points from the overlapping area between the point cloud data and then solve the rigid-body transformation between the point clouds to be matched from these corresponding points. Zhou et al. proposed a fast global registration algorithm that performs matching on points of the point cloud surface; by optimizing a single objective it aligns the surfaces, removes false matches, and achieves close alignment of the target point cloud without initialization.
Tombari et al. proposed the SHOT descriptor, which divides the spherical support region of a feature point into 32 subspaces along the radial, azimuth and elevation directions and, in each region, counts the angles between the normals of the neighborhood points and the normal of the key point to generate the SHOT descriptor. The SHOT method has good local feature description capability but is sensitive to variations in point cloud density and to Gaussian noise.
Shen et al. first combined FPFH-based coarse registration with ICP-based fine registration. FPFH is very fast to compute and has a low dimension, saving computation space, but it does not consider a selection criterion for the neighborhood radius and depends heavily on the precision of the point clouds. When the point cloud data sets to be registered differ in precision, the registration is inefficient.
In conclusion, although scholars at home and abroad have carried out extensive research in the field of initial registration for three-dimensional vision, existing initial registration methods still suffer from low registration precision, poor robustness and slow iteration.
Disclosure of Invention
In order to solve the above problems, the invention provides a point cloud initial registration method based on a region curvature binary descriptor, overcoming the drawbacks of low registration accuracy, poor robustness and slow iteration in existing initial registration methods.
In order to solve the above technical problems, the invention adopts the following technical scheme:
The point cloud initial registration method based on the region curvature binary descriptor specifically comprises the following steps:
Step 1, obtaining a three-dimensional point cloud data set S of a measured object, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m]; S_i represents the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, m ≥ 2.
Step 2, extracting the feature points of S_i to form a set P_i, where p_i^v denotes the v-th feature point of point cloud S_i; converting these feature points into binary strings to form the descriptor set F_i, where f_i^v denotes the v-th descriptor in F_i.
Step 3, extracting the feature points of S_j to form a set P_j, where p_j^u denotes the u-th feature point of point cloud S_j; converting these feature points into binary strings to form the descriptor set F_j, where f_j^u denotes the u-th descriptor in F_j.
Step 4, finding the feature points common to P_i and P_j:
Step 4.1, finding in F_j the descriptor closest to f_i^v and the descriptor second closest to f_i^v; when the ratio d_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9, saving the closest descriptor and performing step 4.2; otherwise, repeating step 4.1. Here d_1 is the ratio of the distance between f_i^v and the closest descriptor in F_j to the distance between f_i^v and the second-closest descriptor in F_j.
Step 4.2, finding in F_i the descriptor closest to the descriptor saved in step 4.1 and the descriptor second closest to it; when the ratio d_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9, saving the closest descriptor and performing step 4.3; otherwise, returning to step 4.1. Here d_2 is the ratio of the distance between the saved descriptor and the closest descriptor in F_i to the distance between the saved descriptor and the second-closest descriptor in F_i.
Step 4.3, calculating the Lp distance between the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2; if the obtained Lp distance is larger than a threshold T_3, 0.75 mr ≤ T_3 ≤ 0.85 mr, performing step 4.4; otherwise, returning to step 4.1.
Step 4.4, judging whether the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2 correspond to the same feature point; if they correspond to the same feature point, a group of matched feature points is obtained; otherwise, returning to step 4.1.
Step 4.7, repeating steps 4.1 to 4.4 until all mutually matched feature points of P_i and P_j are found.
Step 5, repeating steps 2 to 4, traversing all point cloud data in the data set S to obtain all matched feature points, thereby obtaining the initially registered point cloud data set S'.
Specifically, in step 2, the process of converting the feature points into binary strings comprises the following operations, performed for each feature point p_i^v:
Step 2.1, obtaining the set of points lying within a radius r centered at the feature point p_i^v, 25 mr ≤ r ≤ 75 mr, and calculating the covariance matrix D of this set (equations (1) and (2)); in the formulas, d_i represents the distance between the feature point p_i^v and a neighborhood point q_i, and q_i denotes any point adjacent to p_i^v within the radius r.
Step 2.2, establishing the three-dimensional local coordinate system of the feature point p_i^v according to the covariance matrix D, and then projecting p_i^v and all its adjacent points onto a two-dimensional plane.
Step 2.3, dividing the two-dimensional plane into 2L_s × 2L_s grid cells and quantizing the projected points into the corresponding cells, 2L_s being the number of divisions along each edge of the two-dimensional plane.
Step 2.4, calculating the weighted curvature of the projected points in each cell and aggregating all weighted curvatures to form a curvature distribution matrix, which constitutes the curvature map.
Step 2.5, converting the curvature map of step 2.4 into a binary string to obtain the descriptor corresponding to the feature point p_i^v.
Specifically, the process of converting the feature points into binary strings in step 3 is the same as in step 2.
Specifically, both step 2 and step 3 use the ISS feature point extraction method to extract feature points.
Specifically, in step 1, the obtained point cloud data is further preprocessed, the preprocessing including filtering denoising and outlier removal.
The invention also discloses a point cloud initial registration system based on the region curvature binary descriptor, which comprises the following modules:
a data acquisition module, for acquiring a three-dimensional point cloud data set S of a measured object, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m]; S_i represents the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, m ≥ 2;
a feature point extraction module, for extracting the feature points of S_i to form the set P_i, where p_i^v denotes the v-th feature point of point cloud S_i, and extracting the feature points of S_j to form the set P_j, where p_j^u denotes the u-th feature point of point cloud S_j;
a feature point conversion module, for converting the feature points obtained by the feature point extraction module into binary strings, forming the descriptor sets F_i and F_j, where f_i^v denotes the v-th descriptor in F_i and f_j^u denotes the u-th descriptor in F_j;
a feature point matching module, for finding the feature points common to P_i and P_j, which specifically comprises:
a forward search module, for finding in F_j the descriptor closest to f_i^v and the descriptor second closest to f_i^v; when the ratio d_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9, the closest descriptor is saved and the reverse search module is entered; otherwise, the forward search process is repeated; here d_1 is the ratio of the distance between f_i^v and the closest descriptor in F_j to the distance between f_i^v and the second-closest descriptor in F_j;
a reverse search module, for finding in F_i the descriptor closest to the descriptor saved by the forward search module and the descriptor second closest to it; when the ratio d_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9, the closest descriptor is saved and the Lp distance calculation module is entered; otherwise, the forward search module is returned to; here d_2 is the ratio of the distance between the saved descriptor and the closest descriptor in F_i to the distance between the saved descriptor and the second-closest descriptor in F_i;
an Lp distance calculation module, for calculating the Lp distance between the descriptor obtained by the forward search module and the descriptor obtained by the reverse search module; if the obtained Lp distance is larger than a threshold T_3, 0.75 mr ≤ T_3 ≤ 0.85 mr, the feature point pairing module is entered; otherwise, the forward search module is returned to;
a feature point pairing module, for judging whether the descriptor obtained by the forward search module and the descriptor obtained by the reverse search module correspond to the same feature point; if they do, a group of matched feature points is obtained; otherwise, the forward search module is returned to;
a repeated calculation module, for repeating the process from the forward search module to the feature point pairing module until all mutually matched feature points of P_i and P_j are found;
an initially registered point cloud data set S' generation module, for repeating the feature point extraction module through the feature point matching module until all point cloud data in the data set S have been traversed and all matched feature points are obtained, thereby obtaining the initially registered point cloud data set S'.
Specifically, the feature point conversion module performs the following operations for each feature point p_i^v:
obtaining the set of points lying within a radius r centered at the feature point p_i^v, 25 mr ≤ r ≤ 75 mr, and calculating the covariance matrix D of this set (equations (1) and (2)); in the formulas, d_i represents the distance between the feature point p_i^v and a neighborhood point q_i, and q_i denotes any point adjacent to p_i^v within the radius r;
establishing the three-dimensional local coordinate system of the feature point p_i^v according to the covariance matrix D, and then projecting p_i^v and all its adjacent points onto a two-dimensional plane;
dividing the two-dimensional plane into 2L_s × 2L_s grid cells and quantizing the projected points into the corresponding cells, 2L_s being the number of divisions along each edge of the two-dimensional plane;
calculating the weighted curvature of the projected points in each cell and aggregating all weighted curvatures to form a curvature distribution matrix, which constitutes the curvature map;
converting the obtained curvature map into a binary string to obtain the descriptor corresponding to the feature point p_i^v.
Specifically, each feature point p_j^u is converted into a binary string by the same process as used for p_i^v.
Specifically, the feature point extraction module extracts feature points by using an ISS feature point extraction method.
Further, the system also comprises a data preprocessing module for preprocessing the point cloud data obtained by the data acquisition module, the preprocessing including filtering denoising and outlier removal.
Compared with the prior art, the invention has the beneficial effects that:
Firstly, the invention converts the description of a 3D point cloud into a 2D image block, calculates the region curvature of the projected-point distribution, and generates a binary string, providing an efficient and compact binary 3D local descriptor; feature matching is then carried out by a bidirectional distance-ratio threshold matching algorithm. As an initial registration method for point cloud data, the method has the advantages of high calculation speed, high registration accuracy and strong anti-interference capability; it is robust to normal errors and varying point cloud density, and in practical application it can meet the requirements of processing cultural relic fragments with different shapes, severe surface damage and high noise.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is an implementation schematic of the descriptor: (a) 2D distribution of the projected points of a feature point; (b) gridding of the projected points; (c) curvature map; (d) descriptor.
FIG. 3 shows the Terracotta Warrior fragment data: (a) chest; (b) waist; (c) shoulder; (d) knee; (e) leg.
FIG. 4 shows the set of best matches for each fragment group: (a) chest; (b) waist; (c) shoulder; (d) knee; (e) leg.
FIG. 5 shows the initial registration results of each method: (a) chest; (b) waist; (c) shoulder; (d) knee; (e) leg.
FIG. 6 shows fine registration of the present registration results using the ICP algorithm: (a) chest; (b) waist; (c) shoulder; (d) knee; (e) leg.
The invention is explained in more detail below with reference to the drawings and the description of the preferred embodiments.
Detailed Description
The scheme of the invention is mainly directed at the initial registration stage of point cloud data; for the subsequent fine registration, widely used fine registration methods such as the iterative closest point (ICP) algorithm can be used, and the fine registration process is not elaborated herein.
In addition, the feature points referred to in the present invention are points at which the gray value changes drastically, or points of large curvature on an edge (i.e., the intersection of two edges).
The point cloud initial registration method based on the region curvature binary descriptor of the present invention is described in detail below. It should be noted that the present invention is not limited to the following specific embodiments, and any equivalent transformation based on the technical solution of the present invention falls within the scope of the present invention.
The method specifically comprises the following steps:
step 1, point cloud data of a measured object is obtained, and the data is preprocessed:
A three-dimensional point cloud data set S of the measured object is obtained, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m]; S_i represents the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, m ≥ 2; in the following embodiment, m = 2.
The invention can use different devices such as radar, laser scanners and Kinect devices to obtain the surface information of the measured object, where m is the number of scans.
When the point cloud data is acquired by scanning, redundant noise points appear in the collected data owing to factors such as operator skill and equipment precision. The acquired cultural-relic point cloud data is therefore filtered, denoised and cleared of outliers. The invention mainly filters the point cloud with a voxelized grid sampling method: first, the maximum extent of the point cloud data along each coordinate axis of the three-dimensional coordinate system is calculated, and the point cloud is then divided into several cubic grids according to its volume. After the grid is established, each grid cell is checked for point cloud data points; if a cell contains none, it is deleted directly. If a cell contains data points, its center point is calculated, a threshold is set so that the portion of data points closest to the center point is retained, and the remaining data points are deleted, yielding the preprocessed point cloud data set.
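For illustration, a minimal numpy sketch of such a voxelized-grid filter is given below. The voxel size and the keep ratio are assumed example values, since the text only states that a threshold is set and the points closest to each cell center are retained.

```python
import numpy as np

def voxel_grid_filter(points, voxel_size=1.0, keep_ratio=0.5):
    """points: (N, 3) array.  For every occupied voxel, keep the fraction of
    points closest to the voxel center (voxel_size and keep_ratio are assumed
    example values, not values fixed by the patent)."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(np.int64)   # voxel index per point
    _, inverse = np.unique(idx, axis=0, return_inverse=True)        # group points by voxel
    kept = []
    for v in range(inverse.max() + 1):
        members = np.where(inverse == v)[0]
        center = mins + (idx[members[0]] + 0.5) * voxel_size         # voxel center
        d = np.linalg.norm(points[members] - center, axis=1)
        n_keep = max(1, int(np.ceil(keep_ratio * len(members))))
        kept.extend(members[np.argsort(d)[:n_keep]])                 # closest to center survive
    return points[np.sort(kept)]
```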
Next, the features of each point cloud in the data set are converted into binary strings; first, conventional feature point extraction is performed:
Step 2, the feature points of S_i are extracted to form the set P_i, where p_i^v denotes the v-th feature point of point cloud S_i.
Common feature point detection methods include a feature point extraction method based on a normal vector, a feature point extraction method based on a curvature, an LSP feature point detection method, an ISS feature point extraction method, and the like. The ISS feature point extraction method is preferably selected for processing.
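The ISS detector is a standard technique; the sketch below shows one common eigenvalue-ratio formulation of it, given only as an illustration, with the radius and ratio thresholds chosen arbitrarily rather than fixed by the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def iss_keypoints(points, radius, gamma21=0.975, gamma32=0.975, min_neighbors=5):
    """Keep points whose neighborhood covariance eigenvalues l1 >= l2 >= l3
    satisfy l2/l1 < gamma21 and l3/l2 < gamma32 (non-maximum suppression on l3,
    also part of ISS, is omitted in this sketch)."""
    tree = cKDTree(points)
    keypoints = []
    for i, p in enumerate(points):
        nbr_idx = tree.query_ball_point(p, radius)
        if len(nbr_idx) < min_neighbors:
            continue
        cov = np.cov(points[nbr_idx].T)                  # 3x3 neighborhood covariance
        l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
        if l1 > 0 and l2 > 0 and l2 / l1 < gamma21 and l3 / l2 < gamma32:
            keypoints.append(i)
    return np.asarray(keypoints, dtype=int)
```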
These feature points are then converted into binary strings, forming the descriptor set F_i, where f_i^v denotes the v-th descriptor in F_i. The conversion method may be the LBP method or one of its variants, such as MB-LBP, CLBP or LTP; preferably, the invention describes the point cloud features with a region curvature binary descriptor, the idea of which is shown in FIG. 2 and which comprises the following steps:
For each feature point p_i^v, the following operations are performed:
Step 2.1, the set of points lying within a radius r centered at the feature point p_i^v is obtained, and its covariance matrix D is calculated (equations (1) and (2)). The value range of r is determined by the size of the original point cloud data; generally 25 mr ≤ r ≤ 75 mr, and in the following embodiment r = 65 mr. In the formulas, d_i represents the distance between the feature point p_i^v and a neighborhood point q_i, and q_i denotes any point adjacent to p_i^v within the radius r. Preferably, weights are assigned to the neighborhood points, with larger weights for closer points, which increases the robustness of the descriptor and reduces the interference of noise and outliers on the feature description.
Step 2.2, the three-dimensional local coordinate system of the feature point p_i^v is established according to the covariance matrix D, and p_i^v and all its adjacent points are then projected onto a two-dimensional plane. Specifically:
The matrix D is eigendecomposed: DV = EV, where E is the diagonal eigenvalue matrix with eigenvalues {l_1, l_2, l_3} and V = {v_1, v_2, v_3} is the matrix of orthogonal eigenvectors.
The feature point p_i^v is defined as the origin of the LRF (three-dimensional local reference frame). With the eigenvalues arranged in descending order, v_1 is taken as the X-axis direction and v_2 as the Y-axis direction, and the Z axis follows the right-hand rule Z = X × Y, which constructs the local coordinate system O_p-X_pY_pZ_p of the three-dimensional patch.
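The following sketch illustrates steps 2.1 and 2.2 under stated assumptions: the weight (r - d_i) used in the covariance is an assumed choice, since equations (1) and (2) are not reproduced here and the text only requires larger weights for closer points; the eigenvector sign disambiguation used by practical LRF methods is also omitted.

```python
import numpy as np

def local_reference_frame(p, neighbors, r):
    """Weighted covariance of the r-neighborhood of feature point p and an LRF
    built from its eigenvectors (a sketch, not the patent's exact formulas)."""
    d = np.linalg.norm(neighbors - p, axis=1)            # d_i in the text
    w = np.clip(r - d, 0.0, None)                        # assumed weight: closer -> larger
    diff = neighbors - p
    D = (w[:, None] * diff).T @ diff / max(w.sum(), 1e-12)
    eigvals, eigvecs = np.linalg.eigh(D)                 # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]                    # l1 >= l2 >= l3
    v1, v2 = eigvecs[:, order[0]], eigvecs[:, order[1]]
    z = np.cross(v1, v2)                                 # right-hand rule Z = X x Y
    # rows are the X, Y, Z axes of O_p-X_pY_pZ_p; eigenvector signs are left
    # undetermined in this sketch
    return np.vstack([v1, v2, z / np.linalg.norm(z)])
```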
Step 2.3, the two-dimensional plane is divided into 2L_s × 2L_s grid cells and the projected points are quantized into the corresponding cells, 2L_s being the number of divisions along each edge of the two-dimensional plane. Specifically:
According to a mapping M, the adjacent points of p_i^v are projected onto the two-dimensional plane O_p'-X_p'Y_p', giving the projected points lying on that plane. The mapping M is defined as
M(x, y, z) = (R cos θ, R sin θ)    (3)
where R is the distance between the adjacent point and the origin O_p of the local coordinate system O_p-X_pY_pZ_p, and θ = arctan(y, x) is the azimuth angle of the adjacent point.
The projected points on the two-dimensional plane are further discretized by dividing the plane into 2L_s × 2L_s cells along the X' and Y' axes and quantizing each projected point into its corresponding cell. Each cell is denoted B_in(x, y), with center p_s (equation (4)), where d_s is the side length of each cell; projected points falling into the region of a cell are counted into that cell. The side length of the two-dimensional plane is 2L_s; taking the center of the square as the origin, a coordinate system is established, and the boundary of the plane then intersects the coordinate axes at the four points (L_s, 0), (-L_s, 0), (0, L_s), (0, -L_s). Each small cell has side length d_s, and each of the x and y axes carries 2L_s + 1 coordinate points and 2L_s segments. The center p_s of a cell B_in(x, y) is determined by its x, y coordinates: for example, for the cell that is the h-th along the x axis and the g-th along the y axis, multiplying h and g by the cell side length gives the coordinates of the cell center.
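A sketch of the mapping M and the grid quantization of step 2.3 is given below. It treats R as the three-dimensional distance to the LRF origin, as the text describes, and assumes a plane half-width of L_s · d_s when clipping cell indices; both choices are illustrative.

```python
import numpy as np

def project_and_quantize(p, neighbors, lrf, L_s, d_s):
    """Project the neighbors of p through M(x, y, z) = (R cos t, R sin t) and
    quantize the projections into a 2L_s x 2L_s grid of cell side d_s."""
    local = (neighbors - p) @ lrf.T                      # coordinates in O_p-X_pY_pZ_p
    R = np.linalg.norm(local, axis=1)                    # distance to the LRF origin
    theta = np.arctan2(local[:, 1], local[:, 0])         # azimuth in the local frame
    u, v = R * np.cos(theta), R * np.sin(theta)          # the mapping M
    col = np.clip(np.floor(u / d_s).astype(int) + L_s, 0, 2 * L_s - 1)
    row = np.clip(np.floor(v / d_s).astype(int) + L_s, 0, 2 * L_s - 1)
    return row, col                                      # cell index of each projection
```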
Step 2.4, the weighted curvature of the projected points in each cell is calculated, and all weighted curvatures are aggregated to form a curvature distribution matrix, which constitutes the curvature map.
In the weighted curvature calculation, following the surface-curvature method of Zhou Weiguang et al., a Hessian matrix is computed from the neighboring surface formed by the adjacent points; its largest eigenvalue is the maximum principal curvature k_i of the feature point (equation (5)).
The curvatures of the points falling into a cell B_in(x, y) are finally encoded into the curvature distribution (equation (6)), where w_i denotes the weight assigned to a projected point, emphasizing points closer to the center; the accompanying term is a normalization step; and the result represents the weighted curvature distribution of the two-dimensional plane. Regions of the two-dimensional plane containing no projected points are represented as ∞.
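The aggregation of step 2.4 can be sketched as follows. The per-point maximum principal curvature k_i is assumed to be precomputed (the Hessian-based computation of equation (5) is not reproduced), and the weight 1/(1 + d) is an illustrative stand-in for the w_i of equation (6), which only requires larger weights for points nearer the center.

```python
import numpy as np

def curvature_map(rows, cols, curvatures, dists_to_center, L_s):
    """Aggregate per-cell weighted curvatures into a 2L_s x 2L_s distribution
    matrix; curvatures holds the precomputed k_i of each projected point."""
    n = 2 * L_s
    acc = np.zeros((n, n))
    wsum = np.zeros((n, n))
    w = 1.0 / (1.0 + np.asarray(dists_to_center))        # assumed weighting
    for r, c, k, wi in zip(rows, cols, curvatures, w):
        acc[r, c] += wi * k
        wsum[r, c] += wi
    cmap = np.full((n, n), np.inf)                       # empty cells are set to infinity
    mask = wsum > 0
    cmap[mask] = acc[mask] / wsum[mask]                  # weighted average as normalization
    return cmap
```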
Step 2.5, the curvature map of step 2.4 is converted into a binary string, yielding the descriptor corresponding to the feature point p_i^v.
In the conversion, to make the process scale invariant, the invention normalizes the region curvature to the range [0, 1] by dividing by the maximum absolute value of each row of the descriptor. The CS-LBP method proposed by Schmid et al. is used to describe the center of the image block, and the resulting binary characters are concatenated to obtain the descriptor of the feature point p_i^v.
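One plausible reading of step 2.5 is sketched below: the curvature map is normalized row-wise and binarized with CS-LBP-style center-symmetric comparisons. The exact bit layout used in the patent is not recoverable from the text, so this construction is an assumption.

```python
import numpy as np

def curvature_map_to_binary(cmap, threshold=0.0):
    """Normalize the curvature map row-wise to [0, 1] and compare each cell
    once against its point-symmetric counterpart about the map center."""
    m = np.where(np.isinf(cmap), 0.0, np.abs(cmap))      # empty cells contribute 0
    row_max = np.maximum(m.max(axis=1, keepdims=True), 1e-12)
    m = m / row_max                                      # scale invariance
    n = m.shape[0]
    bits = []
    for i in range(n):
        for j in range(n // 2):                          # each symmetric pair visited once
            bits.append(1 if m[i, j] - m[n - 1 - i, n - 1 - j] > threshold else 0)
    return np.array(bits, dtype=np.uint8)
```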
Next the invention carries out feature matching, i.e., establishing correspondences between feature descriptors, which is a key link in achieving initial registration. The invention performs feature matching by the following method:
Step 3, the feature points of S_j are extracted to form the set P_j, where p_j^u denotes the u-th feature point of point cloud S_j; these feature points are converted into binary strings, forming the descriptor set F_j, where f_j^u denotes the u-th descriptor in F_j. The conversion process is the same as in step 2 and is not repeated here.
Step 4, the feature points common to P_i and P_j are found:
Step 4.1, in F_j, the descriptor closest to f_i^v and the descriptor second closest to f_i^v are found; when the ratio d_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9 (set to 0.75 in the following embodiment), the closest descriptor is saved and step 4.2 is performed; otherwise, no descriptor matching f_i^v has been found in F_j, i.e., no point matching that feature point of P_i is found in P_j, and step 4.1 is repeated. Here d_1 is the ratio of the distance between f_i^v and the closest descriptor in F_j to the distance between f_i^v and the second-closest descriptor in F_j.
Step 4.2, in F_i, the descriptor closest to the descriptor saved in step 4.1 and the descriptor second closest to it are found; when the ratio d_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9 (set to 0.85 in the following embodiment), the closest descriptor is saved and step 4.3 is performed; otherwise, step 4.1 is returned to. Here d_2 is the ratio of the distance between the saved descriptor and the closest descriptor in F_i to the distance between the saved descriptor and the second-closest descriptor in F_i.
Step 4.3, the Lp distance between the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2 is calculated; if the obtained Lp distance is larger than a threshold T_3, 0.75 mr ≤ T_3 ≤ 0.85 mr (set to 0.8 mr in the following embodiment), step 4.4 is performed; otherwise, step 4.1 is returned to.
Step 4.4, whether the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2 correspond to the same feature point is judged; if they correspond to the same feature point, a group of matched feature points is obtained; otherwise, step 4.1 is returned to.
Step 4.7, steps 4.1 to 4.4 are repeated until all mutually matched feature points of P_i and P_j are found.
Step 5, steps 2 to 4 are repeated until all point cloud data in the data set S have been traversed, giving all matched feature points, i.e., the complete correspondence set C; the new point cloud data set S' is then obtained by registering the point pairs in the set C.
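A sketch of the bidirectional distance-ratio matching of step 4 is given below, using Hamming distance on the binary descriptors. The Lp/T_3 test of step 4.3, whose threshold is expressed in units of mesh resolution, is not reproduced here; only the two ratio tests and the mutual-correspondence check are shown.

```python
import numpy as np

def bidirectional_ratio_match(F_i, F_j, T1=0.75, T2=0.85):
    """F_i, F_j: (N, L) binary descriptor arrays.  Forward and backward
    distance-ratio tests under Hamming distance, plus the check that both
    directions land on the same feature point."""
    def hamming(a, B):
        return np.count_nonzero(B != a, axis=1)

    matches = []
    for v in range(len(F_i)):
        d_fwd = hamming(F_i[v], F_j)
        a, b = np.argsort(d_fwd)[:2]
        if d_fwd[a] / max(d_fwd[b], 1) >= T1:            # forward ratio test (d_1 < T_1)
            continue
        d_bwd = hamming(F_j[a], F_i)
        c, e = np.argsort(d_bwd)[:2]
        if d_bwd[c] / max(d_bwd[e], 1) >= T2:            # backward ratio test (d_2 < T_2)
            continue
        if c == v:                                       # same feature point in both directions
            matches.append((v, a))
    return matches
```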
The invention also discloses a point cloud initial registration system based on the region curvature binary descriptor, which mainly comprises the following modules:
a data acquisition module, for acquiring a three-dimensional point cloud data set S of a measured object, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m]; S_i represents the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, m ≥ 2; m is the number of scans, and the acquisition equipment may be radar, laser scanners, Kinect devices and the like;
a feature point extraction module, for extracting the feature points of S_i to form the set P_i, where p_i^v denotes the v-th feature point of point cloud S_i, and extracting the feature points of S_j to form the set P_j, where p_j^u denotes the u-th feature point of point cloud S_j. Common feature point detection methods for this module include normal-vector-based feature point extraction, curvature-based feature point extraction, the LSP feature point detection method and the ISS feature point extraction method; the ISS feature point extraction method is preferred;
a feature point conversion module, for converting the feature points obtained by the feature point extraction module into binary strings, forming the descriptor sets F_i and F_j, where f_i^v denotes the v-th descriptor in F_i and f_j^u denotes the u-th descriptor in F_j. The conversion method may be the LBP method or one of its variants, such as MB-LBP, CLBP or LTP; preferably, the invention describes the point cloud features with a region curvature binary descriptor, specifically as follows.
For each feature point p_i^v, the following operations are performed:
The set of points lying within a radius r centered at the feature point p_i^v is obtained, and its covariance matrix D is calculated (equations (1) and (2)). The value range of r is determined by the size of the original point cloud data; generally 25 mr ≤ r ≤ 75 mr, and in the following embodiment r = 65 mr. In the formulas, d_i represents the distance between the feature point p_i^v and a neighborhood point q_i, and q_i denotes any point adjacent to p_i^v within the radius r. Preferably, weights are assigned to the neighborhood points, with larger weights for closer points, which increases the robustness of the descriptor and reduces the interference of noise and outliers on the feature description.
The three-dimensional local coordinate system of the feature point p_i^v is established according to the covariance matrix D, and p_i^v and all its adjacent points are then projected onto a two-dimensional plane. Specifically: the matrix D is eigendecomposed, DV = EV, where E is the diagonal eigenvalue matrix with eigenvalues {l_1, l_2, l_3} and V = {v_1, v_2, v_3} is the matrix of orthogonal eigenvectors. The feature point p_i^v is defined as the origin of the LRF (three-dimensional local reference frame); with the eigenvalues arranged in descending order, v_1 is taken as the X-axis direction and v_2 as the Y-axis direction, and the Z axis follows the right-hand rule Z = X × Y, which constructs the local coordinate system O_p-X_pY_pZ_p of the three-dimensional patch.
The two-dimensional plane is divided into 2L_s × 2L_s grid cells and the projected points are quantized into the corresponding cells, 2L_s being the number of divisions along each edge of the two-dimensional plane. Specifically: according to the mapping M, the adjacent points of p_i^v are projected onto the two-dimensional plane O_p'-X_p'Y_p', giving the projected points lying on that plane; the mapping M is defined as M(x, y, z) = (R cos θ, R sin θ) (3), where R is the distance between the adjacent point and the origin O_p of the local coordinate system O_p-X_pY_pZ_p, and θ = arctan(y, x) is the azimuth angle of the adjacent point. The projected points are further discretized by dividing the plane into 2L_s × 2L_s cells along the X' and Y' axes and quantizing each projected point into its corresponding cell; each cell is denoted B_in(x, y), with center p_s (equation (4)), where d_s is the side length of each cell, and projected points falling into the region of a cell are counted into that cell. The side length of the two-dimensional plane is 2L_s; taking the center of the square as the origin, a coordinate system is established, and the boundary of the plane intersects the coordinate axes at the four points (L_s, 0), (-L_s, 0), (0, L_s), (0, -L_s). Each small cell has side length d_s, and each of the x and y axes carries 2L_s + 1 coordinate points and 2L_s segments. The center p_s of a cell B_in(x, y) is determined by its x, y coordinates: for example, for the cell that is the h-th along the x axis and the g-th along the y axis, multiplying h and g by the cell side length gives the coordinates of the cell center.
The weighted curvature of the projected points in each cell is calculated, and all weighted curvatures are aggregated to form a curvature distribution matrix, which constitutes the curvature map. In the weighted curvature calculation, following the surface-curvature method of Zhou Weiguang et al., a Hessian matrix is computed from the neighboring surface formed by the adjacent points; its largest eigenvalue is the maximum principal curvature k_i of the feature point (equation (5)). The curvatures of the points falling into a cell B_in(x, y) are finally encoded into the curvature distribution (equation (6)), where w_i denotes the weight assigned to a projected point, emphasizing points closer to the center; the accompanying term is a normalization step; and the result represents the weighted curvature distribution of the two-dimensional plane. Regions of the two-dimensional plane containing no projected points are represented as ∞.
The obtained curvature map is converted into a binary string, yielding the descriptor corresponding to the feature point p_i^v. The conversion of the feature points of S_j into binary strings follows the same process and is not repeated here.
a feature point matching module, for finding the feature points common to P_i and P_j, which specifically comprises:
a forward search module, for finding in F_j the descriptor closest to f_i^v and the descriptor second closest to f_i^v; when the ratio d_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9, the closest descriptor is saved and the reverse search module is entered; otherwise, the forward search process is repeated; here d_1 is the ratio of the distance between f_i^v and the closest descriptor in F_j to the distance between f_i^v and the second-closest descriptor in F_j;
a reverse search module, for finding in F_i the descriptor closest to the descriptor saved by the forward search module and the descriptor second closest to it; when the ratio d_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9, the closest descriptor is saved and the Lp distance calculation module is entered; otherwise, the forward search module is returned to; here d_2 is the ratio of the distance between the saved descriptor and the closest descriptor in F_i to the distance between the saved descriptor and the second-closest descriptor in F_i;
an Lp distance calculation module, for calculating the Lp distance between the descriptor obtained by the forward search module and the descriptor obtained by the reverse search module; if the obtained Lp distance is larger than a threshold T_3, 0.75 mr ≤ T_3 ≤ 0.85 mr, the feature point pairing module is entered; otherwise, the forward search module is returned to;
a feature point pairing module, for judging whether the descriptor obtained by the forward search module and the descriptor obtained by the reverse search module correspond to the same feature point; if they do, a group of matched feature points is obtained; otherwise, the forward search module is returned to;
a repeated calculation module, for repeating the process from the forward search module to the feature point pairing module until all mutually matched feature points of P_i and P_j are found;
an initially registered point cloud data set S' generation module, for repeating the feature point extraction module through the feature point matching module until all point cloud data in the data set S have been traversed and all matched feature points are obtained, thereby obtaining the initially registered point cloud data set S'.
Preferably, the system of the invention further includes a data preprocessing module for preprocessing the point cloud data obtained by the data acquisition module, the preprocessing comprising filtering denoising and outlier removal. The point cloud is mainly filtered with a voxelized grid sampling method: first, the maximum extent of the point cloud data along each coordinate axis of the three-dimensional coordinate system is calculated, and the point cloud is then divided into several cubic grids according to its volume. After the grid is established, each grid cell is checked for point cloud data points; if a cell contains none, it is deleted directly. If a cell contains data points, its center point is calculated, a threshold is set so that the portion of data points closest to the center point is retained, and the remaining data points are deleted, yielding the preprocessed point cloud data set.
Example:
In this embodiment, a Terracotta Warrior fragment point cloud data set acquired from pit K9901 is used as experimental data, as shown in FIG. 3. Under the same point clouds and the same point cloud overlap rate, the proposed point cloud initial registration method (RCBD for short) is compared with an improved RANSAC algorithm and an improved F-4PCS algorithm. In this embodiment, the fine registration uses the iterative closest point (ICP) algorithm. FIG. 4 shows the obtained set of best matches for each fragment group, FIG. 5 shows the initial registration results of the three methods, and FIG. 6 shows the result of applying ICP fine registration to the initial registration of this embodiment. The results show that the registration accuracy of the method is as high as 98.77%, the root mean square error is reduced by 37% compared with the traditional methods, and the average registration time is reduced by 50%.
The respective specific technical features described in the above-described embodiments may be combined in any suitable manner without contradiction as long as they do not depart from the gist of the present invention, and should also be regarded as being disclosed in the present invention.

Claims (10)

1. The point cloud initial registration method based on the region curvature binary descriptor is characterized by comprising the following steps:
step 1, obtaining a three-dimensional point cloud data set S of a measured object, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m]; S_i represents the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, m ≥ 2;
step 2, extracting the feature points of S_i to form a set P_i, where p_i^v denotes the v-th feature point of point cloud S_i; converting these feature points into binary strings to form the descriptor set F_i, where f_i^v denotes the v-th descriptor in F_i;
step 3, extracting the feature points of S_j to form a set P_j, where p_j^u denotes the u-th feature point of point cloud S_j; converting these feature points into binary strings to form the descriptor set F_j, where f_j^u denotes the u-th descriptor in F_j;
step 4, finding the feature points common to P_i and P_j:
step 4.1, finding in F_j the descriptor closest to f_i^v and the descriptor second closest to f_i^v; when the ratio d_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9, saving the closest descriptor and performing step 4.2; otherwise, repeating step 4.1; d_1 is the ratio of the distance between f_i^v and the closest descriptor in F_j to the distance between f_i^v and the second-closest descriptor in F_j;
step 4.2, finding in F_i the descriptor closest to the descriptor saved in step 4.1 and the descriptor second closest to it; when the ratio d_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9, saving the closest descriptor and performing step 4.3; otherwise, returning to step 4.1; d_2 is the ratio of the distance between the saved descriptor and the closest descriptor in F_i to the distance between the saved descriptor and the second-closest descriptor in F_i;
step 4.3, calculating the Lp distance between the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2; if the obtained Lp distance is larger than a threshold T_3, 0.75 mr ≤ T_3 ≤ 0.85 mr, performing step 4.4; otherwise, returning to step 4.1;
step 4.4, judging whether the descriptor obtained in step 4.1 and the descriptor obtained in step 4.2 correspond to the same feature point; if they correspond to the same feature point, obtaining a group of matched feature points; otherwise, returning to step 4.1;
step 4.7, repeating steps 4.1 to 4.4 until all mutually matched feature points of P_i and P_j are found;
and step 5, repeating steps 2 to 4, traversing all point cloud data in the data set S to obtain all matched feature points, thereby obtaining the initially registered point cloud data set S'.
2. The method of claim 1, wherein in step 2 the process of converting the feature points into binary strings comprises performing the following operations for each feature point p_i^v:
step 2.1, obtaining the set of points lying within a radius r centered at the feature point p_i^v, 25 mr ≤ r ≤ 75 mr, and calculating the covariance matrix D of this set (equations (1) and (2)); in the formulas, d_i represents the distance between the feature point p_i^v and a neighborhood point q_i, and q_i denotes any point adjacent to p_i^v within the radius r;
step 2.2, establishing the three-dimensional local coordinate system of the feature point p_i^v according to the covariance matrix D, and then projecting p_i^v and all its adjacent points onto a two-dimensional plane;
step 2.3, dividing the two-dimensional plane into 2L_s × 2L_s grid cells and quantizing the projected points into the corresponding cells, 2L_s being the number of divisions along each edge of the two-dimensional plane;
step 2.4, calculating the weighted curvature of the projected points in each cell and aggregating all weighted curvatures to form a curvature distribution matrix, which constitutes the curvature map;
step 2.5, converting the curvature map of step 2.4 into a binary string to obtain the descriptor corresponding to the feature point p_i^v.
3. The method as claimed in claim 1, wherein the step 3 of converting the feature points into binary character strings is the same as the step 2.
4. The method as claimed in claim 1, wherein the step 2 and the step 3 use an ISS feature point extraction method to extract the feature points.
5. The method as claimed in claim 1, wherein in step 1, the obtained point cloud data is further preprocessed, and the preprocessing includes filtering denoising and outlier denoising.
6. A system for initial registration of point clouds based on a regional curvature binary descriptor, characterized by comprising the following modules:
a data acquisition module, configured to acquire a three-dimensional point cloud data set S of a measured object, S = [S_1, S_2, ..., S_i, ..., S_j, ..., S_m], wherein S_i denotes the i-th point cloud data, i = 1, 2, ..., m, j = 1, 2, ..., m, i ≠ j, and m ≥ 2;
a feature point extraction module, configured to extract the feature points of S_i to form a feature point set K_i, k_i^v denoting the v-th feature point of the point cloud S_i, and to extract the feature points of S_j to form a feature point set K_j, k_j^u denoting the u-th feature point of the point cloud S_j;
a feature point conversion module, configured to convert the feature points obtained by the feature point extraction module into binary character strings B_i and B_j, b_i^v denoting the v-th descriptor in B_i and b_j^u denoting the u-th descriptor in B_j;
a feature point matching module, configured to find the mutually matched feature points of K_i and K_j, and specifically comprising:
a positive sequence search module, configured to search B_j for the descriptor b_j^{u1} closest to b_i^v and the descriptor b_j^{u2} next closest to b_i^v; when the ratio ρ_1 of the two distances is below a threshold T_1, 0.6 ≤ T_1 ≤ 0.9, b_j^{u1} is retained and the reverse order search module is entered; otherwise, the positive sequence search is repeated;
ρ_1 = d(b_i^v, b_j^{u1}) / d(b_i^v, b_j^{u2})
in the formula, d(b_i^v, b_j^{u1}) denotes the distance between b_i^v and b_j^{u1}, and d(b_i^v, b_j^{u2}) denotes the distance between b_i^v and b_j^{u2};
a reverse order search module, configured to search B_i for the descriptor b_i^{v1} closest to the b_j^{u1} retained by the positive sequence search module and the descriptor b_i^{v2} second closest to it; when the ratio ρ_2 of the two distances is below a threshold T_2, 0.6 ≤ T_2 ≤ 0.9, b_i^{v1} is retained and the Lp distance calculation module is entered; otherwise, the process returns to the positive sequence search module;
ρ_2 = d(b_j^{u1}, b_i^{v1}) / d(b_j^{u1}, b_i^{v2})
in the formula, d(b_j^{u1}, b_i^{v1}) denotes the distance between b_j^{u1} and b_i^{v1}, and d(b_j^{u1}, b_i^{v2}) denotes the distance between b_j^{u1} and b_i^{v2};
an Lp distance calculation module, configured to calculate the Lp distance between the descriptor retained by the positive sequence search module and the descriptor retained by the reverse order search module; if the obtained Lp distance is greater than a threshold T_3, 0.75mr ≤ T_3 ≤ 0.85mr, the feature point pairing module is entered; otherwise, the process returns to the positive sequence search module;
a feature point pairing module, configured to judge whether the descriptor retained by the positive sequence search module and the descriptor retained by the reverse order search module correspond to the same feature point; if they correspond to the same feature point, a group of matched feature points is obtained; otherwise, the process returns to the positive sequence search module;
a repeated calculation module, configured to repeat the modules from the positive sequence search module to the feature point pairing module until all mutually matched feature points of K_i and K_j have been found;
and an initially registered point cloud data set S' generation module, configured to repeat the modules from the feature point extraction module to the feature point matching module until all point cloud data in the data set S have been traversed, obtaining all matched feature points and thereby the initially registered point cloud data set S'.
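Claim 6 repackages the method as a set of cooperating modules. A thin driver that wires those modules together might look like the sketch below; it reuses the hypothetical iss_keypoints, binary_descriptor, and match_descriptors functions from the earlier sketches, and the choice to pair consecutive clouds in S, as well as the ISS radii expressed in multiples of the mesh resolution mr, are assumptions, since the claim only requires that all point clouds in S be traversed.

```python
# A sketch of the claim-6 module pipeline, reusing the hypothetical iss_keypoints,
# binary_descriptor and match_descriptors sketches above; pairing consecutive clouds
# and the ISS radii (multiples of the mesh resolution mr) are assumptions.
import numpy as np

def register_dataset(S, r, mr, Ls=4):
    """S: list of (N, 3) point cloud arrays; returns matched feature point pairs per cloud pair."""
    all_matches = {}
    for i in range(len(S) - 1):                                  # S' generation module: traverse S
        j = i + 1
        # feature point extraction module
        Ki_idx = iss_keypoints(S[i], salient_radius=6 * mr, nms_radius=4 * mr)
        Kj_idx = iss_keypoints(S[j], salient_radius=6 * mr, nms_radius=4 * mr)
        K_i, K_j = S[i][Ki_idx], S[j][Kj_idx]
        # feature point conversion module
        B_i = [binary_descriptor(S[i], k, r, Ls) for k in K_i]
        B_j = [binary_descriptor(S[j], k, r, Ls) for k in K_j]
        # feature point matching module (positive sequence / reverse order search and checks)
        all_matches[(i, j)] = match_descriptors(B_i, B_j, K_i, K_j, mr=mr)
    return all_matches
```

The rigid transform that turns the matched pairs into the registered data set S' is not part of the modules quoted above and is therefore not sketched.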
7. The system for initial registration of point clouds based on a regional curvature binary descriptor according to claim 6, wherein the feature point conversion module is specifically configured to perform, for each feature point k, the following operations:
obtaining the set N(k) of points lying within a radius r centered at the feature point k, 25mr ≤ r ≤ 75mr, and calculating the covariance matrix D of N(k);
[the formula for the covariance matrix D is given as an image in the original document]
in the formula, d_i denotes the distance between the feature point k and a neighboring point q_i, and q_i denotes any point adjacent to k within the radius r;
establishing the three-dimensional local coordinate system of the feature point k according to the covariance matrix D, and projecting k and all of its neighboring points onto a two-dimensional plane;
dividing the two-dimensional plane into 2L_s × 2L_s grid cells and quantizing the projected points into the corresponding cells, where 2L_s denotes the number of divisions along each edge of the two-dimensional plane;
calculating the weighted curvature of the projected points in each grid cell, and aggregating all the weighted curvatures into a curvature distribution matrix, the curvature distribution matrix constituting the curvature map;
converting the obtained curvature map into a binary string, thereby obtaining the descriptor corresponding to the feature point k.
8. The system for initial registration of point clouds based on a regional curvature binary descriptor according to claim 6, wherein the conversion of each feature point k_i^v into a binary character string is performed in parallel with the conversion process of the feature points k_j^u.
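Claim 8 only states that the two descriptor conversions run in parallel; process-based parallelism over the two keypoint sets is one possible reading. The sketch below uses Python's concurrent.futures together with the hypothetical binary_descriptor function from the earlier sketch.

```python
# One possible reading of claim 8: convert the two keypoint sets into binary strings
# in parallel, here with two worker processes and the hypothetical binary_descriptor above.
from concurrent.futures import ProcessPoolExecutor

def describe_cloud(args):
    points, keypoints, r, Ls = args
    return [binary_descriptor(points, k, r, Ls) for k in keypoints]

def describe_in_parallel(S_i, K_i, S_j, K_j, r, Ls=4):
    # guard the call site with `if __name__ == "__main__":` on platforms that spawn processes
    with ProcessPoolExecutor(max_workers=2) as pool:
        B_i, B_j = pool.map(describe_cloud, [(S_i, K_i, r, Ls), (S_j, K_j, r, Ls)])
    return B_i, B_j
```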
9. The system as claimed in claim 6, wherein the feature point extraction module extracts feature points by using an ISS feature point extraction method.
10. The system as claimed in claim 6, further comprising a data preprocessing module for preprocessing the point cloud data obtained by the data acquisition module, wherein the preprocessing comprises filter-based denoising and outlier removal.
CN201911133616.5A 2019-11-19 2019-11-19 Point cloud initial registration method based on region curvature binary descriptor Active CN111127667B (en)

Publications (2)

Publication Number Publication Date
CN111127667A CN111127667A (en) 2020-05-08
CN111127667B (en) 2023-03-31

Family

ID=70495773




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant