CN115511935A - Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation - Google Patents

Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation

Info

Publication number
CN115511935A
CN115511935A (application CN202211323356.XA)
Authority
CN
China
Prior art keywords
point cloud
iterative
discretization
vector
pose estimation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211323356.XA
Other languages
Chinese (zh)
Inventor
李春泉
刘家风
苏志勇
姚凯文
杨大勇
陈利民
刘小平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang University
Original Assignee
Nanchang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang University
Priority: CN202211323356.XA
Publication: CN115511935A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/30 — Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/37 — Determination of transform parameters for the alignment of images, i.e. image registration, using transform domain methods
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10028 — Range image; Depth image; 3D point clouds
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20048 — Transform domain processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation includes: obtaining a source point cloud χ and a target point cloud Y; determining, according to the point cloud scale, the cube cell resolution sequence Φ required by iterative discretization; then selecting a cell resolution according to the current discretization iteration, dividing the target point cloud space into cube cells, and calculating the point cloud centroid μ and covariance matrix Σ in each cell; then traversing each source point x_k ∈ χ while iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H; then iteratively updating the pose estimation vector p by Newton's method; and finally updating the source point cloud χ and judging whether iteration must continue. After repeated iterations, the source point cloud χ is accurately registered into the space occupied by the target point cloud Y. The invention can complete point cloud registration quickly and accurately.

Description

Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation
Technical Field
The invention relates to the technical field of three-dimensional reconstruction and point cloud registration, and in particular to a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation.
Background
With the development of computer technology and three-dimensional machine vision, point cloud data are becoming ever easier to acquire, and point cloud processing tasks ever more numerous. In most point cloud processing tasks, point cloud registration is the basis of subsequent processing, because most existing devices such as lidars and depth cameras can only acquire local point clouds of objects or scenes; point cloud registration is then needed to register and splice multiple point cloud frames into a complete point cloud scene. Many scenarios require point cloud registration, for example: three-dimensional reconstruction, autonomous driving, mobile robot localization and mapping, medical image stitching, and so on.
Point cloud registration technology has broad application prospects and strong market demand in many industries. However, existing point cloud registration techniques suffer from various problems, for example: registration methods fail when the initial error between the point clouds to be registered is large; registration is slow when the point cloud scale is large; and registration accuracy is low when the point clouds to be registered overlap only slightly. Therefore, a point cloud registration method with good robustness, high accuracy, and high speed is of great practical significance.
Disclosure of Invention
The invention aims to provide a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation, addressing the low robustness and low speed of purely point-based registration methods (such as the ICP algorithm).
In order to achieve the above purpose, the present invention provides the following technical solutions.
The point cloud registration method based on normal distribution transformation with iterative discretization and linear interpolation of the invention comprises the following steps:

Step 1: obtain the source point cloud χ and the target point cloud Y, and determine the cube cell resolution sequence Φ required by iterative discretization according to the point cloud scale.

The most important parameter of a normal-distribution-transformation registration algorithm is the cube cell resolution: too large a resolution reduces registration accuracy, while too small a resolution cannot complete the registration task when the initial error between the two point cloud frames is large. The invention therefore adopts the iterative discretization method to perform multiple normal distribution transformation registrations at different resolutions, guaranteeing both the robustness and the accuracy of the registration. First, the source point cloud χ and the target point cloud Y are obtained; the final objective of the whole method is to register the source point cloud χ into the spatial coordinate frame of the target point cloud Y. The cube cell resolution sequence Φ required by iterative discretization is then determined from the scale of the acquired point clouds.

The cube cell resolution sequence Φ required by iterative discretization in step 1 is Φ = {φ₁, φ₂, …, φ_l}, where φ_a = φ_{a−1}/2. The first resolution φ₁ can generally be set to one tenth of the maximum extent of the target point cloud Y. The length l of the sequence Φ, i.e. the number of discretization iterations, can be taken as l = 2–5 according to the accuracy requirement; the higher the accuracy requirement, the larger the value.
Step 2: divide the target point cloud Y into cube cells based on the iterative discretization method, and calculate the point cloud centroid μ and covariance matrix Σ in each cube cell.

According to the cube cell resolution φ_i corresponding to the current discretization iteration, the space occupied by the target point cloud Y is divided into cube cells of equal size. All target points y_k ∈ Y are then traversed and assigned to the corresponding cube cells according to their three-dimensional coordinates. Because the linear-interpolation-based source point scoring function s̃ used later requires the centroids μ and covariance matrices Σ of the eight cube cells adjacent to a point, and because the inverse of the covariance matrix Σ may not exist when a cell contains fewer than 5 points, only the centroid μ and covariance matrix Σ of cells containing at least 5 points need to be calculated here. Furthermore, before proceeding to step 3, the pose estimation vector p must be initialized.

The centroid μ and the covariance matrix Σ of a cube cell are calculated as:

μ = (1/m) ∑_{k=1}^{m} y_k

Σ = (1/(m−1)) ∑_{k=1}^{m} (y_k − μ)(y_k − μ)ᵀ

where y_k (k = 1, …, m) are the target points located in the cube cell, and m ≥ 5.

The pose estimation vector uses 3 variables to represent displacement and 3 variables to represent rotation, i.e. the pose estimate is encoded as the six-dimensional parameter vector p = (t_x, t_y, t_z, φ_x, φ_y, φ_z)ᵀ, where t_x, t_y, t_z are the displacement values in the x, y, z directions and φ_x, φ_y, φ_z are the corresponding Euler angles.
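The cell division and per-cell statistics of step 2 can be sketched as follows (Python/NumPy; indexing cells by floored integer coordinates and using the sample covariance with m − 1 are assumptions of this sketch):

```python
import numpy as np
from collections import defaultdict

def cell_statistics(target_cloud, phi, min_points=5):
    """Assign each target point to a cube cell of side length phi and
    compute the centroid mu and covariance Sigma per cell. Cells with
    fewer than min_points points are skipped, since Sigma could then
    be singular."""
    cells = defaultdict(list)
    for pt in target_cloud:
        idx = tuple(np.floor(pt / phi).astype(int))  # integer cell index
        cells[idx].append(pt)
    stats = {}
    for idx, pts in cells.items():
        if len(pts) < min_points:
            continue
        pts = np.asarray(pts)
        mu = pts.mean(axis=0)
        diff = pts - mu
        stats[idx] = (mu, diff.T @ diff / (len(pts) - 1))
    return stats

rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3)) * 0.2 + 1.0   # tight cluster near (1, 1, 1)
stats = cell_statistics(cloud, phi=2.0)         # points cluster in cell (0, 0, 0)
```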
Step 3: traverse each source point x_k ∈ χ, iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H.

Before traversing the source point cloud χ, initialize the pose estimation vector p and set the score function s(p) = 0, the gradient vector g = 0, the Hessian matrix H = 0, and the total score = 0. Then, for each source point x_k ∈ χ: first find the eight cube cells near the transformed point x′_k = T(p, x_k) obtained through the pose transformation function; then calculate the score of that point with the linear-interpolation-based source point scoring function s̃, add the score to the total score, and update the score function s(p), its gradient vector g, and the Hessian matrix H; then proceed iteratively to the next source point.
The pose transformation function T(p, x) is a rigid-body transformation, i.e. the point x is transformed according to the pose estimation vector p. It is calculated as:

T(p, x) = R x + t

where R is the rotation matrix composed from the Euler angles φ_x, φ_y, φ_z and t = (t_x, t_y, t_z)ᵀ is the translation vector.
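A minimal version of the pose transformation function T(p, x) in Python/NumPy. The Z-Y-X Euler composition used here is an assumption of this sketch; the patent text does not fix the rotation order:

```python
import numpy as np

def pose_transform(p, x):
    """Rigid-body transform T(p, x) = R x + t for the six-dimensional
    pose vector p = (tx, ty, tz, phi_x, phi_y, phi_z)."""
    tx, ty, tz, ax, ay, az = p
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx @ np.asarray(x, dtype=float) + np.array([tx, ty, tz])

# A pure translation moves the origin to t
y = pose_transform([1.0, 2.0, 3.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0])
```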
the source point cloud scoring function based on linear interpolation
Figure BDA00039114192400000231
In order to solve the problem that the surfaces of the edges of the cubic cells are discontinuous, the traditional point cloud registration method based on normal distribution transformation only uses the cells where the point cloud is located to score, and when the point cloud is located at the edge of the cell, the scoring error is large. Therefore, the invention adopts a source point cloud scoring function based on linear interpolation
Figure BDA0003911419240000031
The point cloud scoring is more accurate, and the scoring function
Figure BDA0003911419240000032
The calculation formula of (c) is:
Figure BDA0003911419240000033
wherein,
Figure BDA0003911419240000034
indicating the point cloud to be scored, and subscript b indicating the point cloud
Figure BDA0003911419240000035
The b-th cell of the eight cells in the vicinity,
Figure BDA0003911419240000036
weight function representing linear interpolation, d 1 And d 2 The calculation formula of (2) is as follows:
d 1 =-log(c 1 +c 2 )+log(c 2 )
Figure BDA0003911419240000037
wherein, constant c 1 ,c 2 Can be controlled by the requirement of the space spanned by one cell
Figure BDA0003911419240000038
Is equal to 1.
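Given c₁ and c₂, the constants d₁ and d₂ follow directly from the two formulas above; a small sketch (the function name is illustrative; in practice c₁, c₂ come from the normalization requirement just stated):

```python
import math

def ndt_constants(c1, c2):
    """d1 and d2 from the cell constants c1, c2 (step 3 formulas)."""
    d1 = -math.log(c1 + c2) + math.log(c2)
    d2 = -2.0 * math.log(
        (-math.log(c1 * math.exp(-0.5) + c2) + math.log(c2)) / d1)
    return d1, d2

# The embodiment later in the patent uses c1 = 3, c2 = 5
d1, d2 = ndt_constants(3.0, 5.0)
```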
The score function s(p) of the pose estimation vector p is:

s(p) = ∑_k s̃(T(p, x_k))

where x_k is a point in the source point cloud χ. Writing x̃_kb = T(p, x_k) − μ_b, the components of the gradient vector g of the score function s(p) and of the Hessian matrix H in each direction of p are:

g_i = ∑_k ∑_b w_b d₁ d₂ x̃ᵀ_kb Σ_b⁻¹ (∂x̃_kb/∂p_i) exp(−(d₂/2) x̃ᵀ_kb Σ_b⁻¹ x̃_kb)

H_ij = ∑_k ∑_b w_b d₁ d₂ exp(−(d₂/2) x̃ᵀ_kb Σ_b⁻¹ x̃_kb) [ −d₂ (x̃ᵀ_kb Σ_b⁻¹ ∂x̃_kb/∂p_i)(x̃ᵀ_kb Σ_b⁻¹ ∂x̃_kb/∂p_j) + x̃ᵀ_kb Σ_b⁻¹ ∂²x̃_kb/(∂p_i ∂p_j) + (∂x̃_kb/∂p_j)ᵀ Σ_b⁻¹ ∂x̃_kb/∂p_i ]

where p_i and p_j are the components of the pose estimation vector p in each direction.
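The gradient formula can be sanity-checked against a finite difference of the score. Below is a single-cell sketch (interpolation weight w_b = 1 and only the translation components, for which ∂x̃/∂t is the identity; all names and test values are illustrative):

```python
import numpy as np

def score_point(x, mu, sigma_inv, d1, d2):
    """Per-point score -d1 * exp(-(d2/2) * x~^T Sigma^-1 x~) for one cell."""
    xt = x - mu
    return -d1 * np.exp(-0.5 * d2 * (xt @ sigma_inv @ xt))

def grad_translation(x, mu, sigma_inv, d1, d2):
    """Score gradient w.r.t. the three translation components (dx~/dt = I)."""
    xt = x - mu
    e = np.exp(-0.5 * d2 * (xt @ sigma_inv @ xt))
    return d1 * d2 * (sigma_inv @ xt) * e

d1, d2 = -0.47, 0.83                       # approximate values for c1 = 3, c2 = 5
mu = np.array([0.5, 0.0, 0.0])
sigma_inv = np.linalg.inv(np.diag([0.04, 0.09, 0.09]))
x = np.array([0.3, 0.1, -0.1])

g = grad_translation(x, mu, sigma_inv, d1, d2)
# central finite difference of the score along the first translation axis
eps = 1e-6
num = (score_point(x + np.array([eps, 0, 0]), mu, sigma_inv, d1, d2)
       - score_point(x - np.array([eps, 0, 0]), mu, sigma_inv, d1, d2)) / (2 * eps)
```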
and 4, step 4: iterative updating of pose estimate vectors using newton's method
Figure BDA00039114192400000320
The whole method aims to find a pose estimation vector
Figure BDA00039114192400000321
So that
Figure BDA00039114192400000322
Score function of
Figure BDA00039114192400000331
And max. For this objective function, an iterative solution can be made using newton's method: according to the gradient vector obtained in step 3
Figure BDA00039114192400000324
And Hessian matrix H, solving the equation
Figure BDA00039114192400000332
Obtaining an increment of pose estimation
Figure BDA00039114192400000326
Then updating the pose estimation vector
Figure BDA00039114192400000327
And finally, judging whether the convergence condition is reached, if the convergence condition is reached, continuing to move downwards, and if not, returning to the step 3 to continue to perform iteration.
The convergence condition is the increment of pose estimation
Figure BDA00039114192400000328
Less than the expected value or more than a set value.
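One Newton update of step 4 can be sketched as follows (Python/NumPy; the diagonal fallback for a singular Hessian is a common safeguard assumed here, not taken from the patent):

```python
import numpy as np

def newton_step(H, g, p):
    """Solve H * dp = -g and return (p + dp, dp)."""
    H = np.asarray(H, dtype=float)
    g = np.asarray(g, dtype=float)
    try:
        dp = np.linalg.solve(H, -g)
    except np.linalg.LinAlgError:
        # regularize a (near-)singular Hessian
        dp = np.linalg.solve(H + 1e-6 * np.eye(H.shape[0]), -g)
    return np.asarray(p, dtype=float) + dp, dp

# For a quadratic score s(p) = -||p - p_star||^2 (H = -2I, g = -2(p - p_star)),
# Newton's method finds the maximum in a single step.
p_star = np.array([1.0, -2.0, 0.5, 0.0, 0.0, 0.0])
p0 = np.zeros(6)
p1, dp = newton_step(-2.0 * np.eye(6), -2.0 * (p0 - p_star), p0)
```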
Step 5: update the source point cloud χ and judge whether iteration must continue.

After the pose estimation vector p has been obtained in step 4, update the source point cloud χ using the pose transformation function T(p, ·). Then judge whether the current number of discretization iterations is smaller than the length l of the cube cell resolution sequence Φ: if so, return to step 2 and continue the iterative execution; if not, end the algorithm, with the source point cloud χ accurately registered into the space of the target point cloud Y.
Compared with the prior art, the method adopts a point cloud registration algorithm based on normal distribution transformation, which greatly accelerates registration and improves its robustness across many different scenes. To address the inability of the classical normal distribution transformation registration algorithm to achieve both accuracy and robustness, the method improves the registration through discrete iteration: a large cell resolution guarantees the robustness of the initial registration, while a small cell resolution guarantees the accuracy of the subsequent registration. In addition, to address the discontinuity of the score at cell boundaries in the classical algorithm, the invention scores over the eight cells near each point by linear interpolation, improving scoring accuracy and hence registration accuracy.
Drawings
Fig. 1 is a schematic flow diagram of a point cloud registration method based on iterative discretization and linear interpolation for normal distribution transformation provided by the present invention.
Fig. 2 is (a) a top view, (b) a front view, and (c) an oblique view of an input point cloud used in an embodiment of the normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation proposed in the present invention.
Fig. 3 is (a) a top view, (b) a front view, and (c) an oblique view of a point cloud registration result of an embodiment of a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation proposed in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the invention is described in further detail below with reference to the accompanying drawings and the embodiments. The embodiments described herein serve only to explain the technical solution of the invention and are not intended to limit it.
The invention provides the following technical scheme. The specific flow of the normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation is shown in fig. 1; the method comprises the following steps:

Step 1: obtain the source point cloud χ and the target point cloud Y, and determine the cube cell resolution sequence Φ required by iterative discretization according to the point cloud scale.

First, the source point cloud χ and the target point cloud Y must be input. The point cloud scene of this embodiment was obtained by scanning inside a mine tunnel with a lidar scanner; the tunnel is 4.5 m wide, 4 m high, and about 17 m long. The initial angular deviation between the source point cloud χ and the target point cloud Y is 13°, and the displacement deviation is 0.96 m; the initial state of the point clouds is shown in fig. 2. The cube cell resolution sequence Φ = {φ₁, φ₂, …, φ_l} required by iterative discretization must then be determined from the scale of the input point clouds, where φ_a = φ_{a−1}/2 and the first resolution φ₁ can generally be set to one tenth of the maximum extent of the target point cloud Y. The length l of the sequence Φ, i.e. the number of discretization iterations, can be taken as l = 2–5 according to the accuracy requirement; the higher the accuracy requirement, the larger the value. The maximum extent of the target point cloud Y input here is about 20 m, so the first resolution φ₁ is set to 2 m and the length l of the resolution sequence Φ to 4, giving Φ = {2, 1, 0.5, 0.25}.
Step 2: divide the target point cloud Y into cube cells based on the iterative discretization method, and calculate the point cloud centroid μ and covariance matrix Σ in each cube cell.

According to the cube cell resolution φ_i corresponding to the current discretization iteration, the space occupied by the target point cloud Y is divided into cube cells of equal size. All target points y_k ∈ Y are then traversed and assigned to the corresponding cube cells according to their three-dimensional coordinates. Because the linear-interpolation-based source point scoring function s̃ used later requires the centroids μ and covariance matrices Σ of the eight cube cells adjacent to a point, and because the inverse of Σ may not exist when a cell contains fewer than 5 points, only cells containing at least 5 points are processed. The centroid μ and covariance matrix Σ are calculated as:

μ = (1/m) ∑_{k=1}^{m} y_k

Σ = (1/(m−1)) ∑_{k=1}^{m} (y_k − μ)(y_k − μ)ᵀ

where y_k (k = 1, …, m) are the target points located in the cube cell, and m ≥ 5. The pose estimation vector uses 3 variables to represent displacement and 3 variables to represent rotation, i.e. the pose estimate is encoded as the six-dimensional parameter vector p = (t_x, t_y, t_z, φ_x, φ_y, φ_z)ᵀ, where t_x, t_y, t_z are the displacement values in the x, y, z directions and φ_x, φ_y, φ_z are the corresponding Euler angles; finally, the pose estimation vector p must therefore be initialized.
Step 3: traverse each source point x_k ∈ χ, iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H.

(a) Before traversing the source point cloud χ, initialize the pose estimation vector p and set the score function s(p) = 0, the gradient vector g = 0, the Hessian matrix H = 0, and the total score = 0. Then traverse each source point x_k ∈ χ.

(b) First, find the eight cube cells near the transformed point x′_k = T(p, x_k) obtained through the pose transformation function. The pose transformation function is calculated as:

T(p, x) = R x + t

where R is the rotation matrix composed from the Euler angles φ_x, φ_y, φ_z and t = (t_x, t_y, t_z)ᵀ is the translation vector.

(c) Then calculate the score of the point according to the linear-interpolation-based source point scoring function s̃ and add it to the total score. The scoring function is calculated as:

s̃(x) = −∑_{b=1}^{8} w_b d₁ exp(−(d₂/2)(x − μ_b)ᵀ Σ_b⁻¹ (x − μ_b))

where x is the point to be scored, the subscript b denotes the b-th of the eight cells near the point, and w_b is the weight function of the linear interpolation. d₁ and d₂ are calculated as:

d₁ = −log(c₁ + c₂) + log(c₂)

d₂ = −2 log((−log(c₁ exp(−1/2) + c₂) + log(c₂)) / d₁)

where the constants c₁, c₂ can be determined by requiring that the integral of s̃ over the space spanned by one cell equals 1; in this embodiment c₁ = 3 and c₂ = 5.

(d) Then update the score function s(p), its gradient vector g, and the Hessian matrix H. The score function of the pose estimation vector p is:

s(p) = ∑_k s̃(T(p, x_k))

where x_k is a point in the source point cloud χ. Writing x̃_kb = T(p, x_k) − μ_b, the components of the gradient vector g and of the Hessian matrix H in each direction of p are:

g_i = ∑_k ∑_b w_b d₁ d₂ x̃ᵀ_kb Σ_b⁻¹ (∂x̃_kb/∂p_i) exp(−(d₂/2) x̃ᵀ_kb Σ_b⁻¹ x̃_kb)

H_ij = ∑_k ∑_b w_b d₁ d₂ exp(−(d₂/2) x̃ᵀ_kb Σ_b⁻¹ x̃_kb) [ −d₂ (x̃ᵀ_kb Σ_b⁻¹ ∂x̃_kb/∂p_i)(x̃ᵀ_kb Σ_b⁻¹ ∂x̃_kb/∂p_j) + x̃ᵀ_kb Σ_b⁻¹ ∂²x̃_kb/(∂p_i ∂p_j) + (∂x̃_kb/∂p_j)ᵀ Σ_b⁻¹ ∂x̃_kb/∂p_i ]

where p_i and p_j are the components of the pose estimation vector p in each direction.

(e) Continue by iteratively calculating the next source point.
Step 4: iteratively update the pose estimation vector p using Newton's method.

The aim of the whole method is to find a pose estimation vector p such that the score function s(p) is maximal. This objective function can be solved iteratively using Newton's method: from the gradient vector g and the Hessian matrix H obtained in step 3, solve the equation H Δp = −g to obtain the pose estimation increment Δp, then update the pose estimation vector p ← p + Δp. Finally, judge whether the convergence condition is reached: if so, continue downward; otherwise, return to step 3 and continue the iterative execution. The convergence condition is that the current pose estimation increment Δp is smaller than the expected value (set to 0.000001 in this embodiment).
Step 5: update the source point cloud χ and judge whether iteration must continue.

After the pose estimation vector p has been obtained in step 4, update the source point cloud χ using the pose transformation function T(p, ·). Then judge whether the current number of discretization iterations is smaller than the length l of the cube cell resolution sequence Φ: if so, return to step 2 and continue the iterative execution; if not, end the algorithm, with the source point cloud χ accurately registered into the space of the target point cloud Y. The final registration effect is shown in fig. 3.
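For intuition, the embodiment's loop can be condensed into a translation-only toy in which a single global Gaussian stands in for the interpolated cell grid; this is a deliberate simplification for illustration, not the patent's full method:

```python
import numpy as np

rng = np.random.default_rng(1)
target = rng.normal(size=(2000, 3))
true_t = np.array([0.3, -0.2, 0.1])
source = target - true_t                 # source + true_t aligns with target

mu = target.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(target.T))
d1, d2 = -0.47, 0.83                     # approximate constants for c1 = 3, c2 = 5

t = np.zeros(3)                          # translation-only pose estimate
for _ in range(30):
    g = np.zeros(3)
    H = np.zeros((3, 3))
    for x in source:
        xt = x + t - mu
        ci_xt = cov_inv @ xt
        e = np.exp(-0.5 * d2 * (xt @ ci_xt))
        g += d1 * d2 * ci_xt * e                                     # gradient
        H += d1 * d2 * e * (cov_inv - d2 * np.outer(ci_xt, ci_xt))   # Hessian
    dp = np.linalg.solve(H, -g)          # Newton step: H dp = -g
    t = t + dp
    if np.linalg.norm(dp) < 1e-9:        # convergence condition
        break
```

The recovered t lands close to true_t; the full method repeats such Newton iterations over progressively finer cell resolutions and with per-cell, linearly interpolated Gaussians.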
The above description is only an embodiment of the present patent and should not be interpreted as limiting its scope. It should be noted that various changes, modifications and substitutions may be made by those skilled in the art without departing from the spirit and principles of the invention; these also fall within the scope of protection defined by the appended claims and their equivalents.

Claims (1)

1. A point cloud registration method based on iteration discretization and linear interpolation for normal distribution transformation is characterized by comprising the following steps:
step 1: obtaining a source point cloud
Figure FDA00039114192300000129
And target point cloud
Figure FDA00039114192300000121
Determining a cube cell resolution sequence phi required by iterative discretization according to the point cloud scale;
first, a source point cloud is obtained
Figure FDA00039114192300000130
With a target point cloud
Figure FDA00039114192300000122
Determining a cube cell resolution sequence phi required by iterative discretization according to the obtained point cloud scale, and then performing point cloud registration of multiple normal distribution transformations by adopting an iterative discretization method at different resolutions;
the resolution sequence phi, phi = { phi = of cubic cells required by iterative discretization 1 ,φ 2 ,...,φ l In which phi a =φ a-1 2, first resolution of the sequence φ 1 Is set as the target point cloud
Figure FDA00039114192300000131
One tenth of the maximum pitch of (1); the length l of the cube cell resolution sequence phi, namely the value of iterative computation times of iterative discretization, is 1 = 2-5, and the higher the precision requirement is, the larger the value is;
and 2, step: target point cloud based on iterative discretization method
Figure FDA00039114192300000132
Dividing the cubic cells, and calculating the point cloud mass center mu and the covariance matrix sigma in each cubic cell;
according to the cube cell resolution phi corresponding to the iteration number of the current iteration discretization i The target point cloud
Figure FDA00039114192300000133
The occupied space is divided into cubic cells with equal size; then traversing all the target point clouds
Figure FDA00039114192300000123
Classifying the point clouds into corresponding cubic cells according to three-dimensional coordinates; calculating the mass center mu and the covariance matrix sigma of the cubic cells with the point cloud number not less than 5, and initializing pose estimation vectors
Figure FDA0003911419230000011
The calculation formula of the centroid mu and the covariance matrix sigma of the cubic unit grid is as follows:
Figure FDA0003911419230000012
Figure FDA0003911419230000013
wherein,
Figure FDA0003911419230000014
representing the target point cloud in the cubic cell, wherein m is more than or equal to 5;
the pose estimation vector uses 3 variables to represent displacement, 3 variables to represent rotation and a six-dimensional parameter vector
Figure FDA0003911419230000015
Estimated vector of alignment posture
Figure FDA0003911419230000016
Performing coding with t x ,t y ,t z The displacement values in the x, y and z directions,
Figure FDA00039114192300000135
respectively corresponding euler angles;
and step 3: traversing each source point cloud
Figure FDA00039114192300000124
Iteratively updating pose estimation vectors
Figure FDA0003911419230000017
Score function of (2)
Figure FDA0003911419230000018
Gradient vector of
Figure FDA0003911419230000019
And a Hessian matrix H;
traversing a cloud of source points
Figure FDA00039114192300000134
Firstly, initializing the pose estimation vector
Figure FDA00039114192300000110
Score function of (2)
Figure FDA00039114192300000111
Gradient vector of
Figure FDA00039114192300000112
And Hessian matrix H =0 and total score =0; then traversing each source point cloud
Figure FDA00039114192300000125
First, find the source point cloud
Figure FDA00039114192300000113
Transformed function of pose
Figure FDA00039114192300000114
Eight cubic cells near the converted point; then according to a source point cloud scoring function based on linear interpolation
Figure FDA00039114192300000115
Calculate the score for that point and add the score to the total score and update the score function
Figure FDA00039114192300000116
Gradient vector of (2)
Figure FDA00039114192300000117
And a Hessian matrix H, and then iteratively calculating the next source point cloud;
the pose transformation function
Figure FDA00039114192300000118
For rigid body transformation process, point clouds
Figure FDA00039114192300000126
Estimating vector according to position
Figure FDA00039114192300000119
Performing rigid body transformation and pose transformation functions
Figure FDA00039114192300000120
The calculation formula is as follows:
Figure FDA0003911419230000021
wherein,
Figure FDA0003911419230000022
the linear-interpolation-based source point scoring function $\tilde{s}$ is computed as

$$\tilde{s}(\vec{x}') = -\sum_{b=1}^{8} w_b(\vec{x}')\, d_1 \exp\!\left(-\frac{d_2}{2}\,(\vec{x}' - \vec{\mu}_b)^{T}\,\Sigma_b^{-1}\,(\vec{x}' - \vec{\mu}_b)\right)$$

where $\vec{x}'$ denotes the point to be scored, the subscript $b$ indexes the $b$-th of the eight cells near $\vec{x}'$ ($\vec{\mu}_b$ and $\Sigma_b$ being that cell's mean vector and covariance matrix), and $w_b(\vec{x}')$ is the linear-interpolation weight function; $d_1$ and $d_2$ are computed as

$$d_1 = -\log(c_1 + c_2) + \log(c_2)$$

$$d_2 = -2\log\!\left(\frac{-\log\!\left(c_1 e^{-1/2} + c_2\right) + \log(c_2)}{d_1}\right)$$

where the constants $c_1$ and $c_2$ can be determined by requiring that the probability mass over the space spanned by one cell is equal to 1;
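The constants and the interpolated score can be sketched as follows; the `(weight, mean, inverse covariance)` cell layout and the example values of $c_1, c_2$ are illustrative assumptions rather than the patent's interface, and $d_2$ follows the standard NDT normalization since the source gives it only as an image formula:

```python
import numpy as np

def ndt_constants(c1, c2):
    # d1 as stated in the claim; d2 per the standard NDT normalization.
    d1 = -np.log(c1 + c2) + np.log(c2)
    d2 = -2.0 * np.log((-np.log(c1 * np.exp(-0.5) + c2) + np.log(c2)) / d1)
    return d1, d2

def interp_score(xp, cells, d1, d2):
    """Linearly interpolated score of a transformed point xp.

    cells: iterable of (weight, mean, inv_cov) for the cubic cells
    nearest to xp (eight in the 3-D trilinear case).
    """
    s = 0.0
    for w, mu, inv_cov in cells:
        q = np.asarray(xp, dtype=float) - mu
        s += -w * d1 * np.exp(-0.5 * d2 * q @ inv_cov @ q)
    return s
```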
the score function $s(\vec{p})$ of the pose estimation vector $\vec{p}$ is

$$s(\vec{p}) = \sum_{k=1}^{n} \tilde{s}\big(T(\vec{p}, \vec{x}_k)\big)$$

where $\vec{x}_k$ is a point of the source point cloud $X$ and $n$ is the number of source points; the components of the gradient vector $\vec{g}$ and the Hessian matrix $H$ of the score function $s(\vec{p})$ in each direction of $\vec{p}$ are

$$g_i = \frac{\partial s(\vec{p})}{\partial p_i}, \qquad H_{ij} = \frac{\partial^2 s(\vec{p})}{\partial p_i \partial p_j}$$

where $p_i$ and $p_j$ are the components of the pose estimation vector $\vec{p}$ in each direction;
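Because the gradient and Hessian components are partial derivatives of the score, a finite-difference check is a convenient way to validate an implementation; the sketch below is a generic numerical differentiator, not the patent's closed-form derivatives:

```python
import numpy as np

def numeric_grad_hessian(s, p, h=1e-5):
    """Central-difference gradient and Hessian of a scalar score s(p)."""
    p = np.asarray(p, dtype=float)
    n = len(p)
    g = np.zeros(n)
    H = np.zeros((n, n))
    for i in range(n):
        e_i = np.zeros(n); e_i[i] = h
        g[i] = (s(p + e_i) - s(p - e_i)) / (2 * h)
        for j in range(n):
            e_j = np.zeros(n); e_j[j] = h
            # standard 4-point central difference for d^2 s / dp_i dp_j
            H[i, j] = (s(p + e_i + e_j) - s(p + e_i - e_j)
                       - s(p - e_i + e_j) + s(p - e_i - e_j)) / (4 * h * h)
    return g, H
```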
Step 4: iteratively update the pose estimation vector $\vec{p}$ using Newton's method; according to the gradient vector $\vec{g}$ and the Hessian matrix $H$ obtained in step 3, solve the equation

$$H\,\Delta\vec{p} = -\vec{g}$$

to obtain the pose estimation increment $\Delta\vec{p}$, then update the pose estimation vector as $\vec{p} \leftarrow \vec{p} + \Delta\vec{p}$; finally, judge whether the convergence condition is reached: if it is, continue to step 5, otherwise return to step 3 and continue the iteration;

the convergence condition is that the pose estimation increment $\Delta\vec{p}$ is smaller than an expected value, or that the number of iterations exceeds a set threshold;
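A single Newton update as described above can be sketched as follows; the small diagonal damping term is an addition of this sketch (not in the claim) to keep the solve well-posed when $H$ is near-singular:

```python
import numpy as np

def newton_update(p, g, H, damping=1e-6):
    """One Newton step: solve H * dp = -g, then update the pose vector."""
    H_reg = H + damping * np.eye(len(p))          # optional regularization
    dp = np.linalg.solve(H_reg, -np.asarray(g, dtype=float))
    return np.asarray(p, dtype=float) + dp, dp
```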
Step 5: update the source point cloud $X$ and judge whether the iteration needs to continue;

after the pose estimation vector $\vec{p}$ is obtained in step 4, update the source point cloud $X$ with the pose transformation function $T(\vec{p}, \cdot)$, and then judge whether the current number of iterative discretizations is less than the length $l$ of the cubic cell resolution sequence $\Phi$: if it is, return to step 2 and continue the iteration; if not, the algorithm ends, and the source point cloud $X$ has been accurately registered into the space of the target point cloud $Y$.
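The overall flow of steps 2-5 can be sketched as a two-level loop; here `step3` is a stand-in callable for the discretization-and-scoring stage, an illustrative decomposition rather than the patent's interface:

```python
import numpy as np

def ndt_register(p0, resolutions, step3, eps=1e-6, max_iter=30):
    """Skeleton of steps 2-5: one Newton loop per cell resolution in Phi.

    step3(p, resolution) must return (score, gradient, Hessian); it stands
    in for the discretization (step 2) and scoring (step 3) stages.
    """
    p = np.asarray(p0, dtype=float)
    for res in resolutions:               # step 2: next iterative discretization
        for _ in range(max_iter):         # steps 3-4: Newton iterations
            score, g, H = step3(p, res)
            dp = np.linalg.solve(H, -np.asarray(g, dtype=float))  # H dp = -g
            p = p + dp
            if np.linalg.norm(dp) < eps:  # step 4 convergence condition
                break
        # step 5: the source cloud would be updated with T(p, .) here
    return p
```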
CN202211323356.XA 2022-10-27 2022-10-27 Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation Pending CN115511935A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211323356.XA CN115511935A (en) 2022-10-27 2022-10-27 Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation


Publications (1)

Publication Number Publication Date
CN115511935A true CN115511935A (en) 2022-12-23

Family

ID=84513134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211323356.XA Pending CN115511935A (en) 2022-10-27 2022-10-27 Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation

Country Status (1)

Country Link
CN (1) CN115511935A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116572253A (en) * 2023-06-29 2023-08-11 深圳技术大学 Grabbing control method and device for test tube
CN116572253B (en) * 2023-06-29 2024-02-20 深圳技术大学 Grabbing control method and device for test tube

Similar Documents

Publication Publication Date Title
CN110116407B (en) Flexible robot position and posture measuring method and device
Garro et al. Solving the pnp problem with anisotropic orthogonal procrustes analysis
CN101907459B (en) Monocular video based real-time posture estimation and distance measurement method for three-dimensional rigid body object
JP5627325B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
CN110930495A (en) Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium
CN106056643B (en) A kind of indoor dynamic scene SLAM method and system based on cloud
CN105869136A (en) Collaborative visual SLAM method based on multiple cameras
Buch et al. Prediction of ICP pose uncertainties using Monte Carlo simulation with synthetic depth images
CN110930444B (en) Point cloud matching method, medium, terminal and device based on bilateral optimization
CN112154429A (en) High-precision map positioning method, system, platform and computer readable storage medium
CN115511935A (en) Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation
CN113298870A (en) Object posture tracking method and device, terminal equipment and storage medium
CN114310901A (en) Coordinate system calibration method, apparatus, system and medium for robot
CN104615880A (en) Rapid ICP (inductively coupled plasma) method for point cloud matching of three-dimensional laser radar
CN115345934A (en) Laser positioning and mapping method and system based on gradient factor
CN113313200B (en) Point cloud precision matching method based on normal constraint
CN113658194B (en) Point cloud splicing method and device based on reference object and storage medium
CN117237428B (en) Data registration method, device and medium for three-dimensional point cloud
Choi et al. Fast and versatile feature-based lidar odometry via efficient local quadratic surface approximation
Dinc et al. Mirage: an O (n) time analytical solution to 3D camera pose estimation with multi-camera support
CN114998561B (en) Category-level pose optimization method and device
CN116363205A (en) Space target pose resolving method based on deep learning and computer program product
CN116577801A (en) Positioning and mapping method and system based on laser radar and IMU
JPH07146121A (en) Recognition method and device for three dimensional position and attitude based on vision
CN114119684B (en) Marker point registration method based on tetrahedral structure

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination