CN115511935A - Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation - Google Patents
- Publication number
- CN115511935A (application number CN202211323356.XA)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- iterative
- discretization
- vector
- pose estimation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/37—Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
Abstract
A normal distribution transformation (NDT) point cloud registration method based on iterative discretization and linear interpolation includes: obtaining a source point cloud χ and a target point cloud, and determining the cube cell resolution sequence Φ required for iterative discretization according to the point cloud scale; then selecting a cell resolution according to the current discretization iteration count, dividing the space of the target point cloud into cube cells, and calculating the point cloud centroid μ and covariance matrix Σ in each cell; then traversing each source point and iteratively updating the score function of the pose estimation vector p, its gradient vector g, and the Hessian matrix H; then iteratively updating the pose estimation vector p with Newton's method; and finally updating the source point cloud χ and judging whether iteration should continue. After repeated iterations, the source point cloud χ is accurately registered into the space of the target point cloud. The invention can complete point cloud registration quickly and accurately.
Description
Technical Field
The invention relates to the technical field of three-dimensional reconstruction and point cloud registration, in particular to a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation.
Background
With the development of computer technology and three-dimensional machine vision, point cloud data is becoming ever easier to acquire, and point cloud processing tasks are becoming increasingly common. In most of these tasks, point cloud registration is the foundation of all subsequent processing, because most existing devices such as lidars and depth cameras can only capture a partial point cloud of an object or scene; registration is then needed to align and stitch multiple point cloud frames into a complete scene. Many applications rely on point cloud registration, for example: three-dimensional reconstruction, autonomous driving, mobile robot localization and mapping, and medical image stitching.
Point cloud registration technology thus has broad application prospects and strong market demand across many industries. However, existing registration techniques suffer from several problems, such as: failure when the initial error between the point clouds to be registered is large, slow registration when the point cloud scale is large, and low accuracy when the overlap between the point clouds is small. A point cloud registration method with good robustness, high precision, and high speed therefore has important practical significance.
Disclosure of Invention
The invention aims to provide a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation aiming at the defects of low robustness and low speed of a pure point cloud-based registration method (such as an ICP algorithm).
In order to achieve the above purpose, the present invention provides the following technical solutions.
The invention relates to a point cloud registration method based on normal distribution transformation of iterative discretization and linear interpolation, which comprises the following steps of:
Step 1: obtain a source point cloud χ and a target point cloud, and determine the cube cell resolution sequence Φ required for iterative discretization according to the point cloud scale.
The most important parameter of an NDT-based registration algorithm is the cube cell resolution: too coarse a cell resolution reduces registration accuracy, while too fine a resolution cannot complete the registration when the initial error between the two frames is large. Therefore, the invention performs multiple NDT registrations at different resolutions via iterative discretization, ensuring both robustness and accuracy. First the source point cloud χ and the target point cloud are obtained; the final objective of the whole method is to register the source point cloud χ into the spatial coordinates of the target point cloud. The cube cell resolution sequence Φ required for iterative discretization is then determined from the scale of the acquired point clouds.
The sequence is Φ = {φ_1, φ_2, ..., φ_l}, where φ_a = φ_{a−1}/2. The first resolution φ_1 can generally be set to one tenth of the maximum extent of the target point cloud. The length l of Φ, i.e. the number of discretization iterations, can be chosen as l = 2–5 according to the precision requirement; the higher the precision requirement, the larger the value.
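As a concrete illustration, the construction of the resolution sequence Φ described above can be sketched as follows. The function name and use of NumPy are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of step 1: build the cube-cell resolution sequence Phi
# with phi_1 = (max extent of target cloud) / 10 and phi_a = phi_{a-1} / 2.
# Function and variable names are illustrative, not from the patent.
import numpy as np

def resolution_sequence(target_cloud, l=4):
    extent = target_cloud.max(axis=0) - target_cloud.min(axis=0)
    phi_1 = float(extent.max()) / 10.0       # one tenth of the maximum extent
    return [phi_1 / 2**a for a in range(l)]  # halve the resolution each pass
```

For a cloud whose maximum extent is about 20 m, as in the embodiment below, this yields Φ = {2, 1, 0.5, 0.25} with l = 4.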
Step 2: divide the target point cloud into cube cells based on the iterative discretization method, and calculate the point cloud centroid μ and covariance matrix Σ in each cell.
According to the cube cell resolution φ_i corresponding to the current discretization iteration, the space occupied by the target point cloud is divided into cube cells of equal size. All target points are then traversed and assigned to their cells according to their three-dimensional coordinates. Because the linear-interpolation-based source point scoring function used later requires the centroids μ and covariance matrices Σ of the eight cube cells adjacent to a point, and because the inverse of the covariance matrix Σ may not exist when a cell contains fewer than 5 points, only cells containing at least 5 points need μ and Σ computed. Furthermore, before proceeding to step 3, the pose estimation vector p must be initialized.
The centroid μ and covariance matrix Σ of a cube cell are calculated as:
μ = (1/m) Σ_{k=1}^{m} y_k
Σ = (1/(m−1)) Σ_{k=1}^{m} (y_k − μ)(y_k − μ)^T
where y_k (k = 1, ..., m) are the target points located in the cube cell, and m ≥ 5.
The pose estimation vector uses 3 variables to represent displacement and 3 variables to represent rotation, i.e. the pose estimate is encoded as a six-dimensional parameter vector p = (t_x, t_y, t_z, φ_x, φ_y, φ_z)^T, where t_x, t_y, t_z are the displacement values along the x, y, z directions and φ_x, φ_y, φ_z are the corresponding Euler angles.
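The binning and per-cell statistics of step 2 can be sketched as below; the hashing of points into integer cell keys and all names are illustrative assumptions.

```python
# Hypothetical sketch of step 2: bin target points into cube cells of side phi
# and keep the centroid mu and covariance Sigma of every cell with >= 5 points
# (fewer points risks a singular, non-invertible covariance matrix).
import numpy as np
from collections import defaultdict

def build_cells(target, phi):
    bins = defaultdict(list)
    for point in target:
        bins[tuple(np.floor(point / phi).astype(int))].append(point)
    cells = {}
    for key, pts in bins.items():
        if len(pts) >= 5:                       # skip underpopulated cells
            y = np.asarray(pts)
            mu = y.mean(axis=0)                 # cell centroid
            sigma = np.cov(y, rowvar=False)     # covariance with 1/(m-1)
            cells[key] = (mu, sigma)
    return cells
```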
Step 3: traverse each source point, iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H.
Before traversing the source point cloud χ, the pose estimation vector p, the score function value, the gradient vector g, the Hessian matrix H, and the total score are initialized to zero. Then each source point is traversed: first, the eight cube cells near the point transformed by the pose transformation function T(p, ·) are found; then the score of the point is computed with the linear-interpolation-based source point scoring function and added to the total score, and the score function, gradient vector g, and Hessian matrix H are updated; the next source point is then processed iteratively.
The pose transformation function T(p, y) is a rigid-body transformation, i.e. a point y is transformed according to the pose estimation vector p as T(p, y) = R y + t, where t = (t_x, t_y, t_z)^T and R is the rotation matrix determined by the Euler angles φ_x, φ_y, φ_z.
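The rigid-body transform T(p, y) can be sketched as below. The Euler-angle composition order R = Rz·Ry·Rx is an assumption (the patent's formula image is not reproduced here), and all names are illustrative.

```python
# Hypothetical sketch of the pose transformation T(p, y) = R y + t for the
# six-vector p = (tx, ty, tz, rx, ry, rz). The composition R = Rz @ Ry @ Rx
# is an assumption, not taken from the patent's (missing) formula.
import numpy as np

def transform(p, pts):
    tx, ty, tz, rx, ry, rz = p
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return pts @ (Rz @ Ry @ Rx).T + np.array([tx, ty, tz])
```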
the source point cloud scoring function based on linear interpolationIn order to solve the problem that the surfaces of the edges of the cubic cells are discontinuous, the traditional point cloud registration method based on normal distribution transformation only uses the cells where the point cloud is located to score, and when the point cloud is located at the edge of the cell, the scoring error is large. Therefore, the invention adopts a source point cloud scoring function based on linear interpolationThe point cloud scoring is more accurate, and the scoring functionThe calculation formula of (c) is:
wherein,indicating the point cloud to be scored, and subscript b indicating the point cloudThe b-th cell of the eight cells in the vicinity,weight function representing linear interpolation, d 1 And d 2 The calculation formula of (2) is as follows:
d 1 =-log(c 1 +c 2 )+log(c 2 )
wherein, constant c 1 ,c 2 Can be controlled by the requirement of the space spanned by one cellIs equal to 1.
wherein,is a point in the source point cloud χ. Then the scoring functionGradient vector ofAnd Hessian matrix H atThe components in each direction are:
wherein p is i And p j Are all pose estimation vectorsThe components in each of the directions in (a) are,
Step 4: iteratively update the pose estimation vector p using Newton's method.
The whole method aims to find the pose estimation vector p that maximizes the score function s(p). This objective can be solved iteratively with Newton's method: using the gradient vector g and Hessian matrix H obtained in step 3, solve the equation H Δp = −g to obtain the pose increment Δp, then update the pose estimation vector p ← p + Δp. Finally, judge whether the convergence condition is reached: if so, continue downward to step 5; otherwise, return to step 3 and iterate again.
The convergence condition is that the pose increment Δp is smaller than an expected value, or that the iteration count exceeds a set value.
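The Newton update and convergence test of step 4 can be sketched as follows; the iteration cap and function names are illustrative assumptions.

```python
# Hypothetical sketch of the Newton update of step 4: solve H dp = -g,
# update p, and stop when |dp| drops below a tolerance or an iteration cap
# is hit. The cap value is an illustrative assumption.
import numpy as np

def newton_step(p, g, H):
    dp = np.linalg.solve(H, -g)   # solve H dp = -g
    return p + dp, dp

def converged(dp, iteration, tol=1e-6, max_iter=50):
    return np.linalg.norm(dp) < tol or iteration >= max_iter
```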
Step 5: update the source point cloud χ and judge whether iteration should continue.
After the pose estimation vector p is obtained in step 4, the source point cloud χ is updated using the pose transformation function T(p, ·). It is then judged whether the current discretization iteration count is less than the length l of the cube cell resolution sequence Φ: if so, return to step 2 and continue iterating; if not, the algorithm ends, and the source point cloud χ has been accurately registered into the space of the target point cloud.
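The overall coarse-to-fine control flow of steps 1–5 can be sketched as below. The `register_once` callable stands in for one full NDT pass (steps 2–4) and is injected so only the loop structure is shown; it and all names are illustrative assumptions.

```python
# Hypothetical sketch of the outer loop: one NDT registration pass per
# resolution in Phi, coarse to fine, updating the source cloud each time.
def iterative_ndt(source, resolutions, register_once):
    """register_once(source, phi) -> (pose, updated_source)."""
    poses = []
    for phi in resolutions:              # step 5 loops back to step 2
        pose, source = register_once(source, phi)
        poses.append(pose)
    return source, poses
```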
Compared with the prior art, the method adopts an NDT-based point cloud registration algorithm, which greatly accelerates registration and improves robustness across a variety of scenes. To address the classical NDT algorithm's inability to achieve accuracy and robustness simultaneously, the invention improves NDT registration through discrete iteration: a large cell resolution guarantees the robustness of the initial registration, and a small cell resolution guarantees the accuracy of subsequent registration. In addition, to address the surface discontinuity at cell edges in the classical NDT algorithm, the invention scores each point over the eight nearby cells using linear interpolation, improving the scoring accuracy and thereby the registration accuracy.
Drawings
Fig. 1 is a schematic flow diagram of a point cloud registration method based on iterative discretization and linear interpolation for normal distribution transformation provided by the present invention.
Fig. 2 is (a) a top view, (b) a front view, and (c) an oblique view of an input point cloud used in an embodiment of the normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation proposed in the present invention.
Fig. 3 is (a) a top view, (b) a front view, and (c) an oblique view of a point cloud registration result of an embodiment of a normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation proposed in the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. The embodiments described herein serve only to explain the technical solution of the present invention and are not intended to limit it.
The invention provides the following technical scheme. The specific flow of the normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation is shown in fig. 1, and the method comprises the following steps:
Step 1: obtain a source point cloud χ and a target point cloud, and determine the cube cell resolution sequence Φ required for iterative discretization according to the point cloud scale.
First, the source point cloud χ and the target point cloud must be input. In this embodiment the point cloud scene was obtained by scanning inside a mine tunnel with a lidar scanner; the tunnel is 4.5 m wide, 4 m high, and about 17 m long. The initial angular deviation between the source and target point clouds is 13 degrees and the displacement deviation is 0.96 m; the initial state of the point clouds is shown in fig. 2. The cube cell resolution sequence Φ = {φ_1, φ_2, ..., φ_l} is then determined from the scale of the input point clouds, where φ_a = φ_{a−1}/2; the first resolution φ_1 can generally be set to one tenth of the maximum extent of the target point cloud, and the length l of Φ, i.e. the number of discretization iterations, can be chosen as l = 2–5 according to the precision requirement (the higher the precision requirement, the larger the value). The maximum extent of the input target point cloud is about 20 m, so φ_1 is set to 2 m and the length l is set to 4, giving the resolution sequence Φ = {2, 1, 0.5, 0.25}.
Step 2: divide the target point cloud into cube cells based on the iterative discretization method, and calculate the point cloud centroid μ and covariance matrix Σ in each cell.
According to the cube cell resolution φ_i corresponding to the current discretization iteration, the space occupied by the target point cloud is divided into cube cells of equal size, and all target points are traversed and assigned to their cells according to their three-dimensional coordinates. Because the linear-interpolation-based scoring function used later requires the centroids μ and covariance matrices Σ of the eight cells adjacent to a point, and because the inverse of Σ may not exist when a cell contains fewer than 5 points, μ and Σ are computed only for cells containing at least 5 points:
μ = (1/m) Σ_{k=1}^{m} y_k
Σ = (1/(m−1)) Σ_{k=1}^{m} (y_k − μ)(y_k − μ)^T
where y_k (k = 1, ..., m) are the target points located in the cube cell, and m ≥ 5. The pose estimate uses 3 variables for displacement and 3 for rotation, encoded as the six-dimensional parameter vector p = (t_x, t_y, t_z, φ_x, φ_y, φ_z)^T, where t_x, t_y, t_z are the displacements along x, y, z and φ_x, φ_y, φ_z are the corresponding Euler angles; finally, the pose estimation vector p is initialized.
Step 3: traverse each source point, iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H.
(a) Before traversing the source point cloud χ, initialize the pose estimation vector p, the score function value, the gradient vector g, the Hessian matrix H, and the total score to zero. Then traverse each source point.
(b) First, find the eight cube cells near the point obtained by transforming the source point with the pose transformation function T(p, y) = R y + t, where t = (t_x, t_y, t_z)^T and R is the rotation matrix determined by the Euler angles φ_x, φ_y, φ_z.
(c) Then compute the score of the point with the linear-interpolation-based scoring function and add it to the total score:
s̃(x) = Σ_{b=1}^{8} w_b(x) · (−d_1) exp(−(d_2/2)(x − μ_b)^T Σ_b^{−1} (x − μ_b))
where x denotes the point to be scored, the subscript b denotes the b-th of the eight cells near the point, w_b(x) is the linear-interpolation weight function, and d_1 and d_2 are computed as:
d_1 = −log(c_1 + c_2) + log(c_2)
where the constants c_1, c_2 can be determined by requiring that the probability mass over the space spanned by one cell equals 1; in this embodiment c_1 = 3 and c_2 = 5.
(d) Then update the score function s(p), its gradient vector g, and the Hessian matrix H. The total score s(p) is the sum of s̃ over all transformed source points x = T(p, y), where y is a point of the source point cloud χ; the components of the gradient vector g and the Hessian matrix H in each direction of p are obtained by differentiating s(p) with respect to the components p_i and p_j of the pose estimation vector p.
(e) Then continue with the next source point iteratively.
Step 4: the whole method aims to find the pose estimation vector p that maximizes the score function s(p). This objective is solved iteratively with Newton's method: using the gradient vector g and Hessian matrix H obtained in step 3, solve the equation H Δp = −g to obtain the pose increment Δp, then update p ← p + Δp. Finally, judge whether the convergence condition is reached: if so, continue downward to step 5; otherwise, return to step 3 and iterate again. The convergence condition is that the pose increment Δp is smaller than the expected value (set to 0.000001 in this embodiment).
Step 5: update the source point cloud χ and judge whether iteration should continue.
After the pose estimation vector p is obtained in step 4, update the source point cloud χ using the pose transformation function T(p, ·). Then judge whether the current discretization iteration count is less than the length l of the cube cell resolution sequence Φ: if so, return to step 2 and continue iterating; if not, the algorithm ends, and the source point cloud χ has been accurately registered into the space of the target point cloud. The final registration result is shown in fig. 3.
The above description is only an example of the present patent and should not be construed as limiting its scope. It should be noted that various changes, modifications, and substitutions may be made by those skilled in the art without departing from the spirit and principles of the invention; these fall within the scope of the invention as defined by the appended claims and their equivalents.
Claims (1)
1. A point cloud registration method based on iteration discretization and linear interpolation for normal distribution transformation is characterized by comprising the following steps:
step 1: obtain a source point cloud χ and a target point cloud, and determine the cube cell resolution sequence Φ required by iterative discretization according to the point cloud scale;
first, the source point cloud χ and the target point cloud are obtained and the cube cell resolution sequence Φ required by iterative discretization is determined from the obtained point cloud scale; point cloud registration with normal distribution transformation is then performed multiple times at different resolutions using the iterative discretization method;
the cube cell resolution sequence required by iterative discretization is Φ = {φ_1, φ_2, ..., φ_l}, where φ_a = φ_{a−1}/2; the first resolution φ_1 is set to one tenth of the maximum extent of the target point cloud; the length l of Φ, i.e. the number of discretization iterations, is l = 2–5; the higher the precision requirement, the larger the value;
step 2: divide the target point cloud into cube cells based on the iterative discretization method, and calculate the point cloud centroid μ and covariance matrix Σ in each cell;
according to the cube cell resolution φ_i corresponding to the current discretization iteration, the space occupied by the target point cloud is divided into cube cells of equal size; all target points are then traversed and assigned to their cells according to their three-dimensional coordinates; the centroid μ and covariance matrix Σ are calculated for the cells containing at least 5 points, and the pose estimation vector p is initialized;
the centroid μ and covariance matrix Σ of a cube cell are calculated as:
μ = (1/m) Σ_{k=1}^{m} y_k
Σ = (1/(m−1)) Σ_{k=1}^{m} (y_k − μ)(y_k − μ)^T
where y_k (k = 1, ..., m) are the target points located in the cube cell, and m ≥ 5;
the pose estimation vector uses 3 variables for displacement and 3 for rotation, encoded as the six-dimensional parameter vector p = (t_x, t_y, t_z, φ_x, φ_y, φ_z)^T, where t_x, t_y, t_z are the displacements along the x, y, z directions and φ_x, φ_y, φ_z are the corresponding Euler angles;
step 3: traverse each source point, iteratively updating the score function s(p) of the pose estimation vector p, its gradient vector g, and the Hessian matrix H;
before traversing the source point cloud χ, first initialize the pose estimation vector p, the score function value, the gradient vector g, the Hessian matrix H, and the total score to zero; then traverse each source point: first find the eight cube cells near the point transformed by the pose transformation function T(p, ·); then compute the score of the point with the linear-interpolation-based source point scoring function, add the score to the total score, and update the score function, gradient vector g, and Hessian matrix H; then iteratively process the next source point;
the pose transformation function T(p, y) is a rigid-body transformation, i.e. a point y is transformed according to the pose estimation vector p as T(p, y) = R y + t, where t = (t_x, t_y, t_z)^T and R is the rotation matrix determined by the Euler angles φ_x, φ_y, φ_z;
the linear-interpolation-based source point cloud scoring function is calculated as:
s̃(x) = Σ_{b=1}^{8} w_b(x) · (−d_1) exp(−(d_2/2)(x − μ_b)^T Σ_b^{−1} (x − μ_b))
where x denotes the point to be scored, the subscript b denotes the b-th of the eight cells near the point, w_b(x) is the linear-interpolation weight function, and d_1 and d_2 are computed as:
d_1 = −log(c_1 + c_2) + log(c_2)
where the constants c_1, c_2 can be determined by requiring that the probability mass over the space spanned by one cell equals 1;
the total score function s(p) sums the scores of all transformed source points x = T(p, y), where y is a point of the source point cloud χ; the components of the gradient vector g and the Hessian matrix H in each direction of p are obtained by differentiating the score function s(p) with respect to the components p_i and p_j of the pose estimation vector p;
step 4: according to the gradient vector g and Hessian matrix H obtained in step 3, solve the equation H Δp = −g to obtain the pose increment Δp, then update the pose estimation vector p ← p + Δp; finally, judge whether the convergence condition is reached: if so, continue downward to step 5; otherwise, return to step 3 and iterate again;
the convergence condition is that the pose increment Δp is smaller than an expected value, or that the iteration count exceeds a set value;
step 5: after the pose estimation vector p is obtained in step 4, update the source point cloud χ using the pose transformation function T(p, ·); then judge whether the current discretization iteration count is less than the length l of the cube cell resolution sequence Φ: if so, return to step 2 and continue iterating; if not, the algorithm ends and the source point cloud χ has been accurately registered into the space of the target point cloud.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211323356.XA CN115511935A (en) | 2022-10-27 | 2022-10-27 | Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115511935A true CN115511935A (en) | 2022-12-23 |
Family
ID=84513134
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN116572253A * | 2023-06-29 | 2023-08-11 | 深圳技术大学 | Grabbing control method and device for test tube
CN116572253B * | 2023-06-29 | 2024-02-20 | 深圳技术大学 | Grabbing control method and device for test tube
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110116407B (en) | Flexible robot position and posture measuring method and device | |
Garro et al. | Solving the pnp problem with anisotropic orthogonal procrustes analysis | |
CN101907459B (en) | Monocular video based real-time posture estimation and distance measurement method for three-dimensional rigid body object | |
JP5627325B2 (en) | Position / orientation measuring apparatus, position / orientation measuring method, and program | |
CN110930495A (en) | Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium | |
CN106056643B (en) | A kind of indoor dynamic scene SLAM method and system based on cloud | |
CN105869136A (en) | Collaborative visual SLAM method based on multiple cameras | |
Buch et al. | Prediction of ICP pose uncertainties using Monte Carlo simulation with synthetic depth images | |
CN110930444B (en) | Point cloud matching method, medium, terminal and device based on bilateral optimization | |
CN112154429A (en) | High-precision map positioning method, system, platform and computer readable storage medium | |
CN115511935A (en) | Normal distribution transformation point cloud registration method based on iterative discretization and linear interpolation | |
CN113298870A (en) | Object posture tracking method and device, terminal equipment and storage medium | |
CN114310901A (en) | Coordinate system calibration method, apparatus, system and medium for robot | |
CN104615880A (en) | Rapid ICP (inductively coupled plasma) method for point cloud matching of three-dimensional laser radar | |
CN115345934A (en) | Laser positioning and mapping method and system based on gradient factor | |
CN113313200B (en) | Point cloud precision matching method based on normal constraint | |
CN113658194B (en) | Point cloud splicing method and device based on reference object and storage medium | |
CN117237428B (en) | Data registration method, device and medium for three-dimensional point cloud | |
Choi et al. | Fast and versatile feature-based lidar odometry via efficient local quadratic surface approximation | |
Dinc et al. | Mirage: an O (n) time analytical solution to 3D camera pose estimation with multi-camera support | |
CN114998561B (en) | Category-level pose optimization method and device | |
CN116363205A (en) | Space target pose resolving method based on deep learning and computer program product | |
CN116577801A (en) | Positioning and mapping method and system based on laser radar and IMU | |
JPH07146121A (en) | Recognition method and device for three dimensional position and attitude based on vision | |
CN114119684B (en) | Marker point registration method based on tetrahedral structure |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |