CN113284170A - Point cloud rapid registration method - Google Patents

Point cloud rapid registration method

Info

Publication number
CN113284170A
CN113284170A
Authority
CN
China
Prior art keywords
matrix
point
algorithm
transformation
registration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110581142.1A
Other languages
Chinese (zh)
Inventor
吕京兆
高增辅
魏源
石璕
鞠游
安玉玺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhiji Technology Co ltd
Original Assignee
Beijing Zhiji Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhiji Technology Co ltd filed Critical Beijing Zhiji Technology Co ltd
Priority to CN202110581142.1A priority Critical patent/CN113284170A/en
Publication of CN113284170A publication Critical patent/CN113284170A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/32Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Abstract

The invention provides a rapid point cloud registration method: an improved ICP point cloud registration algorithm that exploits image information obtained from the camera mounted coaxially with a laser scanner. The method comprises the following steps. S1, data preprocessing: the data sets are filtered and sampled to improve the efficiency of the algorithm. S2, transformation matrix estimation: a transformation matrix is estimated from three pairs of corresponding points and used for coarse registration of the point cloud data. S3, essential matrix computation: the essential matrix is computed with the 5-point algorithm and the camera intrinsic parameter matrix. S4, rotation matrix recovery: an accurate rotation transformation between the scanned point clouds is obtained from the images taken by the coaxial camera. S5, translation vector estimation: the traditional ICP algorithm is modified so that only the translation vector is updated during iteration, which accelerates convergence. S6, point cloud registration: the two point clouds are registered with the resulting transformation. The method achieves automatic registration, reduces algorithm complexity and running time, and maintains good accuracy.

Description

Point cloud rapid registration method
Technical Field
The invention relates to the technical field of industrial automatic detection, in particular to a point cloud rapid registration method.
Background
In industrial applications, a three-dimensional laser scanner can rapidly scan and measure an object on site and directly obtain the three-dimensional coordinates of its surface. The technique works in complex scenes, can measure complex objects, offers high precision and high speed, and saves considerable time and material resources. However, because of the limitations of the scanner, the raw data it produces can be used to build a three-dimensional model only after the individual scans are stitched and integrated. How to register point cloud data quickly and accurately has therefore become an important topic in research on laser-scanning three-dimensional reconstruction.
Point cloud registration obtains the spatial transformation between point cloud data sets so that, after transformation, they are spatially consistent. Owing to the high accuracy of three-dimensional scanning, the problem is essentially one of rigid registration: the point clouds studied here can be stitched together by computing only the translation and rotation parameters between them.
Existing point cloud registration methods fall into three categories. 1. Using markers: special calibration references are placed during three-dimensional laser scanning, and after scanning the positional relation between adjacent point clouds is computed from these references; this method has few parameters and high efficiency but is often constrained by the site and equipment in practice. 2. Direct registration with the measurement device: point clouds are acquired using the rotation and displacement of the scanner's worktable and registered from the recorded position transformations; the accuracy depends on the stage and instrumentation, so this approach is accurate but places higher demands on the equipment. 3. Automatic registration: a computer removes the misalignment and deviation between two point clouds through an algorithm or statistical rule. Since there is still no universal registration algorithm, point cloud data in different situations require correspondingly tailored processing; common algorithms usually need many iterations and therefore suffer from high computational complexity and long registration times.
Disclosure of Invention
The invention provides a point cloud registration method that aims to solve the problems of high computational complexity and long running time in existing automatic registration methods while ensuring registration accuracy.
A laser scanner is usually equipped with a coaxial visible-light camera that photographs the scene while the scanner collects three-dimensional point cloud data. From pictures of the scanned scene taken at different scanning stations, the relative rotation of the camera between stations can be accurately recovered, providing additional information for point cloud registration. Combining this image information with an improved ICP algorithm yields a new image-based rapid point cloud registration algorithm.
The invention makes effective use of the image information provided by the camera mounted on the scanner and proposes an improved ICP point cloud registration algorithm. Because the rotation between the point clouds is computed from the image information, the improved ICP algorithm only needs to solve for the translation vector, which reduces the chance of converging to a wrong extremum, lowers the computational complexity of each iteration, and speeds up convergence. The specific steps are as follows:
S1, data preprocessing. A voxel filter is used to filter and uniformly sample the data set, reducing the number of points and improving the efficiency of the algorithm.
S2 estimates a transformation matrix. Assume a set of target points P and a set of sample points Q. The transformation between these two sets of points can be represented by a matrix H by which the set of points P can be transformed to the coordinate system in which the set of points Q lies. The transformation matrix can be estimated by three pairs of corresponding points, coarse registration is carried out on point cloud data on the basis, and then the result of the coarse registration is calibrated by a fine algorithm to achieve fine registration.
S3 calculates the essential matrix. First, the camera on the scanner is calibrated to obtain its intrinsic parameter matrix K; next, feature points in the pictures taken by the camera are detected and matched using the SIFT algorithm or Harris corner detection to obtain 5 pairs of corresponding points; then the essential matrix E between the two pictures is computed using the 5-point algorithm and K.
S4 obtains a rotation matrix. The relative transformation RC between cameras is obtained through the decomposition of the essential matrix E, however, the coordinate system of the coaxial camera and the coordinate system of the scanner have an inherent transformation relation. Therefore, in order to obtain the rotation transformation between the scanning point clouds, the transformation relation between the coaxial camera coordinate system and the laser scanner coordinate system must be obtained.
Through the essential matrix between the images, relative rotation and translation transformation RC and tC of the coaxial camera at two different positions can be obtained, and the transformation relation between the two scanners is obtained, namely the scanners obtain the rotation and translation transformation between point clouds at two scanning positions. And obtaining accurate rotation transformation R between scanning point clouds through an image shot by a coaxial camera.
S5 obtains a translation vector. After obtaining the rotational transformation R between the point clouds from the image, we also need to solve the translation vectors t of the two point clouds. In order to quickly obtain accurate translation vectors between point clouds, a traditional ICP algorithm is improved, an obtained rotation matrix is directly substituted into an algorithm initialization step, and only t is updated in an iteration process.
S6 point cloud registration. And registering the two point clouds through the obtained rotation matrix R and the translation vector t.
More preferably, the S5 specifically includes the following steps:
the source point cloud dataset is known as P and the target point cloud dataset is known as Q. k represents iteration times, Pk is a data set of P after the kth update, Dk is a nearest point set corresponding to Pk in the kth iteration, R is a point cloud rotation matrix obtained by image information, tk is a translation vector calculated by the kth iteration, and Dk represents an estimation error of the kth iteration. The specific steps for improving the ICP algorithm are described as follows:
(1) Initialization: let P0 = R·P, R0 = I, t0 = 0, and k = 0.
(2) Calculating the nearest-point set: compute the nearest-point set Dk corresponding to the data point set Pk in Q;
(3) calculating registration parameters and calculating an estimation error dk;
(4) Updating the source data set;
(5) Repeating the iteration: the iteration process stops when the change of the estimation error between two successive iterations is less than a threshold δ.
As a further preference, the algorithm evaluates an error measure over the established correspondences.
As a further preference, erroneous point pairs that degrade the registration are eliminated, for example by random sample consensus (RANSAC) estimation or by other methods for rejecting erroneous correspondences.
Further preferably, the essential matrix E in S3 is solved by singular value decomposition.
Generally, compared with the prior art, the above technical solution conceived by the present invention mainly has the following technical advantages:
1. compared with the traditional ICP algorithm, the improved ICP algorithm provided by the invention has no strict requirement on the initial position, and point clouds in any relative position can be tested.
2. In the iterative process of the traditional ICP algorithm, a large amount of operations are concentrated on solving the rotation matrix R. In the improved ICP algorithm, the iterative unknown quantity only has 3 parameters in the translation vector t, so that the algorithm computation quantity is greatly reduced, the cost function complexity is reduced, the computation time is reduced, and the convergence speed is accelerated.
3. The algorithm can realize automatic registration, reduce manual participation and facilitate the construction of an automatic measurement system.
Drawings
FIG. 1 is a flow chart of a method for fast registration of point clouds in an embodiment of the invention;
FIG. 2 is a flow chart of an improved ICP algorithm in an embodiment of the invention;
FIG. 3 is a schematic diagram of the relationship between the scanner coordinate system and the coaxial camera coordinate system according to the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments.
The embodiment of the invention provides a point cloud rapid registration method, which comprises the following steps:
step 1: data pre-processing
A voxel filter is used to filter and uniformly sample the data set, reducing the number of points and improving the efficiency of the algorithm.
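The voxel sampling of step 1 can be sketched as follows: every point is binned into a cubic voxel and each occupied voxel is replaced by the centroid of its points. This is an illustrative numpy implementation; the patent does not fix the voxel size or the averaging rule, and `voxel_filter` and its parameters are names chosen here.

```python
import numpy as np

def voxel_filter(points, voxel_size):
    """Downsample an (N, 3) point cloud by replacing all points that fall
    into the same cubic voxel of edge length voxel_size by their centroid."""
    coords = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel index and average each group.
    _, inverse = np.unique(coords, axis=0, return_inverse=True)
    inverse = np.asarray(inverse).reshape(-1)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]
```

The output keeps one representative point per occupied voxel, which is what reduces the cost of the later nearest-point searches.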
Step 2: estimating a transformation matrix
Assume two point sets P and Q, where P is the target point set containing NP points, denoted {pi}, and Q is the sample point set containing NQ points, denoted {qi}. The transformation between these two point sets can be represented by a matrix H, by which the point set P can be transformed into the coordinate system of the point set Q.
According to the relation between the equations and their solution, only three pairs of corresponding points in the two point clouds are needed to obtain a unique solution for the transformation matrix H. In practice, however, solving the transformation matrix from only three pairs of corresponding points places a high precision requirement on those points: the measurement accuracy of the corresponding points determines the accuracy of the derived matrix H. Once the corresponding points deviate, the linearly solved transformation matrix also deviates greatly, which seriously affects the precision of the point cloud registration.
Therefore, in practical application, a transformation matrix can be estimated by using three pairs of corresponding points, point cloud data is subjected to coarse registration on the basis, and then the result of the coarse registration is calibrated by using a fine algorithm to achieve fine registration.
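The rigid transform estimated from corresponding point pairs in step 2 can be sketched with the standard SVD-based (Kabsch) solution. This is an illustrative implementation under the assumption of at least three non-collinear correspondences; the function name and interface are chosen here, not taken from the patent.

```python
import numpy as np

def rigid_from_correspondences(P, Q):
    """Estimate R, t such that Q ≈ P @ R.T + t from paired (N, 3) arrays
    of corresponding points, N >= 3 and non-collinear, via the Kabsch method."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)               # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = qc - R @ pc
    return R, t
```

With noisy correspondences this least-squares estimate is exactly the coarse registration the text describes; the fine registration then refines it.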
And step 3: computing essence matrices
First, the camera on the scanner is calibrated to obtain its intrinsic parameter matrix K. Next, feature points in the pictures taken by the camera are detected and matched using the SIFT algorithm or Harris corner detection to obtain 5 pairs of corresponding points. Then the essential matrix E between the two pictures is computed using the 5-point algorithm and K:
When the camera intrinsic parameters are known, the following relationship (the epipolar constraint in normalized image coordinates) can be obtained:
qr^T · E · ql = 0
where ql and qr denote the coordinates of the projected points pl and pr, respectively, on the corresponding camera imaging planes.
Substituting the image-plane coordinates of the 5 pairs of corresponding points yields 5 linear equations; a basis (X, Y, Z, W) of their null space is computed by SVD, which gives the general solution of E:
E = x·X + y·Y + z·Z + W
From the two properties of the essential matrix E, namely det(E) = 0 and 2·E·E^T·E − tr(E·E^T)·E = 0, a system of equations M·X = 0 is obtained, where M is the coefficient matrix and X is the vector of monomials
[x3 y3 x2y xy2 x2z x2 y2z y2 xyz xy xz2 xz x yz2 yz y z3 z2 z 1]
Gaussian elimination with column pivoting is performed on this system. Treating z as a hidden variable turns the system into C(z)·X(x, y) = 0; setting the determinant of C(z) to zero gives a polynomial from which z can be solved, after which the coefficients x and y are recovered and the essential matrix E is obtained.
SVD is then applied to E to obtain a translation vector t and a rotation transformation R, where R is the relative rotation transformation between the two cameras.
Through the five-point algorithm described above, the relative rotation transformation of the camera between two stations can be obtained from the images taken by the coaxial camera at different scanning stations.
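The SVD decomposition of E into rotation and translation can be sketched as follows. Decomposition yields two rotation candidates and the translation direction only up to sign and scale, which is exactly why the scale λ remains unknown later in step 4; the usual disambiguation by triangulating a point in front of both cameras is omitted here, and the names are illustrative.

```python
import numpy as np

def decompose_essential(E):
    """Return the two candidate rotations and the unit translation direction
    (up to sign; the scale is unrecoverable) from an essential matrix via SVD."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (determinant +1).
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]  # left null vector of E: the translation direction
    return R1, R2, t
```

One of the four combinations (R1 or R2, ±t) reproduces E up to scale; choosing among them requires a cheirality check on one matched point.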
Step 4: Obtaining the rotation matrix
The relative transformation RC between cameras is obtained through the decomposition of the essential matrix E, however, the coordinate system of the coaxial camera and the coordinate system of the scanner have an inherent transformation relation. Therefore, in order to obtain the rotation transformation between the scanning point clouds, the transformation relation between the coaxial camera coordinate system and the laser scanner coordinate system must be obtained.
Let XS,1 and XS,2 be the coordinates of corresponding points in the two point clouds obtained by the scanner at the two positions, and let XC,1 and XC,2 be the coordinates of XS,1 and XS,2 in the corresponding coaxial-camera coordinate systems. With RCS and tCS denoting the fixed transformation from the scanner coordinate system to the camera coordinate system, then
XC,i = RCS·XS,i + tCS, i = 1, 2.
By the essential matrix between the images, we can obtain the relative rotation and translation transformation RC, tC of the coaxial camera at the two positions, namely
XC,2 = RC·XC,1 + tC, where tC = λ·tE
and tE is the unit translation direction obtained from E, known only up to the unknown scale λ. The transformation relationship between the two scanner positions is therefore:
XS,2 = (RCS^(-1)·RC·RCS)·XS,1 + RCS^(-1)·(RC·tCS + tC − tCS)
that is, the rotation and translation transformation between the point clouds obtained by the scanner at the two scanning positions are respectively:
R = RCS^(-1)·RC·RCS, t = RCS^(-1)·(RC·tCS + λ·tE − tCS)
Since λ is unknown, t cannot be determined this way, and only the accurate rotation transformation R between the scan point clouds is obtained from the images taken with the coaxial camera.
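The frame change of step 4 amounts to conjugating the camera-frame rotation by the fixed scanner-to-camera rotation. A minimal sketch, assuming RCS maps scanner coordinates to camera coordinates as obtained from extrinsic calibration (the function name is chosen here):

```python
import numpy as np

def scanner_rotation(R_C, R_CS):
    """Express the camera-frame rotation R_C in the scanner frame:
    R = R_CS^-1 @ R_C @ R_CS. For a rotation matrix the inverse is
    simply the transpose."""
    return R_CS.T @ R_C @ R_CS
```

The result is again a proper rotation, and conjugating back with R_CS recovers R_C.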
Step 5: Obtaining the translation vector
The point clouds are then iterated through the improved ICP algorithm to obtain the correct translation vector t.
After obtaining the rotational transformation R between the point clouds from the image, we also need to solve the translation vectors t of the two point clouds. In order to quickly obtain accurate translation vectors between point clouds, a traditional ICP algorithm is improved, an obtained rotation matrix is directly substituted into an algorithm initialization step, and only t is updated in an iteration process.
The source point cloud dataset is known as P and the target point cloud dataset is known as Q. k represents iteration times, Pk is a data set of P after the kth update, Dk is a nearest point set corresponding to Pk in the kth iteration, R is a point cloud rotation matrix obtained by image information, tk is a translation vector calculated by the kth iteration, and Dk represents an estimation error of the kth iteration. The specific steps for improving the ICP algorithm are described as follows:
(1) Initialization: let P0 = R·P, R0 = I, t0 = 0, and k = 0.
(2) Calculating the nearest-point set: compute the nearest-point set Dk corresponding to the data point set Pk in Q;
(3) Calculating the registration parameters: compute the translation vector tk and the estimation error dk, where tk and dk satisfy
tk = (1/Np)·Σi (Dk,i − Pk,i)
dk = (1/Np)·Σi ‖Dk,i − (Pk,i + tk)‖²
(with the rotation fixed, the error-minimizing translation is the mean offset between the nearest-point pairs);
(4) Updating the source data set: the data set Pk is updated with tk, i.e. Pk+1 = Pk + tk;
(5) Repeating the iteration: the iteration process stops when the change of the estimation error between two successive iterations is less than a threshold δ, i.e., |dk − dk+1| < δ.
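Steps (1)–(5) above can be sketched as follows, using a k-d tree for the nearest-point search. This is an illustrative implementation: the use of scipy's cKDTree and the closed-form centroid update for t are choices made here, not specified by the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def translation_only_icp(P, Q, R, max_iter=50, delta=1e-8):
    """Improved ICP of step 5: the rotation R comes from the images,
    so the iteration only refines the translation t.
    P is the (N, 3) source cloud, Q the (M, 3) target cloud."""
    Pk = P @ R.T                        # initialization: P0 = R·P, t0 = 0
    t = np.zeros(3)
    tree = cKDTree(Q)
    d_prev = np.inf
    for _ in range(max_iter):
        _, idx = tree.query(Pk)         # nearest-point set Dk in Q
        Dk = Q[idx]
        tk = (Dk - Pk).mean(axis=0)     # closed-form translation update
        dk = np.mean(np.sum((Dk - (Pk + tk)) ** 2, axis=1))
        Pk = Pk + tk                    # update the source data set
        t += tk
        if abs(d_prev - dk) < delta:    # stop when the error change < δ
            break
        d_prev = dk
    return R, t
```

Each iteration costs one k-d-tree query plus a centroid difference; there is no per-iteration SVD, which is where the speed-up over classical ICP comes from.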
In the iterative process of the traditional ICP algorithm, a large amount of operations are concentrated on solving the rotation matrix R. In the improved ICP algorithm, the iterative unknown quantity only has 3 parameters in the translation vector t, so that the algorithm computation quantity is greatly reduced, the cost function complexity is reduced, the computation time is reduced, and the convergence speed is accelerated.
Step 6: point cloud registration
Finally, the two point clouds are registered through the obtained transformation matrix.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (4)

1. A point cloud rapid registration method is characterized by comprising the following steps:
S1, preprocessing data: filtering and sampling the data set with a voxel filter and reducing the number of points to improve the efficiency of the algorithm;
S2 estimates a transformation matrix. Assume a set of target points P and a set of sample points Q. The transformation between these two sets of points can be represented by a matrix H, by which the point set P can be transformed into the coordinate system of the point set Q. The transformation matrix is estimated from three pairs of corresponding points, the point cloud data is coarsely registered on that basis, and the coarse result is then calibrated by a fine algorithm to achieve fine registration;
S3 calculates the essential matrix. First, the camera on the scanner is calibrated to obtain its intrinsic parameter matrix K; next, feature points in the pictures taken by the camera are detected and matched using the SIFT algorithm or Harris corner detection to obtain 5 pairs of corresponding points; then the essential matrix E between the two pictures is computed using the 5-point algorithm and K;
s4 obtains a rotation matrix. The relative transformation RC between cameras is obtained through the decomposition of the essential matrix E, however, the coordinate system of the coaxial camera and the coordinate system of the scanner have an inherent transformation relation. Therefore, in order to obtain the rotation transformation between the scanning point clouds, the transformation relation between the coaxial camera coordinate system and the laser scanner coordinate system must be obtained;
through the essential matrix between the images, relative rotation and translation transformation RC and tC of the coaxial camera at two different positions can be obtained, and the transformation relation between the two scanners is obtained, namely the scanners obtain the rotation and translation transformation between point clouds at two scanning positions. Obtaining accurate rotation transformation R between scanning point clouds through an image shot by a coaxial camera;
S5 obtains a translation vector. After obtaining the rotational transformation R between the point clouds from the image, the translation vector t of the two point clouds still needs to be solved. In order to quickly obtain an accurate translation vector between the point clouds, the traditional ICP algorithm is improved: the obtained rotation matrix is substituted directly into the algorithm's initialization step, and only t is updated during iteration;
S6 point cloud registration. The two point clouds are registered through the obtained rotation matrix R and translation vector t.
2. The point cloud rapid registration method of claim 1, wherein the step S5 specifically includes the following steps:
the source point cloud dataset is known as P and the target point cloud dataset is known as Q. k represents iteration times, Pk is a data set of P after the kth update, Dk is a nearest point set corresponding to Pk in the kth iteration, R is a point cloud rotation matrix obtained by image information, tk is a translation vector calculated by the kth iteration, and Dk represents an estimation error of the kth iteration. The specific steps for improving the ICP algorithm are described as follows:
(1) Initialization: let P0 = R·P, R0 = I, t0 = 0, and k = 0.
(2) Calculating the nearest-point set: compute the nearest-point set Dk corresponding to the data point set Pk in Q;
(3) Calculating the registration parameters: compute the translation vector tk and the estimation error dk, where tk and dk satisfy
tk = (1/Np)·Σi (Dk,i − Pk,i)
dk = (1/Np)·Σi ‖Dk,i − (Pk,i + tk)‖²
(4) Updating the source data set: the data set Pk is updated with tk, i.e. Pk+1 = Pk + tk;
(5) Repeating the iteration: the iteration process stops when the change of the estimation error between two successive iterations is less than a threshold δ, i.e., |dk − dk+1| < δ.
3. The point cloud rapid registration method of claim 1, wherein erroneous point pairs that affect the registration are eliminated, random sample consensus (RANSAC) estimation or other methods being adopted to remove erroneous correspondences.
4. The point cloud rapid registration method of claim 1, wherein the solution method of the essential matrix E in S3 is a singular value decomposition method.
CN202110581142.1A 2021-05-26 2021-05-26 Point cloud rapid registration method Pending CN113284170A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110581142.1A CN113284170A (en) 2021-05-26 2021-05-26 Point cloud rapid registration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110581142.1A CN113284170A (en) 2021-05-26 2021-05-26 Point cloud rapid registration method

Publications (1)

Publication Number Publication Date
CN113284170A true CN113284170A (en) 2021-08-20

Family

ID=77281896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110581142.1A Pending CN113284170A (en) 2021-05-26 2021-05-26 Point cloud rapid registration method

Country Status (1)

Country Link
CN (1) CN113284170A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115830080A (en) * 2022-10-27 2023-03-21 上海神玑医疗科技有限公司 Point cloud registration method and device, electronic equipment and storage medium
CN116185077A (en) * 2023-04-27 2023-05-30 北京历正飞控科技有限公司 Narrow-band accurate striking method of black flying unmanned aerial vehicle
CN117788529A (en) * 2024-02-26 2024-03-29 苏州艾吉威机器人有限公司 Three-dimensional plane point cloud coarse registration method, system, medium and equipment

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955939A (en) * 2014-05-16 2014-07-30 重庆理工大学 Boundary feature point registering method for point cloud splicing in three-dimensional scanning system
CN107220995A (en) * 2017-04-21 2017-09-29 西安交通大学 A kind of improved method of the quick point cloud registration algorithms of ICP based on ORB characteristics of image
CN109816703A (en) * 2017-11-21 2019-05-28 西安交通大学 A kind of point cloud registration method based on camera calibration and ICP algorithm
CN109903319A (en) * 2019-03-13 2019-06-18 北京信息科技大学 A kind of iteratively faster closest approach registration Algorithm based on multiresolution
CN111754464A (en) * 2020-06-03 2020-10-09 北京汉飞航空科技有限公司 Part accurate alignment method combining PD-like algorithm with ICP algorithm
CN111784770A (en) * 2020-06-28 2020-10-16 河北工业大学 Three-dimensional attitude estimation method in disordered grabbing based on SHOT and ICP algorithm
CN111915677A (en) * 2020-07-08 2020-11-10 哈尔滨工程大学 Ship pose estimation method based on three-dimensional point cloud characteristics
CN112001955A (en) * 2020-08-24 2020-11-27 深圳市建设综合勘察设计院有限公司 Point cloud registration method and system based on two-dimensional projection plane matching constraint
WO2021088481A1 (en) * 2019-11-08 2021-05-14 南京理工大学 High-precision dynamic real-time 360-degree omnibearing point cloud acquisition method based on fringe projection


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
周丽敏; 卜昆; 董一巍; 乔燕; 程云勇; 黄胜利: "Fast registration of turbine blades driven by simplified point clouds" (基于简化点云带动的涡轮叶片快速配准技术), Computer Integrated Manufacturing Systems (计算机集成制造系统), no. 05, pages 94–98 *
王瑞岩; 姜光; 高全学: "Fast point cloud registration algorithm combining image information" (结合图像信息的快速点云拼接算法), Acta Geodaetica et Cartographica Sinica (测绘学报), no. 01, pages 96–102 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination