CN116883469B - Point cloud registration method based on EIV model description under plane feature constraint - Google Patents

Point cloud registration method based on EIV model description under plane feature constraint

Info

Publication number
CN116883469B
Authority
CN
China
Prior art keywords
planar
plane
parameters
feature
features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310893158.5A
Other languages
Chinese (zh)
Other versions
CN116883469A (en)
Inventor
王永波
郑南山
卞正富
张秋昭
杨敏
袁坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT
Priority to CN202310893158.5A
Publication of CN116883469A
Application granted
Publication of CN116883469B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a point cloud registration method based on an EIV model description under plane feature constraints, which comprises the following steps: on the basis of a four-parameter expression of planar features in three-dimensional space, the unit quaternion is introduced as the mathematical operator describing spatial rotation, and, while guaranteeing the uniqueness of the mathematical expression of the planar features, a spatial similarity transformation model constrained by planar features is constructed. The extraction errors of the planar features are described with an EIV model, the equality of the parameters of homonymous planar features after registration is taken as the constraint condition, an objective function for the three-dimensional spatial similarity transformation is constructed under the total least squares criterion, and the point cloud registration parameters between adjacent LiDAR stations under planar feature constraints are solved iteratively through extremum analysis of this function. The method effectively registers LiDAR point clouds between adjacent stations and, while guaranteeing the asymptotic unbiasedness of the solved registration parameters, yields registration results of high accuracy.

Description

Point cloud registration method based on EIV model description under plane feature constraint
Technical Field
The invention relates to a point cloud registration method based on EIV model description under plane feature constraint.
Background
With the emergence of LiDAR technology and its successful application in production, the registration of LiDAR point clouds, as a necessary means of fusing the point clouds of adjacent stations, has drawn great interest from researchers. The essence of point cloud registration is to seek and establish feature correspondences between the point clouds of adjacent stations and, based on a spatial similarity transformation model (which reduces to a rigid transformation model when scaling is not considered), to solve the 7 parameters describing the relative relationship between the coordinate references of adjacent stations, namely the rotation angles (Δα, Δβ, Δγ) about the three coordinate axes X, Y and Z, the three coordinate translations (ΔX, ΔY, ΔZ) and a scale factor μ, thereby achieving a unified description and expression of the point cloud coordinate reference between adjacent LiDAR stations.
According to the choice of registration primitives, existing LiDAR point cloud registration algorithms can be divided into 4 classes: LiDAR point cloud registration based on homonymous point matching, on homonymous straight-line feature matching, on homonymous plane feature matching, and on the joint constraint of point, line and plane features [20-21], among others. Most existing research focuses on LiDAR point cloud registration based on homonymous point matching; however, when point features alone are selected as registration elements, occlusion in some cases prevents them from providing sufficient condition constraints for solving the registration parameters, so it becomes necessary to introduce more feature types, such as straight lines or planes, to establish homonymous feature correspondences between adjacent stations. In addition, compared with point features and under the same sampling conditions, the extraction of straight-line and plane features is less affected by the sampling density, and their accuracy is clearly better than that of point features. For this reason, studying LiDAR point cloud registration algorithms based on straight-line/plane feature constraints can provide more condition constraints for LiDAR point cloud registration, solve the registration problem of adjacent-station LiDAR point clouds under complex conditions, achieve high-accuracy fusion of multi-station LiDAR point clouds, and thus provide reliable data support for the rapid and high-fidelity reconstruction of geospatial entities and their environmental information. It should be noted, however, that straight-line and plane features in three-dimensional space are usually expressed mathematically by combining a direction vector (normal vector) with a point the feature passes through; different choices of that point lead to different expression forms, and how to overcome and effectively handle this problem plays a vital role in the realization of an algorithm.
According to how the feature sampling errors are described in the registration process, existing algorithms can be divided into those based on the classical least squares constraint and those based on the total least squares constraint. The former describe the sampling errors of the registration primitives with a Gauss-Markov model, while the latter describe them with an EIV (Errors-In-Variables) model. Owing to the sampling and feature extraction processes, errors may exist in all registration primitives, so the influence of the errors of all registration primitives on the result needs to be considered comprehensively when solving the registration parameters; however, the Gauss-Markov model only describes the errors in one of the two sets of registration primitives, which clearly does not match the actual situation. In comparison, the EIV model takes the errors of both sets of registration primitives into account and guarantees the asymptotic unbiasedness of the registration parameters.
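To make the distinction between the two error models concrete, the sketch below is an illustration only (not part of the patented method): it fits the simple relation y ≈ a·x when both x and y are noisy, once with ordinary least squares (the Gauss-Markov setting, which treats x as error-free) and once with a total least squares estimate obtained from the smallest right singular vector; all names and noise levels are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true = 2.0
x_clean = np.linspace(0.0, 10.0, 200)

# Both coordinates carry observation noise: exactly the EIV setting.
x_obs = x_clean + rng.normal(scale=0.3, size=x_clean.shape)
y_obs = a_true * x_clean + rng.normal(scale=0.3, size=x_clean.shape)

# Gauss-Markov / ordinary least squares: only y is treated as noisy,
# so the noisy x causes a systematic (attenuation) bias in the slope.
a_ols = (x_obs @ y_obs) / (x_obs @ x_obs)

# Total least squares: the smallest right singular vector of [x y] is the
# normal of the line through the origin with minimal orthogonal scatter,
# i.e. the errors of both variables are taken into account.
_, _, vt = np.linalg.svd(np.column_stack([x_obs, y_obs]), full_matrices=False)
nx, ny = vt[-1]
a_tls = -nx / ny

print(f"true slope {a_true}, OLS estimate {a_ols:.4f}, TLS estimate {a_tls:.4f}")
```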
Disclosure of Invention
The invention aims to solve the following technical problems: 1) at present, when LiDAR is used to acquire point cloud data of urban building surfaces, occlusion generally leads to insufficient constraint conditions and low accuracy of point cloud registration between adjacent LiDAR stations; 2) cities contain a large number of man-made structures whose surfaces carry abundant planar features, yet planar features are rarely used in LiDAR point cloud registration because of the diversity of their mathematical expression forms; 3) most existing LiDAR point cloud registration methods describe the extraction errors of the registration primitives of the station to be registered with a Gauss-Markov model and cannot account for the extraction errors of the registration primitives of the reference station. In view of these shortcomings, the invention provides a point cloud registration method based on an EIV model description under plane feature constraints, which comprises the following steps:
Step 1: based on the collected planar LiDAR point cloud data, estimate the normal vector of each planar feature with principal component analysis, express the planar features in three-dimensional space by the combination of their normal vector and modulus, and finally obtain the specific mathematical expression of the planar features in three-dimensional space through regularization of the parameters;
Step 2: describe the spatial similarity transformation of planar features through operations between the unit quaternion and the planar feature parameters in three-dimensional space, and describe and express the errors of the planar feature data acquisition with an EIV model;
Step 3: construct a solution model for the spatial similarity transformation parameters under the constraint of the total least squares criterion, and solve it to obtain the point cloud registration parameters between two adjacent LiDAR stations, specifically the rotation angles (3 parameters), the translation vector (3 parameters) and the scale factor (1 parameter).
Step 4: extract more than three pairs of homonymous planar features from the point clouds of two adjacent LiDAR stations S1 and S2, process the homonymous features as in step 1 to obtain the mathematical expressions of the planar features, select station S1 as the reference station and station S2 as the station to be registered, and compute the point cloud registration parameters between the two adjacent LiDAR stations as in steps 2 and 3; then apply the spatial similarity transformation with the computed registration parameters to all point cloud data of station S2, thereby fusing the point cloud data of the two adjacent LiDAR stations (a minimal illustration of this final mapping is sketched below).
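As a minimal sketch of the final transformation in step 4 (assuming the registration parameters have already been solved in step 3; the array names and placeholder values are illustrative, and the patent's own implementation is in Matlab), each point of station S2 is mapped into the S1 frame as p' = μ·R·p + t:

```python
import numpy as np

def apply_similarity_transform(points, R, t, mu):
    """Map an (N, 3) array of S2 points into the S1 frame: p' = mu * R @ p + t."""
    return mu * points @ R.T + t

# Placeholder parameters standing in for the values solved in step 3.
R = np.eye(3)                    # rotation matrix (obtained from the unit quaternion)
t = np.array([1.0, -2.0, 0.5])   # translation vector
mu = 1.0                         # scale factor

s2_points = np.random.default_rng(1).random((1000, 3))   # stand-in for the S2 cloud
s2_in_s1 = apply_similarity_transform(s2_points, R, t, mu)
```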
The step 1 comprises the following steps:
step 1-1, based on collected planar LiDAR point cloud data, estimating normal vectors of planar features by using a principal component analysis method:
Defining the set of sampling points from a planar feature in space as N_p = {p_i}, where p_i denotes the i-th sampling point on the plane, the corresponding covariance matrix C is constructed according to equation (1):
C = (1/k) Σ_{i=1}^{k} (p_i - p_c)(p_i - p_c)^T (1)
where p_c denotes the centroid of the sampling point set N_p, k denotes the total number of sampling points, and T denotes the matrix transpose;
By construction, the covariance matrix C is symmetric and positive semi-definite; therefore all eigenvalues λ_j of C are non-negative real numbers, the eigenvectors n_j corresponding to the eigenvalues are pairwise orthogonal and correspond to the principal components of the sampling point set N_p, and λ_j characterizes the degree of variation of the neighbouring sampling point set N_p along the direction of the corresponding eigenvector, where j = 1, 2, 3;
For a planar feature in space, given that the sampled plane is a two-dimensional manifold surface and provided the sampling density is sufficiently large, let λ_1 ≥ λ_2 ≥ λ_3; the plane T(x): (x - p_c)·n_0 = 0, which passes through the centroid p_c and is perpendicular to the eigenvector n_0 corresponding to λ_3, minimizes the sum of the squared distances from the points p_i to the plane, and therefore n_0 approximately represents the normal vector of the planar feature;
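A minimal NumPy sketch of step 1-1 as described above, under the stated assumptions: the covariance matrix of the sampled points is built around their centroid and the eigenvector of its smallest eigenvalue is taken as the approximate plane normal (function and variable names are illustrative):

```python
import numpy as np

def estimate_plane_normal(points):
    """points: (k, 3) samples of one planar feature; returns (unit normal n0, centroid p_c)."""
    p_c = points.mean(axis=0)                 # centroid of the sampling set N_p
    d = points - p_c
    C = d.T @ d / len(points)                 # covariance matrix, eq. (1)
    eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order
    n0 = eigvecs[:, 0]                        # eigenvector of the smallest eigenvalue
    return n0 / np.linalg.norm(n0), p_c
```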
Step 1-2: express the planar feature in three-dimensional space by the combination of its normal vector and modulus, and obtain the mathematical expression of the planar feature in space through regularization of the parameters:
With the normal vector of the planar geometric feature in three-dimensional space computed as n in step 1-1, let p denote any point on the planar feature before the spatial similarity transformation, and carry out the following regularization based on n and p:
Unitize the normal direction of the plane: l = n/‖n‖;
Taking the distance m from the coordinate origin to the planar feature (also called the modulus of the plane) as the fourth element for expressing the planar feature, m follows from the unit normal vector l of the plane and any point p on the plane as:
m = p·l (2)
Once the normal direction of the planar feature in three-dimensional space is determined, the regularized normal direction l and the modulus m are combined into the quadruple (l, m); this quadruple expresses any planar feature in three-dimensional space, and the corresponding parameters are unique.
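Continuing the sketch for step 1-2, the helper below reduces a plane to the quadruple (l, m); the sign flip that keeps the modulus non-negative is one possible regularization convention and is an assumption, since the text above only states that the parameters are made unique:

```python
import numpy as np

def plane_quadruple(normal, point_on_plane):
    """Regularized (l, m) description of a plane: unit normal l and modulus m = p . l."""
    l = normal / np.linalg.norm(normal)   # unitized normal direction, l = n/||n||
    m = float(point_on_plane @ l)         # distance from the origin, eq. (2)
    if m < 0:                             # assumed convention: keep the modulus non-negative
        l, m = -l, -m
    return l, m
```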
The step 2 comprises the following steps:
Step 2-1: apply the three-dimensional spatial similarity transformation to the normal vectors and moduli of the planar features in space, respectively;
Step 2-2: describe and express the normal vector acquisition error of the planar features with an EIV model;
Step 2-3: describe and express the modulus acquisition error of the planar features with an EIV model.
Step 2-1 includes: the correspondence between the normal vector of the planar features before and after the spatial similarity transformation and the modes is expressed as:
p_a = μ R p_b + t,  l_a = R l_b,  m_a = p_a·l_a (3)
where l_b and l_a denote the unit normal vector of the planar feature before and after the spatial similarity transformation, respectively; m_a denotes the modulus of the planar feature after the spatial similarity transformation; p_b and p_a denote a sampling point on the planar feature before and after the spatial similarity transformation, respectively; and R, t and μ denote the mathematical parameters of the spatial similarity transformation, namely the rotation matrix, the translation vector and the scale factor;
Using the unit quaternion [q_0 q_1 q_2 q_3]^T to describe the rotation transformation in three-dimensional space, the relationship between the rotation matrix R and the unit quaternion is expressed by equation (4):
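Equation (4) is not reproduced above; the sketch below uses the standard unit-quaternion-to-rotation-matrix mapping in the scalar-first (q_0, q_1, q_2, q_3) convention, which is an assumption about the exact layout used in the patent:

```python
import numpy as np

def quaternion_to_rotation(q):
    """Scalar-first unit quaternion [q0, q1, q2, q3] -> 3x3 rotation matrix R."""
    q0, q1, q2, q3 = q / np.linalg.norm(q)   # re-normalize to guard against drift
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ])
```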
Step 2-2 includes: as shown in equation (3), the mathematical description of the spatial similarity transformation of the unit direction vector is:
l_a = R l_b (5)
Considering the extraction error of the planar feature direction vector, equation (5) is further expressed as:
where the corresponding error terms denote the error of l_a and the error of l_b, respectively.
Step 2-3 includes: with the normal vector of the planar feature determined, the modulus of the plane after the spatial similarity transformation shown in equation (3) is computed from the normal vector of the planar feature and any point p_b on the plane before the transformation:
m_a = μ(R p_b)·(R l_b) + t·(R l_b) (7)
Considering the extraction errors of the planar feature, equation (7) is further expressed as:
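Putting equations (5) and (7) together, the sketch below propagates the (l, m) parameters of a plane through a spatial similarity transformation; because R is orthonormal, μ(R p_b)·(R l_b) equals μ·m_b, so only the modulus of the source plane is needed for that term (names are illustrative):

```python
import numpy as np

def transform_plane(l_b, m_b, R, t, mu):
    """Propagate plane parameters (l_b, m_b) through p -> mu*R p + t, following eqs. (5) and (7)."""
    l_a = R @ l_b                 # eq. (5): the unit normal is only rotated
    m_a = mu * m_b + t @ l_a      # eq. (7), using (R p_b).(R l_b) = p_b.l_b = m_b
    return l_a, m_a
```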
the step 3 comprises the following steps:
Step 3-1: construct the corresponding condition constraint based on the equality of the unit normal vectors of homonymous features after the spatial similarity transformation:
Define the corresponding function from equation (6) and expand it to obtain equation (9):
Introduce the intermediate parameters A_1 and B_1 together with the intermediate parameter δξ_1 = [δq_0 δq_1 δq_2 δq_3]^T; equation (9) is then further expressed as equation (10):
Substituting the initial value of the unit quaternion and the initial value R_0 of the rotation matrix R into the above expression yields the values of A_1 and B_1, with B_1 = R_0;
Let the intermediate parameters G_1 = [I -B_1] and e_1 be defined; equation (10) is then rewritten as:
G_1 e_1 = A_1 δξ_1 - l_1 (11)
where I denotes the identity matrix and l_1 is the corresponding intermediate parameter.
Step 3-2, constructing corresponding condition constraint based on the modulo equality of the homonymy features after the spatial similarity transformation:
order the Is an intermediate parameter, linearized equation (7):
consider δt= [ δt ] x δt y δt z ] T Intermediate parameterIntermediate parametersIntermediate parameters->Intermediate parameters->Intermediate parameter delta zeta 2 =[δμ δt x δt y δt z ] T Wherein T represents the transpose of the matrix, δμ, δt x 、δt y 、δt z Respectively represent mu and t x 、t y 、t z Is a correction of (a);
then formula (12) is further expressed as:
order theEach element in formula (13) is further expressed as:
let intermediate parameter G 2 =[0 -B 2 ]Intermediate parameter G 3 =[I -B 3 ]Intermediate parametersFormula (13) is rewritten as follows:
G 2 e 1 +G 3 e 2 =A 2 δξ 1 +A 3 δξ 2 -l 2 (14)
wherein the intermediate parameter
Step 3-3, constructing corresponding condition constraints based on the characteristics of the unit quaternions;
step 3-4, solving space similarity transformation parameters based on constraint of total least square;
and 3-5, evaluating the precision.
Step 3-3 includes: using unit quaternionsWhen expressing rotation transformation in three-dimensional space, the following conditions need to be satisfied:
taylor expansion was performed on formula (15) and taken to get the primary term:
let intermediate parameter A q =[2q 0 2q 1 2q 2 2q 3 0 0 0 0]Intermediate parametersThen equation (16) is further rewritten as:
A q δξ+l q =0 (17)。
Step 3-4 includes: with the intermediate parameters G, e, A, δξ and l formed by stacking the corresponding block quantities of equations (11) and (14), the two equations are collectively expressed as:
G e = A δξ - l (18)
Taking e^T e = min as the objective, the Lagrange extremum function is constructed under the constraint of the total least squares criterion:
where λ_1 and λ_2 denote the Lagrange multiplier vector and the Lagrange multiplier corresponding to equations (18) and (17), respectively;
Taking the partial derivatives of equation (19) with respect to e, δξ, λ_1 and λ_2 and setting the resulting expressions equal to 0 yields equations (20)-(23):
According to equation (20), the expression of e is obtained:
e = -G^T λ_1 (24)
Substituting equation (24) into equation (22) yields:
Q_XX λ_1 + A δξ = l (25)
where the intermediate parameter Q_XX = G G^T;
Combining equation (25), equation (21) and equation (17) yields:
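One way to realize the combination of equations (25), (21) and (17) numerically is to stack them into a single bordered linear system in (λ_1, δξ, λ_2) and solve it at each iteration; the block layout and signs below follow from the Lagrangian as reconstructed here and are therefore an assumption about the exact form of the resulting solution formula:

```python
import numpy as np

def solve_correction(G, A, l, A_q, l_q):
    """One iteration of the constrained total least squares step.

    Stacks eq. (25): Q_XX*lam1 + A*dxi = l,
           eq. (21): A^T*lam1 = A_q^T*lam2   (stationarity in dxi, assumed sign),
           eq. (17): A_q*dxi = -l_q
    into one bordered linear system and solves for (lam1, dxi, lam2)."""
    l_q = np.atleast_1d(l_q)
    m, n = A.shape                       # m condition equations, n = 8 parameter corrections
    k = A_q.shape[0]                     # number of quaternion-norm constraints (here 1)
    Q_xx = G @ G.T                       # cofactor-style matrix of eq. (25)
    K = np.block([
        [Q_xx,             A,                np.zeros((m, k))],
        [-A.T,             np.zeros((n, n)), A_q.T           ],
        [np.zeros((k, m)), A_q,              np.zeros((k, k))],
    ])
    rhs = np.concatenate([l, np.zeros(n), -l_q])
    sol = np.linalg.solve(K, rhs)
    lam1, dxi = sol[:m], sol[m:m + n]
    e = -G.T @ lam1                      # eq. (24): residuals of the observations
    return dxi, e
```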
In step 3-5, the registration accuracy σ is calculated using the following formula:
where z denotes the number of pairs of homonymous features.
In this method, planar features are selected as the primitives of LiDAR point cloud registration, each plane is expressed by the combination of its normal vector and its distance from the origin, and the unit quaternion is introduced as the basic operator for describing spatial rotation, so that the spatial similarity transformation expression of planar features is constructed. On this basis, an EIV model is adopted to describe the errors of the registration primitives, the equality of the parameters of homonymous planar features after LiDAR point cloud registration is taken as the constraint condition, an objective function of the three-dimensional spatial similarity transformation is constructed under the total least squares criterion, and the LiDAR point cloud registration parameters of adjacent stations under planar feature constraints are solved iteratively through extremum analysis of this function. Finally, the correctness and effectiveness of the algorithm are verified with two groups of terrestrial LiDAR point cloud data acquired in the field.
Beneficial effects: compared with point and straight-line features, planar features extracted from LiDAR point clouds offer high accuracy and convenient extraction; in particular, when artificial markers are difficult to set up during data acquisition, they greatly facilitate the fusion of multi-station LiDAR point clouds. When the quadruple formed by combining the normal vector and the origin-to-plane distance is used to express planar features in space, the consistency (complete coincidence) of two planes in space can be judged directly by comparing parameters, which effectively simplifies the programming of the algorithm. More importantly, describing the primitive extraction errors with the EIV model theoretically guarantees the asymptotic unbiasedness of the solved registration parameters.
Drawings
The foregoing and/or other advantages of the invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic diagram of a sample point set of planar features and its covariance analysis.
Fig. 2 is a schematic representation of planar features in three-dimensional space and their representation.
FIG. 3a is a schematic view of a reference station LiDAR point cloud.
Fig. 3b is a schematic view of a LiDAR point cloud of a station to be registered.
FIG. 4a is a schematic diagram of the relative positional relationship of point clouds prior to registration of adjacent LiDAR stations.
FIG. 4b is a schematic diagram of the relative positional relationship of two adjacent LiDAR stations in FIG. 4a after point cloud registration.
Detailed Description
The invention provides a point cloud registration method based on EIV model description under plane feature constraint, which comprises the following steps:
Step 1: based on the collected planar LiDAR point cloud data, estimate the normal vector of each planar feature with principal component analysis, express the planar features in three-dimensional space by the combination of their normal vector and modulus, and finally obtain the specific mathematical expression of the planar features in three-dimensional space through regularization of the parameters;
Step 2: describe the spatial similarity transformation of planar features through operations between the unit quaternion and the planar feature parameters in three-dimensional space, and describe and express the errors of the planar feature data acquisition with an EIV model;
Step 3: construct a solution model for the spatial similarity transformation parameters under the constraint of the total least squares criterion, and solve it to obtain the point cloud registration parameters between two adjacent LiDAR stations, specifically the rotation angles (3 parameters), the translation vector (3 parameters) and the scale factor (1 parameter).
Step 4: extract more than three pairs of homonymous planar features from the point clouds of two adjacent LiDAR stations S1 and S2, process the homonymous features as in step 1 to obtain the mathematical expressions of the planar features, select station S1 as the reference station and station S2 as the station to be registered, and compute the point cloud registration parameters between the two adjacent LiDAR stations as in steps 2 and 3; then apply the spatial similarity transformation with the computed registration parameters to all point cloud data of station S2, thereby fusing the point cloud data of the two adjacent LiDAR stations.
The step 1 comprises the following steps:
step 1-1, based on collected planar LiDAR point cloud data, estimating normal vectors of planar features by using a principal component analysis method:
Defining the set of sampling points from a planar feature in space as N_p = {p_i}, where p_i denotes the i-th sampling point on the plane, the corresponding covariance matrix C is constructed according to equation (1):
C = (1/k) Σ_{i=1}^{k} (p_i - p_c)(p_i - p_c)^T (1)
where p_c denotes the centroid of the sampling point set N_p, k denotes the total number of sampling points, and T denotes the matrix transpose;
By construction, the covariance matrix C is symmetric and positive semi-definite; therefore all eigenvalues λ_j of C are non-negative real numbers, the eigenvectors n_j corresponding to the eigenvalues are pairwise orthogonal and correspond to the principal components of the sampling point set N_p, and λ_j characterizes the degree of variation of the neighbouring sampling point set N_p along the direction of the corresponding eigenvector, where j = 1, 2, 3;
For a planar feature in space, given that the sampled plane is a two-dimensional manifold surface and provided the sampling density is sufficiently large, let λ_1 ≥ λ_2 ≥ λ_3; the plane T(x): (x - p_c)·n_0 = 0, which passes through the centroid p_c and is perpendicular to the eigenvector n_0 corresponding to λ_3, minimizes the sum of the squared distances from the points p_i to the plane, and therefore n_0 approximately represents the normal vector of the planar feature;
Step 1-2: express the planar feature in three-dimensional space by the combination of its normal vector and modulus, and obtain the mathematical expression of the planar feature in space through regularization of the parameters:
With the normal vector of the planar geometric feature in three-dimensional space computed as n in step 1-1, let p denote any point on the planar feature before the spatial similarity transformation (as shown in Fig. 2), and carry out the following regularization based on n and p:
Unitize the normal direction of the plane: l = n/‖n‖;
Taking the distance m from the coordinate origin to the planar feature (also called the modulus of the plane) as the fourth element for expressing the planar feature, m follows from the unit normal vector l of the plane and any point p on the plane as:
m = p·l (2)
Once the normal direction of the planar feature in three-dimensional space is determined, the regularized normal direction l and the modulus m are combined into the quadruple (l, m); this quadruple expresses any planar feature in three-dimensional space, and the corresponding parameters are unique.
The step 2 comprises the following steps:
Step 2-1: apply the three-dimensional spatial similarity transformation to the normal vectors and moduli of the planar features in space, respectively;
Step 2-2: describe and express the normal vector acquisition error of the planar features with an EIV model;
Step 2-3: describe and express the modulus acquisition error of the planar features with an EIV model.
Step 2-1 includes: the correspondence between the normal vector of the planar features before and after the spatial similarity transformation and the modes is expressed as:
p_a = μ R p_b + t,  l_a = R l_b,  m_a = p_a·l_a (3)
where l_b and l_a denote the unit normal vector of the planar feature before and after the spatial similarity transformation, respectively; m_a denotes the modulus of the planar feature after the spatial similarity transformation; p_b and p_a denote a sampling point on the planar feature before and after the spatial similarity transformation, respectively; and R, t and μ denote the mathematical parameters of the spatial similarity transformation, namely the rotation matrix, the translation vector and the scale factor;
Using the unit quaternion [q_0 q_1 q_2 q_3]^T to describe the rotation transformation in three-dimensional space, the relationship between the rotation matrix R and the unit quaternion can be expressed as follows:
Step 2-2 includes: as shown in equation (3), the mathematical description of the spatial similarity transformation of the unit direction vector is as follows (without scaling):
l_a = R l_b (5)
Considering the extraction error of the planar feature direction vector, equation (5) is further expressed as:
where the corresponding error terms denote the error of l_a and the error of l_b, respectively.
Step 2-3 includes: with the normal vector of the planar feature determined, the modulus of the plane after the spatial similarity transformation shown in equation (3) is computed from the normal vector of the planar feature and any point p_b on the plane before the transformation:
m_a = μ(R p_b)·(R l_b) + t·(R l_b) (7)
Considering the extraction errors of the planar feature, equation (7) is further expressed as:
the step 3 comprises the following steps:
Step 3-1: construct the corresponding condition constraint based on the equality of the unit normal vectors of homonymous features after the spatial similarity transformation:
Define the corresponding function from equation (6) and expand it to obtain equation (9):
Introduce the intermediate parameters A_1 and B_1 together with the intermediate parameter δξ_1 = [δq_0 δq_1 δq_2 δq_3]^T; equation (9) is then further expressed as equation (10):
Substituting the initial value of the unit quaternion and the initial value R_0 of the rotation matrix R into the above expression yields the values of A_1 and B_1, with B_1 = R_0;
Let the intermediate parameters G_1 = [I -B_1] and e_1 be defined; equation (10) is then rewritten as:
G_1 e_1 = A_1 δξ_1 - l_1 (11)
where I denotes the identity matrix and l_1 is the corresponding intermediate parameter.
Step 3-2, constructing corresponding condition constraint based on the modulo equality of the homonymy features after the spatial similarity transformation:
order the Is an intermediate parameter, linearized equation (7):
consider δt= [ δt ] x δt y δt z ] T Intermediate parameterIntermediate parametersIntermediate parameters->Intermediate parameters->Intermediate parameter delta zeta 2 =[δμ δt x δt y δt z ] T Wherein T represents the transpose of the matrix, δμ, δt x 、δt y 、δt z Respectively represent mu and t x 、t y 、t z Is a correction of (a);
then formula (12) is further expressed as:
order theEach element in formula (13) is further expressed as:
let intermediate parameter G 2 =[0 -B 2 ]Intermediate parameter G 3 =[I -B 3 ]Intermediate parametersFormula (13) is rewritten as follows:
G 2 e 1 +G 3 e 2 =A 2 δξ 1 +A 3 δξ 2 -l 2 (14)
wherein the intermediate parameter
Step 3-3, constructing corresponding condition constraints based on the characteristics of the unit quaternions;
step 3-4, solving space similarity transformation parameters based on constraint of total least square;
and 3-5, evaluating the precision.
Step 3-3 includes: using unit quaternionsWhen expressing rotation transformation in three-dimensional space, the following conditions need to be satisfied:
taylor expansion was performed on formula (15) and taken to get the primary term:
let intermediate parameter A q =[2q 0 2q 1 2q 2 2q 3 0 0 0 0]Intermediate parametersThen equation (16) is further rewritten as:
A q δξ+l q =0 (17)。
Step 3-4 includes: with the intermediate parameters G, e, A, δξ and l formed by stacking the corresponding block quantities of equations (11) and (14), the two equations are collectively expressed as:
G e = A δξ - l (18)
Taking e^T e = min as the objective, the Lagrange extremum function is constructed under the constraint of the total least squares criterion:
where λ_1 and λ_2 denote the Lagrange multiplier vector and the Lagrange multiplier corresponding to equations (18) and (17), respectively;
Taking the partial derivatives of equation (19) with respect to e, δξ, λ_1 and λ_2 and setting the resulting expressions equal to 0 yields equations (20)-(23):
According to equation (20), the expression of e is obtained:
e = -G^T λ_1 (24)
Substituting equation (24) into equation (22) yields:
Q_XX λ_1 + A δξ = l (25)
where the intermediate parameter Q_XX = G G^T;
Combining equation (25), equation (21) and equation (17) yields:
In step 3-5, according to the above derivation, planar features serve as the constraint conditions for solving the point cloud registration parameters between adjacent stations. The observations used comprise the normal vectors of the homonymous planar features of the reference station and the station to be registered, the moduli of the planar features, and a point on each planar feature of the station to be registered; 4 error equations can be listed for each pair of homonymous features, while the number of parameters to be solved is 7, namely the 7 parameters describing the spatial similarity transformation. Therefore, the registration accuracy σ is calculated by the following formula:
where z denotes the number of pairs of homonymous features.
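Since the accuracy formula itself is not reproduced above, the sketch below assumes the usual a-posteriori measure, the square root of the residual quadratic form divided by the redundancy 4z - 7 implied by the counts given in the text (4 error equations per pair, 7 unknowns); both the formula and the helper name are assumptions:

```python
import numpy as np

def registration_accuracy(e, z):
    """Assumed accuracy measure: sigma = sqrt(e'e / (4*z - 7)),
    with 4 error equations per homonymous plane pair and 7 unknown parameters."""
    redundancy = 4 * z - 7
    return float(np.sqrt(e @ e / redundancy))
```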
Examples
The method is implemented in Matlab, and the following two groups of experiments, based on simulated data and field data respectively, are designed to verify its correctness and effectiveness.
Simulation experiment:
table 1 shows experimental data for simulation designed to verify the correctness of the present invention, wherein the station to be registered is 6 planes determined in three-dimensional space, and the reference station is 6 corresponding planes obtained by transforming preset spatial similarity transformation parameters (shown in table 2):
TABLE 1
TABLE 2
Based on the results produced by the method, the deviations between the homonymous features after registration are obtained; the errors are shown in Table 3:
TABLE 3
According to the results shown in Tables 2 and 3, the spatial similarity transformation parameters between the two adjacent stations calculated by the method of the invention agree closely with the preset parameters, and the deviations between homonymous features after registration are negligible, so the algorithm model of the invention is considered correct.
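A simulation check of this kind can be scripted along the following lines; note that the closed-form recovery used here (an SVD/Kabsch step on the paired unit normals followed by a small linear solve for the scale and translation) is a simplified noise-free stand-in, not the patent's total-least-squares solver, and all values are placeholders rather than the data of Tables 1 and 2:

```python
import numpy as np

rng = np.random.default_rng(42)

def quadruple(n, p):
    """Plane as (unit normal, modulus), with the modulus kept non-negative."""
    l = n / np.linalg.norm(n)
    m = float(p @ l)
    return (l, m) if m >= 0 else (-l, -m)

# Preset "ground truth" similarity transformation (placeholder values).
ang = np.deg2rad(25.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true, mu_true = np.array([5.0, -3.0, 1.5]), 1.02

# Six source planes (station to be registered) and their transformed counterparts.
src = [quadruple(rng.normal(size=3), rng.uniform(-20, 20, size=3)) for _ in range(6)]
ref = [(R_true @ l, mu_true * m + t_true @ (R_true @ l)) for l, m in src]

# Rotation from the paired unit normals via a Kabsch/SVD step.
H = sum(np.outer(l_b, l_a) for (l_b, _), (l_a, _) in zip(src, ref))
U, _, Vt = np.linalg.svd(H)
D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
R_est = Vt.T @ D @ U.T

# Scale and translation from m_a = mu*m_b + t.(R l_b): one linear equation per plane.
A = np.array([[m_b, *(R_est @ l_b)] for (l_b, m_b) in src])
b = np.array([m_a for (_, m_a) in ref])
mu_est, *t_est = np.linalg.lstsq(A, b, rcond=None)[0]

# Deviations from the preset parameters should be at numerical-precision level.
print(np.abs(R_est - R_true).max(), np.abs(np.array(t_est) - t_true).max(), abs(mu_est - mu_true))
```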
Real-data experiment: the experimental data are a building facade point cloud acquired with an LMS-Z420i terrestrial LiDAR scanner made by Riegl (Austria); 7 pairs of homonymous planar features, listed in Table 4, were extracted from the two adjacent stations shown in Fig. 3a and Fig. 3b by means of plane fitting and intersection:
TABLE 4 Table 4
Based on the extracted homonymous planar features (Table 4), the similarity transformation parameters between the station to be registered and the reference station are calculated with the proposed method, and the two adjacent LiDAR station point clouds are registered accordingly; the relative positional relationships of the two point clouds before and after registration are shown in Fig. 4a and Fig. 4b, respectively.
To further verify the effectiveness of the proposed method, the three-dimensional spatial similarity transformation parameters between the station to be registered and the reference station are also calculated under the constraint of the classical least squares criterion (Wang et al., 2021) in addition to the total least squares criterion (the proposed method), and the registration accuracy (root mean square error σ) of the two methods is compared; the results are shown in Table 5:
TABLE 5
Analysis of results:
based on the results of the operation test of the method of the present invention using the simulation data (shown in Table 3), the difference between the calculated spatially similar transformation parameters and the set parameters is extremely small (< 10) -4 ) In practical applications, this error is negligible, so the algorithm presented herein is considered to be running correctly; based on the operation result of the method by using LiDAR point cloud data acquired on site, firstly, the difference between the LiDAR point cloud data and the existing algorithm is smaller within an acceptable range, and secondly, the operation result of the method is obviously superior to classical least squares from the aspect of the accuracy (middle error) of the result. In conclusion, the method has correct theory and expected running result.
In a specific implementation, the application provides a computer storage medium and a corresponding data processing unit. The computer storage medium can store a computer program which, when executed by the data processing unit, can carry out some or all of the steps of the point cloud registration method based on the EIV model description under plane feature constraints provided by the invention and of each of its embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It will be apparent to those skilled in the art that the technical solutions in the embodiments of the invention may be implemented by means of a computer program and a corresponding general hardware platform. Based on this understanding, the technical solutions in the embodiments of the invention may be embodied essentially in the form of a computer program, i.e. a software product, which may be stored in a storage medium and includes several instructions that cause a device containing a data processing unit (which may be a personal computer, a server, a single-chip microcomputer (MCU), a network device, or the like) to perform the methods described in the embodiments, or in some parts of the embodiments, of the invention.
The invention provides a point cloud registration method based on an EIV model description under plane feature constraints, and there are many ways to implement this technical solution; the above is only a preferred embodiment of the invention, and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the invention, and such improvements and modifications should also be regarded as falling within the protection scope of the invention. Components not explicitly described in this embodiment can be implemented with the prior art.

Claims (9)

1. The point cloud registration method based on the EIV model description under the plane feature constraint is characterized by comprising the following steps of:
step 1, based on collected planar LiDAR point cloud data, estimating normal vectors of planar features by using a principal component analysis method, expressing planar features in a three-dimensional space by using normal vectors and modes of the planar features, and finally obtaining a specific form of mathematical expression of the planar features in the three-dimensional space by regularization processing of parameters;
step 2, realizing the description of a planar feature space similarity transformation process based on the operation between the unit quaternion and the planar feature parameters in the three-dimensional space, and describing and expressing the error of the planar feature data acquisition by using an EIV model;
step 3, constructing a solving model of the space similarity transformation parameters based on the constraint of the overall least square rule, and solving according to the solving model to obtain the parameters of point cloud registration between two adjacent LiDAR measuring stations;
step 4, more than three pairs of homonymous plane features are respectively extracted from point clouds of two adjacent LiDAR measuring stations S1 and S2, mathematical expressions of the plane features are obtained by processing the homonymous features based on the step 1, the S1 measuring station is selected as a reference measuring station, the S2 measuring station is selected as a measuring station to be registered, and point cloud registration parameters between the two adjacent LiDAR measuring stations are obtained by calculation based on the step 2 and the step 3; performing spatial similarity transformation processing on all point cloud data in the S2 measuring station based on the calculated point cloud registration parameters, so as to realize fusion of the point cloud data between two adjacent LiDAR measuring stations;
the step 1 comprises the following steps:
step 1-1, based on collected planar LiDAR point cloud data, estimating normal vectors of planar features by using a principal component analysis method:
defining the set of sampling points from a planar feature in space as N_p = {p_i}, wherein p_i denotes the i-th sampling point on the plane, and constructing the corresponding covariance matrix C according to equation (1):
C = (1/k) Σ_{i=1}^{k} (p_i - p_c)(p_i - p_c)^T (1)
wherein p_c denotes the centroid of the sampling point set N_p, k denotes the total number of sampling points, and T denotes the matrix transpose;
all eigenvalues λ_j of the covariance matrix C are non-negative real numbers, the eigenvectors n_j corresponding to the eigenvalues are pairwise orthogonal and correspond to the principal components of the sampling point set N_p, and λ_j characterizes the degree of variation of the neighbouring sampling point set N_p along the direction of the corresponding eigenvector, wherein j = 1, 2, 3;
given that the sampled plane is a two-dimensional manifold surface, let λ_1 ≥ λ_2 ≥ λ_3; the plane T(x): (x - p_c)·n_0 = 0, which passes through the centroid p_c and is perpendicular to the eigenvector n_0 corresponding to λ_3, minimizes the sum of the squared distances from the points p_i to the plane, so n_0 approximately represents the normal vector of the planar feature;
step 1-2, expressing the planar feature in three-dimensional space by the combination of its normal vector and modulus, and obtaining the mathematical expression of the planar feature in space through regularization of the parameters:
with the normal vector of the planar feature in three-dimensional space computed as n in step 1-1, letting p denote any point on the planar feature before the spatial similarity transformation, the following regularization is carried out based on n and p:
unitizing the normal direction of the plane: l = n/‖n‖;
taking the distance m from the coordinate origin to the planar feature as the fourth element for expressing the planar feature, the modulus m is expressed from the known unit normal vector l of the plane and any point p on the plane as:
m = p·l (2)
once the normal direction of the planar feature in three-dimensional space is determined, the regularized normal direction l and the modulus m are combined into the quadruple (l, m); this quadruple expresses any planar feature in three-dimensional space, and the corresponding parameters are unique.
2. The method of claim 1, wherein step 2 comprises:
step 2-1, applying the three-dimensional spatial similarity transformation to the normal vectors and moduli of the planar features in space, respectively;
step 2-2, describing and expressing the normal vector acquisition error of the planar features with an EIV model;
and step 2-3, describing and expressing the modulus acquisition error of the planar features with an EIV model.
3. The method according to claim 2, wherein step 2-1 comprises: the correspondence between the normal vectors and moduli of the planar feature before and after the spatial similarity transformation is expressed as equation (3):
p_a = μ R p_b + t,  l_a = R l_b,  m_a = p_a·l_a (3)
wherein l_b and l_a denote the unit normal vector of the planar feature before and after the spatial similarity transformation, respectively; m_a denotes the modulus of the planar feature after the spatial similarity transformation; p_b and p_a denote a sampling point on the planar feature before and after the spatial similarity transformation, respectively; and R, t and μ denote the mathematical parameters of the spatial similarity transformation, namely the rotation matrix, the translation vector and the scale factor;
using the unit quaternion [q_0 q_1 q_2 q_3]^T to describe the rotation transformation in three-dimensional space, the relationship between the rotation matrix R and the unit quaternion is expressed as follows:
4. The method according to claim 3, wherein step 2-2 comprises: as shown in equation (3), the mathematical description of the spatial similarity transformation of the unit direction vector is:
l_a = R l_b (5)
considering the extraction error of the planar feature direction vector, equation (5) is further expressed as:
wherein the corresponding error terms denote the error of l_a and the error of l_b, respectively.
5. The method of claim 4, wherein step 2-3 comprises: with the normal vector of the planar feature determined, the modulus of the plane after the spatial similarity transformation shown in equation (3) is computed from the normal vector of the planar feature and any point p_b on the plane before the transformation:
m_a = μ(R p_b)·(R l_b) + t·(R l_b) (7)
considering the extraction errors of the planar feature, equation (7) is further expressed as:
6. The method of claim 5, wherein step 3 comprises:
step 3-1, constructing the corresponding condition constraint based on the equality of the unit normal vectors of homonymous features after the spatial similarity transformation:
defining the corresponding function from equation (6) and expanding it to obtain equation (9):
introducing the intermediate parameters A_1 and B_1 together with the intermediate parameter δξ_1 = [δq_0 δq_1 δq_2 δq_3]^T, equation (9) is further expressed as equation (10):
substituting the initial value of the unit quaternion and the initial value R_0 of the rotation matrix R into the above expression yields the values of A_1 and B_1, with B_1 = R_0;
letting the intermediate parameters G_1 = [I -B_1] and e_1 be defined, equation (10) is rewritten as:
G_1 e_1 = A_1 δξ_1 - l_1 (11)
wherein I denotes the identity matrix and l_1 is the corresponding intermediate parameter;
Step 3-2, constructing corresponding condition constraint based on the modulo equality of the homonymy features after the spatial similarity transformation:
order theIs an intermediate parameter, linearized equation (7):
consider δt= [ δt ] x δt y δt z ] T Intermediate parameterIntermediate parametersIntermediate parameters->Intermediate parameters->Intermediate parameter delta zeta 2 =[δμ δt x δt y δt z ] T Wherein T represents the transpose of the matrix, δμ, δt x 、δt y 、δt z Respectively represent mu and t x 、t y 、t z Is a correction of (a);
then formula (12) is further expressed as:
order theEach element in formula (13) is further expressed as:
let intermediate parameter G 2 =[0 -B 2 ]Intermediate parameter G 3 =[I -B 3 ]Intermediate parametersFormula (13) is rewritten as follows:
G 2 e 1 +G 3 e 2 =A 2 δξ 1 +A 3 δξ 2 -l 2 (14)
wherein the intermediate parameter
Step 3-3, constructing corresponding condition constraints based on the characteristics of the unit quaternions;
step 3-4, solving space similarity transformation parameters based on constraint of total least square;
and 3-5, evaluating the precision.
7. The method of claim 6, wherein step 3-3 comprises: when the unit quaternion is used to express the rotation transformation in three-dimensional space, the following condition needs to be satisfied:
q_0^2 + q_1^2 + q_2^2 + q_3^2 = 1 (15)
performing a Taylor expansion of equation (15) and keeping the first-order term yields equation (16):
letting the intermediate parameter A_q = [2q_0 2q_1 2q_2 2q_3 0 0 0 0] and the intermediate parameter l_q be defined, equation (16) is further rewritten as:
A_q δξ + l_q = 0 (17).
8. The method of claim 7, wherein step 3-4 comprises: with the intermediate parameters G, e, A, δξ and l formed by stacking the corresponding block quantities of equations (11) and (14), the two equations are collectively expressed as:
G e = A δξ - l (18)
taking e^T e = min as the objective, the Lagrange extremum function is constructed under the constraint of the total least squares criterion:
wherein λ_1 and λ_2 denote the Lagrange multiplier vector and the Lagrange multiplier corresponding to equations (18) and (17), respectively;
taking the partial derivatives of equation (19) with respect to e, δξ, λ_1 and λ_2 and setting the resulting expressions equal to 0 yields equations (20)-(23):
according to equation (20), the expression of e is obtained:
e = -G^T λ_1 (24)
substituting equation (24) into equation (22) yields:
Q_XX λ_1 + A δξ = l (25)
wherein the intermediate parameter Q_XX = G G^T;
combining equation (25), equation (21) and equation (17) yields:
9. The method according to claim 8, wherein in step 3-5, the registration accuracy σ is calculated using the following formula:
wherein z denotes the number of pairs of homonymous features.
CN202310893158.5A 2023-07-20 2023-07-20 Point cloud registration method based on EIV model description under plane feature constraint Active CN116883469B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310893158.5A CN116883469B (en) 2023-07-20 2023-07-20 Point cloud registration method based on EIV model description under plane feature constraint

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310893158.5A CN116883469B (en) 2023-07-20 2023-07-20 Point cloud registration method based on EIV model description under plane feature constraint

Publications (2)

Publication Number Publication Date
CN116883469A CN116883469A (en) 2023-10-13
CN116883469B true CN116883469B (en) 2024-01-19

Family

ID=88267713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310893158.5A Active CN116883469B (en) 2023-07-20 2023-07-20 Point cloud registration method based on EIV model description under plane feature constraint

Country Status (1)

Country Link
CN (1) CN116883469B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102267954B1 (en) * 2019-10-11 2021-06-23 Korea Institute of Science and Technology Rss signal correction method

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102044089A (en) * 2010-09-20 2011-05-04 董福田 Method for carrying out self-adaption simplification, gradual transmission and rapid charting on three-dimensional model
CN108871284A (en) * 2018-05-08 2018-11-23 中国矿业大学 Three-dimensional space similarity transformation model parameter based on line feature constraint without initial value method for solving
CN110443836A (en) * 2019-06-24 2019-11-12 中国人民解放军战略支援部队信息工程大学 A kind of point cloud data autoegistration method and device based on plane characteristic
CN111563920A (en) * 2020-04-15 2020-08-21 西安工程大学 3D color point cloud registration method based on global optimization and multi-constraint condition iteration
CN111709981A (en) * 2020-06-22 2020-09-25 高小翎 Registration method of laser point cloud and analog image with characteristic line fusion
CN112365528A (en) * 2020-07-23 2021-02-12 哈尔滨岛田大鹏工业股份有限公司 Three-dimensional point cloud model gradual refinement and rapid registration method based on principal component analysis
CN111958640A (en) * 2020-08-24 2020-11-20 哈工大机器人集团股份有限公司 Double-arm robot testing method and device for multi-base-station laser tracker cooperative station transfer
CN112017220A (en) * 2020-08-27 2020-12-01 南京工业大学 Point cloud accurate registration method based on robust constraint least square algorithm
CN112614204A (en) * 2020-12-29 2021-04-06 哈尔滨理工大学 Capacitance tomography image reconstruction method based on improved least square
CN113327275A (en) * 2021-06-18 2021-08-31 哈尔滨工业大学 Point cloud double-view-angle fine registration method based on multi-constraint point to local curved surface projection
CN114895238A (en) * 2022-03-21 2022-08-12 宁波大学 DRSS-based wireless sensor network robust positioning method
CN115100254A (en) * 2022-06-10 2022-09-23 兰州交通大学 Point cloud registration method based on dual quaternion description under planar feature constraint
CN115800957A (en) * 2022-10-21 2023-03-14 北京理工大学 Deviation compensation adaptive filtering method based on matrix eigenvalue solution

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An Advanced Outlier Detected Total Least-Squares Algorithm for 3-D Point Clouds Registration; Jie Yu et al.; IEEE Transactions on Geoscience and Remote Sensing; pp. 4789-4798 *
Research on WTLSD plane fitting algorithm based on RANSAC; Zhang Zhongyue et al.; Foreign Electronic Measurement Technology; pp. 93-98 *

Also Published As

Publication number Publication date
CN116883469A (en) 2023-10-13

Similar Documents

Publication Publication Date Title
Robles-Kelly et al. A Riemannian approach to graph embedding
Ansar et al. Linear pose estimation from points or lines
WO2024077812A1 (en) Single building three-dimensional reconstruction method based on point cloud semantic segmentation and structure fitting
Tang et al. N-dimensional tensor voting and application to epipolar geometry estimation
CN111080684B (en) Point cloud registration method for point neighborhood scale difference description
CN107871327A (en) The monocular camera pose estimation of feature based dotted line and optimization method and system
CN107358629B (en) Indoor mapping and positioning method based on target identification
CN111551895B (en) Method for positioning TDOA and FDOA of motion source based on weighted multidimensional scale and Lagrange multiplier
Habib et al. Quaternion-based solutions for the single photo resection problem
Li A calibration method of computer vision system based on dual attention mechanism
CN111551897B (en) TDOA (time difference of arrival) positioning method based on weighted multidimensional scaling and polynomial root finding under sensor position error
Zheng et al. Registration of optical images with LiDAR data and its accuracy assessment
Yuan et al. EGST: Enhanced Geometric Structure Transformer for Point Cloud Registration
Jiang et al. Learned local features for structure from motion of uav images: A comparative evaluation
Ren et al. High precision calibration algorithm for binocular stereo vision camera using deep reinforcement learning
Geng et al. Neighboring constraint-based pairwise point cloud registration algorithm
He et al. Research on geometric features and point cloud properties for tree skeleton extraction
CN116883469B (en) Point cloud registration method based on EIV model description under plane feature constraint
CN103810747A (en) Three-dimensional point cloud object shape similarity comparing method based on two-dimensional mainstream shape
CN116563096B (en) Method and device for determining deformation field for image registration and electronic equipment
Ding et al. Revisiting the P3P problem
CN116310194A (en) Three-dimensional model reconstruction method, system, equipment and storage medium for power distribution station room
Zeng et al. Extended WTLS iterative algorithm of 3D similarity transformation based on Gibbs vector
CN114742141A (en) Multi-source information data fusion studying and judging method based on ICP point cloud
Lu Algorithm of 3D virtual reconstruction of ancient buildings in Qing dynasty based on image sequence

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant