CN115690380A - Registration method and system - Google Patents

Registration method and system

Info

Publication number
CN115690380A
CN115690380A
Authority
CN
China
Prior art keywords
registration
image
point cloud
unit
iteration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211409046.XA
Other languages
Chinese (zh)
Other versions
CN115690380B (en)
Inventor
罗再谦
向煜
张俊
李兵
华媛媛
黄志�
黄令
韩�熙
黄国洪
毛阆
孟云豪
曹欣
钟敏
周林秋
吴琢珺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHONGQING CYBERCITY SCI-TECH CO LTD
Original Assignee
CHONGQING CYBERCITY SCI-TECH CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHONGQING CYBERCITY SCI-TECH CO LTD filed Critical CHONGQING CYBERCITY SCI-TECH CO LTD
Priority to PCT/CN2022/131556 priority Critical patent/WO2024098428A1/en
Priority to CN202211409046.XA priority patent/CN115690380B/en
Publication of CN115690380A publication Critical patent/CN115690380A/en
Application granted granted Critical
Publication of CN115690380B publication Critical patent/CN115690380B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 5/00 Image enhancement or restoration
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of point cloud data, and in particular to a registration method and system. The method comprises: performing lens distortion correction on a digital camera, obtaining its internal and external parameters, and applying distortion correction to the extracted image data; denoising and simplifying the three-dimensional laser point cloud; using POS information data to obtain the timestamps at which the digital camera captured image data and the three-dimensional lidar scanned the point cloud at the same moment, extracting from the POS data a registration initial value relating the image data to the laser point cloud, and performing initial registration based on a collinearity equation model; and, in combination with the iterative closest point algorithm, taking the high-precision registration value as the algorithm's initial value and performing iterative closest point registration between the image-data point cloud and the laser point cloud to obtain high-precision registration. The method achieves high-precision registration of the laser point cloud and the image data through two rounds of registration, and addresses the low accuracy and numerous restrictive conditions of current registration approaches.

Description

Registration method and system
Technical Field
The invention relates to the technical field of point cloud data, in particular to a registration method and a registration system.
Background
As geospatial information technology enters a new period, acquiring and analyzing spatial data that accurately describe three-dimensional spatial information has become the basis for studying geospatial information. Traditional methods of acquiring three-dimensional data are time-consuming and have low sampling density, so the acquired data cannot accurately and completely describe a spatial object. In addition, spatial objects and their surrounding environments are complex, and acquiring the corresponding spatial data accurately and comprehensively is one of the main difficulties in current spatial-information research. The emergence of ground-based three-dimensional laser scanning provides a non-destructive, high-resolution three-dimensional measurement means that can compensate for the shortcomings of traditional measurement techniques.
Three-dimensional laser scanning, also called LiDAR (Light Detection and Ranging), has developed rapidly over roughly the last decade as a new mapping technology. Owing to its high scanning speed and the high density of the scanned coordinate points, it has broad application prospects in topographic survey, reverse engineering, protection of historical relics and the like, and has gradually become an important means of obtaining three-dimensional urban data models; the technology is regarded as another innovation in the mapping field following GPS. According to the carrier, three-dimensional laser scanning is divided into airborne, vehicle-mounted and ground-based laser scanning. Data acquisition here is mainly completed with a ground-based three-dimensional laser scanner, whose data have characteristics distinct from point clouds acquired in other ways. During ground-based laser scanning, limitations of the measuring equipment and the environment mean that the point cloud obtained from a single measurement usually covers only part of the surface of the scanned object; a complete point cloud must be assembled through multi-station measurement, so the local point clouds need to be spliced and registered. Effective, accurate registration provides good base data for subsequent point cloud three-dimensional modeling.
Because the scanning environment causes occlusion, point clouds obtained by ground-based three-dimensional laser scanning often contain regions that cannot be measured, producing point cloud holes and losing local information. These holes not only prevent the three-dimensional model of the scanned object from being visualized completely, they also affect parameter extraction after modeling. Close-range images, by contrast, are relatively convenient to obtain, more flexible and cheaper, so the operator can take supplementary photographs with a digital camera from different angles, exploiting the advantages of both sensors to obtain a complete view of the object. However, multi-source data fusion has always been a difficulty in data processing: the problem of finding homonymous (corresponding) features between the two kinds of data must be overcome, and the information of the overlapping parts used to bring both into a unified coordinate system through registration so that all elements are geometrically aligned. Existing fusion methods mainly focus on extracting common geometric features from the point cloud and the images and rely on the collinearity equation based on the central-projection principle; in practice this is still a registration between two and three dimensions, with low accuracy and many restrictive conditions.
Therefore, a registration method needs to be developed to meet the current processing technology after data acquisition.
Disclosure of Invention
Aiming at the defects of the prior art, the invention aims to provide a registration method to solve the problems of low precision and more limitation conditions of laser point cloud and image data registration.
In order to solve the problems, the invention adopts the following technical scheme:
in one aspect, the invention provides a registration method comprising:
Carrying out lens distortion correction on the digital camera, acquiring internal and external parameters of the digital camera, and carrying out distortion correction on the extracted image data;
denoising and simplifying the three-dimensional laser point cloud;
acquiring timestamps of image data shot by a digital camera and three-dimensional laser radar scanning point cloud at the same time according to a time synchronization principle by using position, posture and time information data in a POS sensor, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS sensor on the basis of the same timestamp, and performing initial registration based on a collinear equation model;
the registration initial value precision is improved by using a weight selection iterative method based on Robust estimation, a high-precision registration value is determined, the high-precision registration value is used as an initial value of an iterative closest point algorithm by combining the iterative closest point algorithm, and iterative closest point registration is carried out by using image data point cloud and laser point cloud to obtain high-precision registration.
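As an illustrative sketch only (not the patent's implementation; the function name, data layout and tolerance are assumptions), the time-synchronization step, which pairs each image with the lidar scan whose timestamp matches it, can be written as a nearest-timestamp search over sorted scan times:

```python
import bisect

def pair_by_timestamp(image_stamps, scan_stamps, tol=0.05):
    """For each image timestamp, find the nearest lidar scan timestamp.

    Returns (image_idx, scan_idx) pairs whose time difference is within
    `tol` seconds.  `scan_stamps` must be sorted in ascending order.
    """
    pairs = []
    for i, t in enumerate(image_stamps):
        j = bisect.bisect_left(scan_stamps, t)
        # candidates: the scan just before and just after t
        best = min(
            (j_ for j_ in (j - 1, j) if 0 <= j_ < len(scan_stamps)),
            key=lambda j_: abs(scan_stamps[j_] - t),
        )
        if abs(scan_stamps[best] - t) <= tol:
            pairs.append((i, best))
    return pairs
```

Each matched pair then indexes into the POS record for that common timestamp, from which the registration initial value is read.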
Further, the performing lens distortion correction on the digital camera to obtain the internal and external parameters of the digital camera, and performing distortion correction on the extracted image data key frame includes:
fixing the checkerboard image and the digital camera, and acquiring a plurality of images from different directions;
setting the upper right corner of the checkerboard image as the origin of image coordinates, acquiring the number of squares on the checkerboard according to the length and width of the image, and detecting the corner points;
initializing parameters and performing nonlinear optimization, and calculating intrinsic parameters and distortion coefficients of the digital camera;
and calculating distortion mapping based on the intrinsic parameters and the distortion coefficients, and correcting the image data.
Further, the denoising and simplifying processing of the three-dimensional laser point cloud includes:
calculating the distance from each point in the three-dimensional laser point cloud to its several nearest neighboring points;
calculating the integral average distance and variance in the point cloud according to the distance;
taking any point whose distance is greater than or equal to the sum of the overall average distance and m times the standard deviation as a noise point, and removing it:

u_i ≥ t, t = u + mσ

where u is the overall average distance, m is the standard-deviation multiple, σ is the standard deviation, and u_i is the distance for point i;
and setting a minimum voxel scale, putting the original point cloud into a three-dimensional grid, and replacing each minimum voxel containing a plurality of points with one of the points to finish voxel down-sampling.
Further, using the position, attitude and time information in the POS sensor to obtain, according to the time-synchronization principle, the timestamp at which the digital camera captured image data and the three-dimensional lidar scanned the point cloud at the same moment, extracting from the POS data the registration initial value relating the image data and the laser point cloud on the basis of that common timestamp, and performing initial registration based on the collinearity equation model comprises:
determining the relation between the three-dimensional lidar coordinates of a laser point and the image-space auxiliary coordinates of its homonymous image point as

[X_I Y_I Z_I]^T = R_IH · [X_H Y_H Z_H]^T + C_IH

where [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; and R_IH and C_IH are, respectively, the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
obtaining the conversion relation between the image-space auxiliary coordinates of an image-data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW · [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW · R_HK · R_IH

C_IW = T_KW + R_KW · T_HK + R_KW · R_HK · C_IH
based on the conversion relation, the collinearity equations are obtained as:

x - x_0 = -f · [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f · [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

where x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; and a_i, b_i, c_i (i = 1, 2, 3) are the elements of the rotation matrix determined by the three angular exterior orientation elements of the image;
and performing initial registration of the laser point cloud and the image data through the collinearity equations.
Further, improving the accuracy of the registration initial value with a Robust-estimation-based weight-selection iteration method to determine a high-precision registration value and then, in combination with the iterative closest point algorithm, taking that high-precision registration value as the algorithm's initial value and performing iterative closest point registration between the image-data point cloud and the laser point cloud to obtain high-precision registration comprises the following steps:
s401, starting with least-squares iteration according to the error equation of the collinearity equations, the first iteration taking the weight matrix of the POS position-attitude data values as the identity matrix;
s402, applying the residuals of the first iteration, calculating the weight function for the second iteration, and performing the improved Danish weight-selection iteration;
s403, calculating the unit-weight error; if it does not meet the condition, looping back to S402 until the accuracy meets the requirement;
s404, according to the error equation of the collinearity equations, taking the third iteration as the demarcation point, the iteration terminating when the difference between the corrections to the angular elements of the image-data key frame in two successive iterations is no more than 0.1′.
In another aspect, the invention provides a registration system comprising an image data correction module, a denoising and simplification processing module, an initial registration module and a high-precision registration module;
the image data correction module is used for carrying out lens distortion correction on the digital camera, acquiring internal and external parameters of the digital camera and carrying out distortion correction on the extracted image data;
the denoising simplification processing module is used for denoising and simplifying the three-dimensional laser point cloud;
the initial registration module is used for acquiring a time stamp of the image data shot by the digital camera and the three-dimensional laser radar scanning point cloud at the same time by using position, posture and time information data in the POS sensor according to a time synchronization principle, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS data on the basis of the same time stamp, and performing initial registration based on a collinearity equation model;
the high-precision registration module is used for improving the accuracy of the registration initial value with a Robust-estimation-based weight-selection iteration method to determine a high-precision registration value, and, in combination with the iterative closest point algorithm, taking the high-precision registration value as the algorithm's initial value and performing iterative closest point registration between the image-data point cloud and the laser point cloud to obtain high-precision registration.
Further, the image data correction module comprises an image acquisition unit, a corner detection unit, a calculation unit and a correction unit:
the image acquisition unit is used for fixing the checkerboard image and the digital camera and acquiring a plurality of images from different directions;
the corner detection unit is used for setting the upper right corner of the checkerboard image as the origin of image coordinates, acquiring the number of squares on the checkerboard according to the length and width of the image, and performing corner detection;
the calculation unit is used for initializing parameters and carrying out nonlinear optimization, and calculating intrinsic parameters and distortion coefficients of the digital camera;
and the correction unit is used for calculating distortion mapping based on the internal parameters and the distortion coefficients and correcting the image data.
Further, the denoising and simplifying processing module comprises a distance calculation unit, a whole average distance and variance calculation unit, a removal unit and a down-sampling unit;
the distance calculation unit is used for calculating the distance from each point in the three-dimensional laser point cloud to its several nearest neighboring points;
the integral average distance and variance calculation unit is used for calculating the integral average distance and variance in the point cloud according to the distance;
the removing unit is used for taking any point whose distance is greater than or equal to the sum of the overall average distance and m times the standard deviation as a noise point and removing it:

u_i ≥ t, t = u + mσ

where u is the overall average distance, m is the standard-deviation multiple, σ is the standard deviation, and u_i is the distance for point i;
the down-sampling unit is used for setting a minimum voxel scale, putting the original point cloud into a three-dimensional grid, and replacing each minimum voxel containing a plurality of points with one of the points to finish voxel down-sampling.
Further, the initial registration module comprises a coordinate relation determining unit, a coordinate conversion unit, a collinear equation determining unit and an initial registration unit;
the coordinate relation determining unit is used for determining the relation between the three-dimensional lidar coordinates of a laser point and the image-space auxiliary coordinates of its homonymous image point as

[X_I Y_I Z_I]^T = R_IH · [X_H Y_H Z_H]^T + C_IH

where [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; and R_IH and C_IH are, respectively, the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
the coordinate conversion unit is used for obtaining the conversion relation between the image-space auxiliary coordinates of an image-data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW · [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW · R_HK · R_IH

C_IW = T_KW + R_KW · T_HK + R_KW · R_HK · C_IH
the collinearity equation determining unit is configured to obtain the collinearity equations based on the conversion relation:

x - x_0 = -f · [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f · [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

where x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; and a_i, b_i, c_i (i = 1, 2, 3) are the elements of the rotation matrix determined by the three angular exterior orientation elements of the image;
and the initial registration unit is used for performing initial registration on the laser point cloud and the image data through the collinearity equation.
Further, the high-precision registration module comprises a first iteration unit, a weight selection iteration unit, a judgment unit and an iteration termination unit;
the first iteration unit is used for starting with least-squares iteration according to the error equation of the collinearity equations, the first iteration taking the weight matrix of the POS position-attitude data values as the identity matrix;
the weight-selection iteration unit is used for applying the residuals of the first iteration, calculating the weight function for the second iteration, and performing the improved Danish weight-selection iteration;
the judging unit is used for calculating the unit-weight error; if it does not meet the condition, the process loops back to the weight-selection iteration unit until the accuracy meets the requirement;
and the iteration termination unit is used for taking the third iteration as the demarcation point according to the error equation of the collinearity equations, the iteration terminating when the difference between the corrections to the angular elements of the image-data key frame in two successive iterations is no more than 0.1′.
The invention has the beneficial effects that: by combining the iterative closest point algorithm, taking the high-precision registration value as its initial value, improving the accuracy of the registration initial value, and performing iterative closest point registration between the image-data point cloud and the laser point cloud, high-precision registration is obtained, solving the problem that target data acquired with laser point cloud data or image data alone cannot be obtained comprehensively and conveniently.
Drawings
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings, in which:
fig. 1 is a schematic flow chart of a registration method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a registration system according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples.
It should be noted that these examples are only for illustrating the present invention, and not for limiting the present invention, and the simple modification of the method based on the idea of the present invention is within the protection scope of the present invention.
Referring to fig. 1, a registration method includes:
S100, carrying out lens distortion correction on the digital camera, acquiring internal and external parameters of the digital camera, and carrying out distortion correction on the extracted image data;
s200, denoising and simplifying three-dimensional laser point cloud;
s300, acquiring a time stamp of the image data shot by the digital camera and the three-dimensional laser radar scanning point cloud at the same moment according to a time synchronization principle by using position, posture and time information data in the POS sensor, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS data on the basis of the same time stamp, and performing initial registration based on a collinear equation model;
s400, improving the precision of the initial registration value by using a weight selection iterative method based on Robust estimation, determining a high-precision registration value, taking the high-precision registration value as the initial value of the iterative closest point algorithm by combining the iterative closest point algorithm, and performing iterative closest point registration by using the image data point cloud and the laser point cloud to obtain high-precision registration.
As an implementation, the S100 may be:
s101, fixing the checkerboard image and the digital camera, and acquiring a plurality of images from different directions;
s102, setting the upper right corner of the checkerboard image as an original point of image coordinates, acquiring the number of squares on a checkerboard according to the length and width of the image, and detecting an angular point;
s103, initializing parameters and performing nonlinear optimization, and calculating intrinsic parameters and distortion coefficients of the digital camera;
and S104, calculating distortion mapping based on the internal parameters and the distortion coefficients, and correcting the image data.
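S101 to S104 follow the familiar checkerboard calibration workflow. Once the distortion coefficients are known, correcting image measurements amounts to inverting the distortion model. The sketch below assumes a simple two-coefficient radial model (the patent does not spell out its model, so the coefficients k1, k2 and the fixed-point inversion are assumptions):

```python
import numpy as np

def undistort_points(pts, k1, k2):
    """Invert the radial model x_d = x_u * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration on normalized image coordinates (N, 2)."""
    pts = np.asarray(pts, dtype=float)
    und = pts.copy()
    for _ in range(20):
        # r^2 is re-evaluated at the current undistorted estimate
        r2 = (und ** 2).sum(axis=1, keepdims=True)
        und = pts / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return und
```

For typical small distortion coefficients the iteration converges in a handful of steps; production code would instead use the calibrated mapping produced in S104.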
As an implementation, the S200 includes:
s201, calculating the distance from each point in the three-dimensional laser point cloud to its several nearest neighboring points;
s202, calculating the integral average distance and variance in the point cloud according to the distance;
s203, taking any point whose distance is greater than or equal to the sum of the overall average distance and m times the standard deviation as a noise point, and removing it:

u_i ≥ t, t = u + mσ

where u is the overall average distance, m is the standard-deviation multiple, σ is the standard deviation, and u_i is the distance for point i;
s204, setting a minimum voxel scale, putting the original point cloud into a three-dimensional grid, and replacing each minimum voxel containing a plurality of points with one of the points to finish voxel down-sampling.
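A compact sketch of S201 to S204 (statistical noise removal followed by voxel down-sampling); the neighbour count k, multiple m and voxel size are free parameters here, and the brute-force neighbour search is an illustrative simplification:

```python
import numpy as np

def denoise_and_downsample(points, k=6, m=1.0, voxel=0.5):
    """Statistical outlier removal followed by voxel down-sampling.

    A point is dropped when its mean distance to its k nearest
    neighbours reaches t = u + m * sigma, where u and sigma are the mean
    and standard deviation of that statistic over the whole cloud.
    """
    pts = np.asarray(points, dtype=float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # exclude each point itself
    knn = np.sort(np.sqrt(d2), axis=1)[:, :k]
    u_i = knn.mean(axis=1)                   # per-point mean neighbour distance
    u, sigma = u_i.mean(), u_i.std()
    kept = pts[u_i < u + m * sigma]
    # voxel down-sampling: keep one representative point per occupied voxel
    keys = np.floor(kept / voxel).astype(int)
    _, first = np.unique(keys, axis=0, return_index=True)
    return kept[np.sort(first)]
```

Libraries such as Open3D provide equivalent operations with spatial indexing for large clouds; this version only shows the logic of the two steps.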
As one possible embodiment, the S300 includes:
s301, determining the relation between the three-dimensional lidar coordinates of a laser point and the image-space auxiliary coordinates of its homonymous image point as

[X_I Y_I Z_I]^T = R_IH · [X_H Y_H Z_H]^T + C_IH

where [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; and R_IH and C_IH are, respectively, the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
s302, obtaining the conversion relation between the image-space auxiliary coordinates of the image-data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW · [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW · R_HK · R_IH

C_IW = T_KW + R_KW · T_HK + R_KW · R_HK · C_IH
s303, based on the conversion relation, obtaining the collinearity equations as:

x - x_0 = -f · [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f · [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

where x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; and a_i, b_i, c_i (i = 1, 2, 3) are the elements of the rotation matrix determined by the three angular exterior orientation elements of the image;
and S304, performing initial registration of the laser point cloud and the image data through the collinearity equations.
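S303 and S304 project laser points into the image plane with the collinearity equations. A minimal sketch follows; the row layout of the rotation matrix and the sign convention follow the standard photogrammetric collinearity form, so treat this as an illustration rather than the patent's code:

```python
import numpy as np

def project_collinearity(X, Xc, Rmat, f, x0=0.0, y0=0.0):
    """Project object-space points X (N, 3) into the image plane.

    Rmat rows are (a1 b1 c1; a2 b2 c2; a3 b3 c3); Xc is the camera
    centre; f is the principal distance; x0, y0 the principal point.
    """
    d = (X - Xc) @ Rmat.T          # rotated, camera-centred coordinates
    x = x0 - f * d[:, 0] / d[:, 2]
    y = y0 - f * d[:, 1] / d[:, 2]
    return np.column_stack([x, y])
```

Comparing these projected positions with the measured image coordinates of the homonymous points yields the residuals used by the subsequent robust adjustment.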
As an implementation, the S400 includes:
the criterion for the weight-selection iteration may be expressed as:

Σ p_j u_j² = min
the weight formula is:

p_{t+1} = p_t · f(u), t = 0, 1, 2, …
replacing the residual with the normalized residual as the argument of the weight function, the normalized residual being:

ũ_j = u_j / (m_0 · √((Q_uu P)_jj)), j = 1, 2, …, m

where ũ_j is the normalized residual, u_j is the residual of observation j, m is the number of observations, m_0 is the unit-weight standard deviation of the observations, and (Q_uu P)_jj is the j-th main-diagonal element of the matrix Q_uu P, with

Q_uu P = E − A · Q_xx · A^T · P

where E is the identity matrix and A is the coefficient matrix of the observation equations;
so the weight-selection iterative weight function is:

f(ũ_j) = 1, for |ũ_j| ≤ c

f(ũ_j) = exp(c − |ũ_j|), for |ũ_j| > c

with the weight taken as P = P_0 · f(ũ_j), where P_0 is the weight factor and c is a threshold constant;
the error equation of the collinearity equations is then:

V = A · X̂ − L, with weight matrix P
s401, starting with least-squares iteration according to the error equation of the collinearity equations, the first iteration taking the weight matrix of the POS position-attitude data values as the identity matrix;
s402, applying the residuals of the first iteration, calculating the weight function for the second iteration, and performing the improved Danish weight-selection iteration;
s403, calculating the unit-weight error; if it does not meet the condition, looping back to S402 until the accuracy meets the requirement;
s404, according to the error equation of the collinearity equations, taking the third iteration as the demarcation point, the iteration terminating when the difference between the corrections to the angular elements of the image-data key frame in two successive iterations is no more than 0.1′.
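S401 to S403 amount to iteratively reweighted least squares. The sketch below uses a Danish-style weight function of the normalized residual; the exact weight function, the threshold c and the termination test of the patent's improved Danish method may differ, so this is an assumption-laden illustration:

```python
import numpy as np

def irls_danish(A, l, c=2.0, iters=10):
    """Iteratively reweighted least squares for V = A x - l.

    The first pass uses the identity weight matrix (as in S401); each
    later pass reweights every observation by a Danish-style function
    of its normalized residual: w = 1 if |u| <= c, else exp(c - |u|).
    """
    m, n = A.shape
    w = np.ones(m)
    for _ in range(iters):
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)
        v = A @ x - l
        # unit-weight error; tiny offset guards against a perfect fit
        s0 = np.sqrt((w * v ** 2).sum() / max(m - n, 1)) + 1e-12
        u = np.abs(v) / s0                        # normalized residuals
        w = np.where(u <= c, 1.0, np.exp(c - u))  # down-weight outliers
    return x
```

Gross errors in the POS-derived observations are driven toward zero weight, so the adjusted parameters converge to the values supported by the consistent observations.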
Referring to fig. 2, a registration system includes an image data correction module 100, a denoising and simplification processing module 200, an initial registration module 300, and a high-precision registration module 400;
the image data correction module 100 is configured to perform lens distortion correction on the digital camera, acquire internal and external parameters of the digital camera, and perform distortion correction on the extracted image data;
the denoising simplification processing module 200 is used for denoising and simplifying the three-dimensional laser point cloud;
the initial registration module 300 is configured to obtain a timestamp of the image data captured by the digital camera and the three-dimensional lidar scanning point cloud at the same time according to a time synchronization principle by using position, posture and time information data in the POS sensor, extract a registration initial value corresponding to the image data and the laser point cloud from the POS data on the basis of the same timestamp, and perform initial registration based on a collinearity equation model;
the high-precision registration module 400 is configured to improve the accuracy of the registration initial value with a Robust-estimation-based weight-selection iteration method, determine a high-precision registration value, and, in combination with the iterative closest point algorithm, take the high-precision registration value as the algorithm's initial value and perform iterative closest point registration between the image-data point cloud and the laser point cloud to obtain high-precision registration.
As an implementation, the image data rectification module 100 includes an image acquisition unit 101, a corner detection unit 102, a calculation unit 103, and a correction unit 104:
the image acquisition unit 101 is configured to fix the checkerboard image and the digital camera, and acquire a plurality of images from different directions;
the corner detection unit 102 is configured to set the upper-right corner of the checkerboard image as the origin of the image coordinate system, obtain the number of squares on the checkerboard from the length and width of the image, and perform corner detection;
the calculation unit 103 is configured to perform parameter initialization and nonlinear optimization, and calculate intrinsic parameters and distortion coefficients of the digital camera;
the correcting unit 104 is configured to calculate a distortion map based on the internal parameter and the distortion coefficient, and correct the image data.
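The correction step of module 100 can be illustrated with the Brown radial/tangential model commonly used with checkerboard calibration (this is the model whose coefficients `k1, k2, p1, p2` routines such as OpenCV's `calibrateCamera` estimate). The NumPy sketch below is an illustration under that assumption, with invented function names and coefficient values: it shows how the distortion map is inverted per point by fixed-point iteration, which is what "calculate a distortion map and correct the image data" amounts to pixel by pixel.

```python
import numpy as np

def distort(p, k1, k2, p1, p2):
    """Apply the Brown radial (k1, k2) + tangential (p1, p2) model
    to a point in normalized image coordinates."""
    x, y = p
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.array([xd, yd])

def undistort(pd, k1, k2, p1, p2, iters=20):
    """Invert the distortion by fixed-point iteration:
    repeatedly subtract the current distortion offset from the observed point."""
    p = np.array(pd, dtype=float)
    for _ in range(iters):
        d = distort(p, k1, k2, p1, p2) - p   # current distortion offset
        p = np.array(pd) - d
    return p
```

For the small distortions of a calibrated lens the iteration is a contraction and converges in a handful of steps; the principal point is a fixed point of the model (zero distortion at the image center in normalized coordinates).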
As an implementable embodiment, the denoising and simplification processing module 200 includes a distance calculation unit 201, an overall average distance and variance calculation unit 202, a removal unit 203, and a down-sampling unit 204;
the distance calculation unit 201 is configured to calculate the distance from each point in the three-dimensional laser point cloud to each of its several nearest neighboring points;
the ensemble average distance and variance calculating unit 202 is configured to calculate an ensemble average distance and variance in the point cloud according to the distance;
the removing unit 203 is configured to regard a point whose distance is greater than or equal to the sum of the overall average distance and a multiple of the standard deviation as a noise point, and to remove it; that is, a point with distance u_i is removed when

u_i >= t, where t = u + m * sigma

where u is the overall average distance, m is the standard-deviation multiple, and sigma is the standard deviation;
the down-sampling unit 204 is configured to set a minimum voxel scale, place the original point cloud into a three-dimensional grid, and replace each minimum voxel including a plurality of points with one of the points to complete voxel down-sampling.
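Units 201 to 204 together amount to statistical outlier removal followed by voxel down-sampling. The following compact NumPy sketch is an illustration (the function name and parameter defaults are invented; a production pipeline would use a k-d tree or a library such as Open3D rather than a brute-force distance matrix):

```python
import numpy as np

def denoise_and_downsample(cloud, k=8, m=1.0, voxel=0.05):
    """Statistical outlier removal (threshold t = u + m*sigma),
    then voxel down-sampling of the surviving points."""
    # mean distance from each point to its k nearest neighbours (brute force)
    d = np.linalg.norm(cloud[:, None, :] - cloud[None, :, :], axis=2)
    d.sort(axis=1)
    mean_d = d[:, 1:k + 1].mean(axis=1)     # column 0 is the zero self-distance
    u, sigma = mean_d.mean(), mean_d.std()
    kept = cloud[mean_d < u + m * sigma]    # points at or beyond t are noise
    # voxel down-sampling: one representative point per occupied voxel
    keys = np.floor(kept / voxel).astype(int)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return kept[np.sort(idx)]
```

The representative chosen here is simply the first point that falls in each voxel, matching the "replace each minimum voxel containing a plurality of points with one of the points" wording; a centroid per voxel is an equally common choice.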
As an implementation, the initial registration module 300 includes a coordinate relation determining unit 301, a coordinate converting unit 302, a collinearity equation determining unit 303, and a primary registration unit 304;
the coordinate relation determining unit 301 is configured to determine the relation between the three-dimensional lidar coordinates of the laser point cloud and the image-space auxiliary coordinates of the same-name image points, where:

[X_I Y_I Z_I]^T = R_IH [X_H Y_H Z_H]^T + C_IH

in the formula, [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; R_IH and C_IH respectively represent the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
the coordinate transformation unit 302 is configured to obtain the transformation relationship between the image-space auxiliary coordinates of the image data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW R_HK R_IH

C_IW = T_KW + R_KW T_HK + R_KW R_HK C_IH
the collinearity equation determining unit 303 is configured to obtain the collinearity equation based on the conversion relationship, where the collinearity equation is:

x - x_0 = -f [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

in the formula: x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; a_i, b_i, c_i (i = 1, 2, 3) are the direction cosines determined by the three angular exterior orientation elements of the image;
the initial registration unit 304 is configured to perform initial registration on the laser point cloud and the image data through the collinearity equation.
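Given the rotation matrix and camera center, the collinearity projection used by the initial registration unit can be written down directly. The NumPy sketch below is illustrative (the function name is invented); the rows of `R` are the direction cosines (a1 b1 c1 / a2 b2 c2 / a3 b3 c3) from the symbol list above:

```python
import numpy as np

def collinearity_project(P, cam_center, R, f, x0=0.0, y0=0.0):
    """Project an object-space point P to image-plane coordinates (x, y)
    with the collinearity equations. R rows hold the direction cosines."""
    dX = np.asarray(P, float) - np.asarray(cam_center, float)
    num_x, num_y, den = R @ dX        # the three bracketed sums
    x = x0 - f * num_x / den
    y = y0 - f * num_y / den
    return x, y
```

With the identity rotation and the camera at the origin, a point on the axis projects to the principal point, and a lateral offset scales by -f/Z, which is a quick sanity check on the sign convention.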
As an implementation manner, the high-precision registration module 400 includes a first iteration unit 401, a weight selection iteration unit 402, a decision unit 403, and an iteration termination unit 404;
the first iteration unit 401 is configured to start the least-squares iteration from the error equation of the collinearity equation, taking the weight matrix of the POS position and attitude observations as the identity matrix in the first iteration;
the weight selection iteration unit 402 is configured to apply the residuals of the first iteration, calculate the weight function for the second iteration, and perform the improved Danish weight-selection iteration;
the determination unit 403 is configured to calculate the error in the unit weight and, if the error does not satisfy the condition, return to the weight selection iteration unit 402 until the accuracy satisfies the requirement;
the iteration termination unit 404 is configured to take the third iteration as the demarcation point according to the error equation of the collinearity equation; the iteration terminates when the difference between the corrections to the angular elements of the image data key frame computed in two successive iterations is less than 0.1'.
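Units 401 to 404 describe an iteratively reweighted least-squares loop. The sketch below illustrates the idea on a generic linear error equation A x = l + v: the first pass uses the identity weight matrix, and later passes exponentially damp observations whose residual exceeds c times the RMS residual. The patent does not spell out its "improved Danish" weight function, so the particular form used here (and the cutoff c = 2) is an assumption for illustration only.

```python
import numpy as np

def danish_irls(A, l, c=2.0, iters=10):
    """Robust least squares with a Danish-style weight-selection iteration.
    Pass 1 uses unit weights; later passes damp suspected gross errors."""
    w = np.ones(len(l))                      # first iteration: identity weights
    x = None
    for _ in range(iters):
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)   # weighted normal equations
        v = l - A @ x                                    # residuals
        s = max(np.sqrt((w * v * v).sum() / (len(l) - A.shape[1])), 1e-12)
        # keep well-behaved observations at weight 1; shrink outlier weights
        w = np.where(np.abs(v) > c * s, np.exp(-(v / (c * s)) ** 2), 1.0)
    return x
```

On a line fit contaminated with one gross error, the outlier's weight collapses after two or three passes and the solution snaps back to the clean observations, which is exactly the behaviour the weight-selection iteration relies on to refine the POS-derived registration value.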
Finally, it is noted that the above-mentioned embodiments illustrate rather than limit the invention, and that, while the invention has been described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A registration method is characterized by comprising
Carrying out lens distortion correction on the digital camera, acquiring internal and external parameters of the digital camera, and carrying out distortion correction on the extracted image data;
denoising and simplifying the three-dimensional laser point cloud;
acquiring timestamps of image data shot by a digital camera and three-dimensional laser radar scanning point cloud at the same time according to a time synchronization principle by using position, posture and time information data in a POS sensor, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS sensor on the basis of the same timestamp, and performing initial registration based on a collinear equation model;
the method comprises the steps of improving the accuracy of an initial registration value by using a weight selection iteration method based on Robust estimation, determining a high-accuracy registration value, taking the high-accuracy registration value as the initial value of an iterative closest point algorithm by combining the iterative closest point algorithm, and performing iterative closest point registration by using image data point cloud and laser point cloud to obtain high-accuracy registration.
2. The registration method of claim 1, wherein the performing lens distortion correction on the digital camera to obtain internal and external parameters of the digital camera and performing distortion correction on the extracted key frames of the image data comprises:
fixing the checkerboard image and the digital camera, and acquiring a plurality of images from different directions;
setting the upper right corner of the checkerboard image as the origin of image coordinates, acquiring the number of squares on the checkerboard according to the length and width of the image, and detecting the corner points;
initializing parameters and carrying out nonlinear optimization, and calculating internal parameters and distortion coefficients of the digital camera;
and calculating distortion mapping based on the internal parameters and the distortion coefficients, and correcting the image data.
3. The registration method according to claim 1, wherein the denoising and compacting process for the three-dimensional laser point cloud comprises:
calculating the distance from each point in the three-dimensional laser point cloud to each of its several nearest neighboring points;
calculating the integral average distance and variance in the point cloud according to the distance;
taking a point whose distance is greater than or equal to the sum of the overall average distance and a multiple of the standard deviation as a noise point, and removing it;
and setting a minimum voxel scale, putting the original point cloud into a three-dimensional grid, and replacing each minimum voxel containing a plurality of points with one of the points to finish voxel down-sampling.
4. The registration method according to claim 1,
the method comprises the following steps of acquiring a timestamp of image data shot by a digital camera and a three-dimensional laser radar scanning point cloud at the same moment according to a time synchronization principle by using position, posture and time information data in a POS sensor, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS data on the basis of the same timestamp, and performing initial registration based on a collinear equation model, wherein the method comprises the following steps:
determining the relation between the three-dimensional lidar coordinates of the laser point cloud and the image-space auxiliary coordinates of the same-name image points as

[X_I Y_I Z_I]^T = R_IH [X_H Y_H Z_H]^T + C_IH

in the formula, [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; R_IH and C_IH respectively represent the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
obtaining the conversion relation between the image-space auxiliary coordinates of the image data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW R_HK R_IH

C_IW = T_KW + R_KW T_HK + R_KW R_HK C_IH
based on the conversion relationship, the collinearity equation is obtained as:

x - x_0 = -f [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

in the formula: x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; a_i, b_i, c_i (i = 1, 2, 3) are the direction cosines determined by the three angular exterior orientation elements of the image;
and carrying out primary registration on the laser point cloud and the image data through the collinearity equation.
5. The registration method according to claim 1, wherein the registration initial value precision is improved by using a weighting iterative method based on Robust estimation, a high-precision registration value is determined, the high-precision registration value is used as an initial value of an iterative closest point algorithm in combination with the iterative closest point algorithm, and the iterative closest point registration is performed by using an image data point cloud and a laser point cloud to obtain the high-precision registration, and the registration method comprises:
s401, starting with least square iteration according to an error equation of a collinearity equation, and applying a weight matrix of POS position attitude data values as a unit matrix in the first iteration;
s402, applying the residual error of the first iteration, calculating a weight function of the second iteration, and performing improved Danish weight selection iteration;
s403, calculating errors in the unit weight, and if the errors do not meet the conditions, circulating to S402 until the precision meets the requirements;
S404, according to the error equation of the collinearity equation, the third iteration is taken as the demarcation point, and the iteration terminates when the difference between the corrections to the angular elements of the image data key frame computed in two successive iterations is less than 0.1'.
6. A registration system is characterized by comprising an image data correction module, a denoising simplification processing module, an initial registration module and a high-precision registration module;
the image data correction module is used for carrying out lens distortion correction on the digital camera, acquiring internal and external parameters of the digital camera and carrying out distortion correction on the extracted image data;
the denoising simplification processing module is used for denoising and simplifying the three-dimensional laser point cloud;
the initial registration module is used for acquiring a time stamp of the image data shot by the digital camera and the three-dimensional laser radar scanning point cloud at the same time by using position, posture and time information data in the POS sensor according to a time synchronization principle, extracting a registration initial value corresponding to the image data and the laser point cloud from the POS data on the basis of the same time stamp, and performing initial registration based on a collinearity equation model;
the high-precision registration module is used for improving the precision of the initial registration value by using a weighting iteration method based on Robust estimation, determining a high-precision registration value, combining an iteration closest point algorithm, taking the high-precision registration value as the initial value of the iteration closest point algorithm, and performing iteration closest point registration by using the image data point cloud and the laser point cloud to obtain high-precision registration.
7. The registration system of claim 6, wherein the image data rectification module comprises an image acquisition unit, a corner detection unit, a calculation unit, and a correction unit:
the image acquisition unit is used for fixing the checkerboard image and the digital camera and acquiring a plurality of images from different directions;
the corner detection unit is used for setting the upper right corner of the checkerboard image as the origin of image coordinates, acquiring the number of squares on the checkerboard according to the length and width of the image, and performing corner detection;
the calculation unit is used for initializing parameters and carrying out nonlinear optimization, and calculating intrinsic parameters and distortion coefficients of the digital camera;
and the correction unit is used for calculating distortion mapping based on the internal parameters and the distortion coefficients and correcting the image data.
8. The registration system of claim 6, wherein the de-noising refinement processing module comprises a distance calculation unit, an ensemble average distance and variance calculation unit, a removal unit, and a down-sampling unit;
the distance calculation unit is used for calculating the distance from each point in the three-dimensional laser point cloud to each of its several nearest neighboring points;
the integral average distance and variance calculation unit is used for calculating the integral average distance and variance in the point cloud according to the distance;
the removing unit is used for taking a point whose distance is greater than or equal to the sum of the overall average distance and a multiple of the standard deviation as a noise point, and removing it;
the down-sampling unit is used for setting a minimum voxel scale, placing the original point cloud into a three-dimensional grid, and replacing each minimum voxel containing a plurality of points with one of the points to finish voxel down-sampling.
9. The registration system of claim 6, wherein the initial registration module comprises a coordinate relationship determination unit, a coordinate transformation unit, a collinearity equation determination unit, and an initial registration unit;
the coordinate relation determining unit is used for determining the relation between the three-dimensional lidar coordinates of the laser point cloud and the image-space auxiliary coordinates of the same-name image points as

[X_I Y_I Z_I]^T = R_IH [X_H Y_H Z_H]^T + C_IH

in the formula, [X_I Y_I Z_I]^T are the image-space auxiliary coordinates of the image point; [X_H Y_H Z_H]^T are the coordinates of the laser point in the three-dimensional lidar coordinate system; R_IH and C_IH respectively represent the rotation matrix and the translation vector between the image-space auxiliary coordinate system and the three-dimensional lidar coordinate system;
the coordinate conversion unit is used for obtaining the conversion relation between the image-space auxiliary coordinates of the image data key frame and the WGS-84 coordinate system:

[X_W Y_W Z_W]^T = R_IW [X_I Y_I Z_I]^T + C_IW

R_IW = R_KW R_HK R_IH

C_IW = T_KW + R_KW T_HK + R_KW R_HK C_IH
the collinearity equation determining unit is configured to obtain the collinearity equation based on the conversion relationship, where the collinearity equation is:

x - x_0 = -f [a_1(X - X_C) + b_1(Y - Y_C) + c_1(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

y - y_0 = -f [a_2(X - X_C) + b_2(Y - Y_C) + c_2(Z - Z_C)] / [a_3(X - X_C) + b_3(Y - Y_C) + c_3(Z - Z_C)]

in the formula: x and y are the image-plane coordinates of the image point; x_0, y_0 and f are the interior orientation elements of the image; X_C, Y_C, Z_C are the object-space coordinates of the camera center; X, Y, Z are the object-space coordinates of the laser point; a_i, b_i, c_i (i = 1, 2, 3) are the direction cosines determined by the three angular exterior orientation elements of the image;
and the initial registration unit is used for performing initial registration on the laser point cloud and the image data through the collinearity equation.
10. The registration system of claim 6, wherein the high-precision registration module comprises a first iteration unit, a weight selection iteration unit, a decision unit, and an iteration termination unit;
the first iteration unit is used for starting the least square iteration according to an error equation of a collinearity equation, and the first iteration uses a weight matrix of the POS position attitude data value as a unit matrix;
the weight selection iteration unit is used for applying the residual error of the first iteration, calculating the weight function of the second iteration and performing the improved Danish weight selection iteration;
the judging unit is used for calculating the error in the unit weight and, if the error does not satisfy the condition, returning to the weight selection iteration unit until the accuracy satisfies the requirement;
and the iteration termination unit is used for taking the third iteration as the demarcation point according to the error equation of the collinearity equation; the iteration terminates when the difference between the corrections to the angular elements of the image data key frame computed in two successive iterations is less than 0.1'.
CN202211409046.XA 2022-11-11 2022-11-11 Registration method and system Active CN115690380B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/131556 WO2024098428A1 (en) 2022-11-11 2022-11-11 Registration method and system
CN202211409046.XA CN115690380B (en) 2022-11-11 2022-11-11 Registration method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211409046.XA CN115690380B (en) 2022-11-11 2022-11-11 Registration method and system

Publications (2)

Publication Number Publication Date
CN115690380A true CN115690380A (en) 2023-02-03
CN115690380B CN115690380B (en) 2023-07-21

Family

ID=85052974

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211409046.XA Active CN115690380B (en) 2022-11-11 2022-11-11 Registration method and system

Country Status (2)

Country Link
CN (1) CN115690380B (en)
WO (1) WO2024098428A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100973052B1 (en) * 2009-04-24 2010-07-30 서울시립대학교 산학협력단 Automatic matching method of digital aerial images using lidar data
CN109410256A (en) * 2018-10-29 2019-03-01 北京建筑大学 Based on mutual information cloud and image automatic, high precision method for registering
CN112465849A (en) * 2020-11-27 2021-03-09 武汉大学 Registration method for laser point cloud and sequence image of unmanned aerial vehicle


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117518197A (en) * 2024-01-08 2024-02-06 太原理工大学 Contour marking method for underground coal mine tunneling roadway
CN117518197B (en) * 2024-01-08 2024-03-26 太原理工大学 Contour marking method for underground coal mine tunneling roadway

Also Published As

Publication number Publication date
WO2024098428A1 (en) 2024-05-16
CN115690380B (en) 2023-07-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant