CN107316325B - Airborne laser point cloud and image registration fusion method based on image registration - Google Patents

Airborne laser point cloud and image registration fusion method based on image registration

Info

Publication number
CN107316325B
CN107316325B (application CN201710422435.9A)
Authority
CN
China
Prior art keywords
image
point cloud
pixel
aerial
orthographic
Prior art date
Legal status
Active
Application number
CN201710422435.9A
Other languages
Chinese (zh)
Other versions
CN107316325A (en)
Inventor
裴海龙
黄荣恩
庄兆殿
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201710422435.9A priority Critical patent/CN107316325B/en
Publication of CN107316325A publication Critical patent/CN107316325A/en
Application granted granted Critical
Publication of CN107316325B publication Critical patent/CN107316325B/en


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/80 - Geometric correction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30181 - Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image registration-based fusion method for airborne laser point clouds and images, which fuses the three-dimensional terrain point cloud acquired by an airborne LiDAR system with aerial images to generate a true color three-dimensional point cloud picture. The method comprises the following steps: generate a point cloud orthographic projection image A from the three-dimensional laser point cloud and establish a point cloud-pixel index; generate an aerial orthographic stitching image B from the aerial images; perform image registration between image A and image B, converting the pixel coordinates of the point cloud orthographic projection image A into the pixel coordinate system of the aerial orthographic stitching image B; and back-project through the point cloud-pixel index, find the pixel coordinate of the aerial orthographic stitching image B corresponding to each point, assign that pixel's color value to the point, and fuse the result into a true color point cloud picture. The method does not require the point cloud data and the image data to be acquired synchronously, and even aerial images or remote sensing orthographic images produced by a third party can be used as the registration image.

Description

Airborne laser point cloud and image registration fusion method based on image registration
Technical Field
The invention relates to the technical field of unmanned aerial vehicle aerial surveying and mapping, in particular to an airborne laser point cloud and image registration fusion method based on image registration.
Background
Laser scanning measurement (Light Detection and Ranging, abbreviated LiDAR), also called real-scene replication technology, is one of the newest three-dimensional data acquisition and scene modeling technologies. Airborne LiDAR mounts the system on an aircraft for earth observation; it integrates a global positioning system, an inertial navigation system and a laser scanner, and can quickly and efficiently acquire accurate three-dimensional space coordinates of sampling points on ground-object surfaces. In recent years, with the rapid development of airborne laser scanning measurement, the technology has found wide application in digital cities, disaster monitoring, coastal engineering, forestry surveys and other areas. However, it acquires only the discrete three-dimensional space coordinates and reflection-intensity information of the target, and cannot capture the target's real texture. With the rapid development of digital imaging technology and image sensors, acquiring high-definition visible-light images is no longer difficult, and the corresponding image processing and fusion techniques are widely applied in machine vision and remote-sensing aerial photography. The texture information carried by images is complementary to the three-dimensional information acquired by laser scanning.
To generate a true color point cloud picture, the point cloud and the image must be registered and fused. Registration and fusion of three-dimensional point clouds with aerial image data is a data-fusion problem between different types of sensors and belongs to the classical two-dimensional/three-dimensional fusion problem. According to differences in the registration process and the registration features, current 2D-3D registration methods can be classified as: (1) joint calibration of a laser scanner and camera rigidly mounted as one unit; (2) feature-based 3D-2D registration; (3) 3D-3D registration based on a stereo pair. The first method requires the camera and scanner to be fixed together and to collect synchronously, and needs precise calibration, so it has certain limitations. The second requires fairly obvious features (points, lines, planes) in both the point cloud and the image as registration elements, and the extraction accuracy of these elements strongly influences the registration result. The third places requirements on the tilt angle and overlap of the captured images; the algorithm and workflow for generating an image-derived point cloud are complex, the accuracy is hard to guarantee, and the requirement of generating true color point clouds quickly and efficiently cannot be met. For UAV airborne aerial surveying and mapping, when the scanned object is terrain with few obvious features, such as wasteland, mountain peaks or a coastline, existing methods cannot fully solve the problem of fast and efficient registration and fusion of the data.
Disclosure of Invention
The invention aims to solve the defects in the prior art and provides an airborne laser point cloud and image registration fusion method based on image registration.
The purpose of the invention can be achieved by adopting the following technical scheme:
an airborne laser point cloud and image registration fusion method based on image registration comprises the following steps:
S1, perform orthographic projection on the three-dimensional laser point cloud, generating a point cloud orthographic projection image A and establishing a point cloud-pixel index;
S2, perform distortion correction on the aerial images shot by the camera, then orthorectify and stitch the corrected images to generate an aerial orthographic stitching image B;
S3, take the point cloud orthographic projection image A as the floating image and the aerial orthographic stitching image B as the reference image, perform image registration between them, solve the transformation parameters with an image registration algorithm, and transform the pixel coordinates of the point cloud orthographic projection image A into the coordinate system of the aerial orthographic stitching image B;
S4, back-project the pixels of the point cloud orthographic projection image A through the point cloud-pixel index established during projection, find the pixel coordinate of the aerial orthographic stitching image B corresponding to each point, and assign the color value of the aerial orthographic stitching image B to the corresponding point to generate a true color point cloud picture.
Further, the process of generating the point cloud orthographic projection image A in step S1 includes: S11, a sampling process; S12, a quantization process; and S13, an interpolation process.
Further, the sampling process includes:
S111, select a projection plane, establish a pixel coordinate system and calculate the minimum bounding rectangle of all projected points in the projection plane: take the O-XY plane of the north-east-down coordinate system as the projection plane, the upper-left corner of the bounding rectangle as the coordinate-system origin O, the due-south direction as the V axis and the due-east direction as the U axis, establishing the pixel coordinate system O-UV;
S112, calculate the variation ranges of the X and Y coordinates of the point cloud, where $Y_{\max}$ and $Y_{\min}$ denote the maximum and minimum Y-axis coordinates and $X_{\max}$ and $X_{\min}$ the maximum and minimum X-axis coordinates; with the projection image width W set manually, the pixel size S and the image height H are calculated as:
$$S = \frac{Y_{\max} - Y_{\min}}{W}, \qquad H = \left\lceil \frac{X_{\max} - X_{\min}}{S} \right\rceil$$
s113, the corresponding pixel coordinate of each point cloud under the projection image is (u, v), and the calculation formula is as follows:
$$u = \frac{Y - Y_{\min}}{S}, \qquad v = \frac{X_{\max} - X}{S}$$
S114, round the pixel coordinates with a nearest-neighbor method, assigning each point to its nearest integer pixel (u, v), and at the same time record the correspondence between point cloud coordinates and pixel coordinates to generate the point cloud-pixel index.
Further, the quantization process specifically includes:
the value of each pixel point is filled by quantization: select elevation for quantization to generate an elevation projection image, or select laser reflection intensity for quantization to generate an intensity projection image.
Further, the interpolation process specifically includes:
for a blank pixel in the image, interpolate from the surrounding pixels with a high-order interpolation method, interpolate the point cloud at the same time, and add the correspondence to the point cloud-pixel index, where the high-order interpolation function is:
$$f(x) = \begin{cases} 1 - 2|x|^2 + |x|^3, & 0 \le |x| < 1 \\ 4 - 8|x| + 5|x|^2 - |x|^3, & 1 \le |x| < 2 \\ 0, & |x| \ge 2 \end{cases}$$
where | x | is a distance value from a surrounding pixel to (u, v).
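Since the original reproduces the interpolation function only as a figure, the sketch below assumes the common cubic convolution (Keys) kernel with a = -1, a standard high-order choice consistent with weighting neighbors by their distance |x|; the function and helper names are illustrative, not from the patent:

```python
import numpy as np

def cubic_kernel(x, a=-1.0):
    """Cubic convolution weight f(x); an assumed stand-in for the
    patent's high-order interpolation function."""
    x = abs(x)
    if x <= 1.0:
        return (a + 2.0) * x**3 - (a + 3.0) * x**2 + 1.0
    if x < 2.0:
        return a * x**3 - 5.0 * a * x**2 + 8.0 * a * x - 4.0 * a
    return 0.0

def cubic_interp1d(samples, t):
    """Interpolate a 1-D sequence at fractional position t using the
    four neighbours floor(t)-1 .. floor(t)+2 (which must exist)."""
    i = int(np.floor(t))
    return sum(samples[i + m] * cubic_kernel(t - (i + m)) for m in (-1, 0, 1, 2))

# Linear data is reproduced exactly: value at 1.5 on [0, 1, 2, 3, 4] is 1.5.
value = cubic_interp1d([0.0, 1.0, 2.0, 3.0, 4.0], 1.5)
```

The kernel weights sum to one over the four-neighbor window, so constant and linear data pass through unchanged, which is the property that makes it usable for filling blank projection pixels.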
Further, the transformation model used for image registration in the image registration algorithm of step S3 is an affine transformation model, and the homogeneous equation of the affine transformation model is expressed as follows:
$$\begin{bmatrix} x_B \\ y_B \\ 1 \end{bmatrix} = M \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & b_1 \\ a_{21} & a_{22} & b_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix}$$
where $(x_A, y_A)$ are the pixel coordinates in the point cloud orthographic projection image A, $(x_B, y_B)$ the pixel coordinates in the aerial orthographic stitching image B, and M is the affine transformation from the point cloud orthographic projection image A to the aerial orthographic stitching image B, in which
$$\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$$
is a composite matrix of rotation, scaling and inversion, and
$$\begin{bmatrix} b_1 \\ b_2 \end{bmatrix}$$
is a translation vector. The affine transformation model has 6 unknown parameters, which can be obtained by finding 3 pairs of non-collinear homonymous feature point coordinates.
Further, the affine transformation model is simplified as follows:
the unknown parameters are restricted by considering only rotation, translation and scaling; let $a_{11}=a_{22}=k\cos\theta$, $a_{12}=-a_{21}=k\sin\theta$, $b_1=kc_1$, $b_2=kc_2$, and the affine transformation model simplifies to the RST transformation model, expressed as follows:
$$\begin{bmatrix} x_B \\ y_B \\ 1 \end{bmatrix} = \begin{bmatrix} k\cos\theta & k\sin\theta & kc_1 \\ -k\sin\theta & k\cos\theta & kc_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix}$$
where k denotes the scaling factor, θ the rotation angle between the images, and $[c_1, c_2]^T$ the translation vector before scaling; the simplified RST transformation model has 4 unknown parameters, so the result can be solved by finding just 2 pairs of homonymous feature points.
Compared with the prior art, the invention has the following advantages and effects:
1) The method is suitable for the field of unmanned aerial vehicle surveying and mapping, and offers high stability and high precision.
2) The method does not require the laser scanner and the camera to be rigidly co-mounted or synchronously triggered, and can even use a third-party aerial image as the registration image, so it is highly flexible.
3) The method adapts well to scanned areas without obvious features, such as wastelands, grasslands and coastlines, and features low computation, high mapping efficiency and high stability.
Drawings
FIG. 1 is a flow chart of the method for fusing the registration of airborne laser point cloud and image based on image registration disclosed by the invention;
FIG. 2 is a point cloud orthographic projection model;
FIG. 3 is a mapping diagram of the relationship between the point cloud and the image coordinate.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
The embodiment discloses an airborne laser point cloud and image registration fusion method based on image registration, which is used for registration fusion of the airborne laser point cloud and aerial images and generation of true color point cloud images and comprises the following steps:
S1, perform orthographic projection on the three-dimensional laser point cloud, generating a point cloud orthographic projection image A, establishing a point cloud-pixel index, and taking the point cloud orthographic projection image A as the floating image;
the generation of the point cloud orthographic projection image A comprises sampling, quantification and interpolation, and the specific process is as follows:
S11, sampling process:
S111, select a projection plane, establish a pixel coordinate system and calculate the minimum bounding rectangle of all projected points in the projection plane. The O-XY plane of the North-East-Down (NED) coordinate system is taken as the projection plane, the upper-left corner of the bounding rectangle as the coordinate-system origin O, the due-south direction as the V axis and the due-east direction as the U axis, establishing the pixel coordinate system O-UV.
S112, calculate the variation ranges of the X and Y coordinates of the point cloud, where $Y_{\max}$ and $Y_{\min}$ denote the maximum and minimum Y-axis coordinates and $X_{\max}$ and $X_{\min}$ the maximum and minimum X-axis coordinates; with the projection image width W set manually, the pixel size S and the image height H are calculated as:
$$S = \frac{Y_{\max} - Y_{\min}}{W}, \qquad H = \left\lceil \frac{X_{\max} - X_{\min}}{S} \right\rceil$$
S113, the corresponding pixel coordinate of each point cloud under the projection image is (u, v), and the calculation formula is as follows:
$$u = \frac{Y - Y_{\min}}{S}, \qquad v = \frac{X_{\max} - X}{S}$$
S114, round the pixel coordinates with a nearest-neighbor method, assigning each point to its nearest integer pixel (u, v), and at the same time record the correspondence between point cloud coordinates and pixel coordinates to generate the point cloud-pixel index, which facilitates the subsequent back-projection query.
S12, quantization process: after sampling, the value of each pixel must be filled by quantization; quantizing with elevation generates an elevation projection image, and quantizing with laser reflection intensity generates an intensity projection image.
S13, interpolation process: for a blank pixel in the image, interpolate from the surrounding (non-blank) pixels with a high-order interpolation method, interpolate the point cloud at the same time, and add the correspondence to the point cloud-pixel index, where the high-order interpolation function is:
$$f(x) = \begin{cases} 1 - 2|x|^2 + |x|^3, & 0 \le |x| < 1 \\ 4 - 8|x| + 5|x|^2 - |x|^3, & 1 \le |x| < 2 \\ 0, & |x| \ge 2 \end{cases}$$
where | x | is a distance value from a surrounding pixel to (u, v).
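The sampling and quantization of step S1 can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: the pixel-size formula S = (Ymax - Ymin)/W along the east axis, the ceiling for H, and nearest-integer rounding are all assumptions, and the function name is made up:

```python
import numpy as np

def sample_point_cloud(points, W=4):
    """S111-S114 sketch: orthographically sample an N x 3 cloud
    (X north, Y east, Z elevation) onto a W-wide grid and keep a
    point -> pixel index for later back projection."""
    x, y, z = points.T
    S = (y.max() - y.min()) / W                    # assumed pixel size from width W
    H = int(np.ceil((x.max() - x.min()) / S))      # image height from north extent
    u = np.clip(np.rint((y - y.min()) / S), 0, W - 1).astype(int)  # east  -> U axis
    v = np.clip(np.rint((x.max() - x) / S), 0, H - 1).astype(int)  # south -> V axis
    img = np.full((H, W), np.nan)                  # blank until quantized
    index = {}                                     # point id -> pixel (S114)
    for i in range(len(points)):
        img[v[i], u[i]] = z[i]                     # quantize with elevation (S12)
        index[i] = (u[i], v[i])
    return img, index

pts = np.array([[0.0, 0.0, 10.0], [3.0, 0.0, 11.0],
                [0.0, 3.0, 12.0], [3.0, 3.0, 13.0]])
img, index = sample_point_cloud(pts, W=4)
```

Swapping `z` for a reflection-intensity column gives the intensity projection image; pixels that remain NaN are the blanks the interpolation step fills in.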
S2, distortion correction is carried out on the aerial image shot by the camera, and orthorectification and splicing processing are carried out on the corrected image to generate an aerial orthorectification spliced image B serving as a reference image;
s3, taking the point cloud orthographic projection image A as a floating image, taking the aerial orthographic stitching image B as a reference image, carrying out image registration operation on the point cloud orthographic projection image A and the aerial orthographic stitching image B, solving transformation parameters by using an image registration algorithm, and transforming the pixel coordinate of the point cloud orthographic projection image A into the coordinate system of the aerial orthographic stitching image B;
the transformation model used for image registration in the image registration algorithm is an affine transformation model, and the homogeneous equation of the affine transformation model is expressed as follows:
$$\begin{bmatrix} x_B \\ y_B \\ 1 \end{bmatrix} = M \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & b_1 \\ a_{21} & a_{22} & b_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix}$$
where $(x_A, y_A)$ are the pixel coordinates in the point cloud orthographic projection image A, $(x_B, y_B)$ the pixel coordinates in the aerial orthographic stitching image B, and M is the affine transformation from the point cloud orthographic projection image A to the aerial orthographic stitching image B, in which
$$\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$$
is a composite matrix of rotation, scaling and inversion, and
$$\begin{bmatrix} b_1 \\ b_2 \end{bmatrix}$$
is a translation vector. The affine transformation model has 6 unknown parameters, which can be solved by finding only 3 pairs of non-collinear homonymous feature point coordinates.
The affine transformation model can be simplified: restrict the unknown parameters by considering only rotation, translation and scaling, and let $a_{11}=a_{22}=k\cos\theta$, $a_{12}=-a_{21}=k\sin\theta$, $b_1=kc_1$, $b_2=kc_2$; the affine transformation model then simplifies to the RST (rotation-scaling-translation) transformation model, expressed as follows:
$$\begin{bmatrix} x_B \\ y_B \\ 1 \end{bmatrix} = \begin{bmatrix} k\cos\theta & k\sin\theta & kc_1 \\ -k\sin\theta & k\cos\theta & kc_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix}$$
where k denotes the scaling factor, θ the rotation angle between the images, and $[c_1, c_2]^T$ the translation vector before scaling; the simplified RST transformation model has only 4 unknown parameters, so it can be solved from just 2 pairs of homonymous feature points.
If a feature-based image registration method is adopted, at least 2 pairs of homonymous feature points are required.
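Because the RST model is linear in p = k·cos θ, q = k·sin θ and the translation (b1, b2), the four parameters follow from two homonymous pairs by one 4 x 4 linear solve. A minimal sketch under those definitions (function name illustrative, not from the patent):

```python
import numpy as np

def solve_rst(src, dst):
    """Recover k, theta, b1, b2 of the RST model
        x_B =  p*x_A + q*y_A + b1
        y_B = -q*x_A + p*y_A + b2
    with p = k*cos(theta), q = k*sin(theta), from two point pairs."""
    A, b = [], []
    for (xa, ya), (xb, yb) in zip(src, dst):
        A.append([xa, ya, 1.0, 0.0]); b.append(xb)   # equation for x_B
        A.append([ya, -xa, 0.0, 1.0]); b.append(yb)  # equation for y_B
    p, q, b1, b2 = np.linalg.solve(np.array(A), np.array(b))
    return np.hypot(p, q), np.arctan2(q, p), b1, b2

# Synthetic check: k = 2, theta = 30 degrees, translation (3, 4).
k0, t0 = 2.0, np.pi / 6
src = [(0.0, 0.0), (1.0, 0.0)]
dst = [(k0 * np.cos(t0) * xa + k0 * np.sin(t0) * ya + 3.0,
        -k0 * np.sin(t0) * xa + k0 * np.cos(t0) * ya + 4.0) for xa, ya in src]
k, theta, b1, b2 = solve_rst(src, dst)
```

The two pairs must not coincide, otherwise the 4 x 4 system is singular; with more pairs the same rows can be stacked and solved by least squares.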
S4, performing back projection on the pixels of the point cloud orthoscopic projection image A by using the point cloud-pixel index established in the projection process, finding out the pixel coordinates of the aerial orthoscopic mosaic image B corresponding to each point cloud, and assigning the color value of the aerial orthoscopic mosaic image B to the corresponding point cloud to generate a true color point cloud picture.
According to the point cloud-pixel index relationship of step S1 and the coordinate correspondence from the point cloud orthographic projection image to the aerial orthographic stitching image obtained in step S3, the pixel of the aerial orthographic stitching image B corresponding to each three-dimensional point cloud coordinate is found, the pixel value is assigned to the corresponding point, and finally a true color point cloud picture is generated.
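Combined, steps S3 and S4 amount to one affine multiply and one color copy per point. A minimal sketch with illustrative names (a real implementation would also bounds-check the transformed pixel against image B):

```python
import numpy as np

def colorize(n_points, index, M, image_b):
    """S4 sketch: send each point's projection pixel (u, v) through the
    affine transform M and copy the color found in the stitched image B."""
    colors = np.zeros((n_points, 3), dtype=image_b.dtype)
    for i, (u, v) in index.items():
        xb, yb, _ = M @ np.array([u, v, 1.0])
        colors[i] = image_b[int(round(yb)), int(round(xb))]  # row = y, col = x
    return colors

# Identity transform on a tiny labelled image: each point picks up the
# color stored at its own pixel.
image_b = np.arange(2 * 2 * 3).reshape(2, 2, 3)
index = {0: (0, 0), 1: (1, 1)}
colors = colorize(2, index, np.eye(3), image_b)
```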
Example two
The embodiment discloses a specific implementation of an airborne laser point cloud and image registration fusion method based on image registration, and the steps of an algorithm based on image registration are shown in fig. 1.
A small unmanned helicopter flying at low altitude is disturbed by airflow and engine vibration, which destabilizes its attitude; the resulting changes in shooting angle cause image shake and distortion. In addition, the external high-resolution camera usually uses a wide-angle lens to capture large-scale landforms, which also introduces geometric distortion into the aerial images. Distortion shifts pixels from their correct geometric positions; a straight line in space, for example, may appear as a curve in the image. Distortion calibration restores each pixel to an approximately correct geometric relationship. According to the actual imaging model of the camera:
$$\begin{cases} x_i = x_r\,(1 + k_1 r^2 + k_2 r^4) + 2p_1 x_r y_r + p_2\,(r^2 + 2x_r^2) \\ y_i = y_r\,(1 + k_1 r^2 + k_2 r^4) + p_1\,(r^2 + 2y_r^2) + 2p_2 x_r y_r \\ u = u_0 + s_x x_i, \qquad v = v_0 + s_y y_i \end{cases}$$
where $(x_i, y_i)$ are the image coordinates of the ideal projection point in the ideal imaging model, $(x_r, y_r)$ the image coordinates of the actual projection point in the actual imaging model, $(u_0, v_0)$ the coordinates of the image coordinate system origin in the pixel coordinate system, $s_x$ and $s_y$ the scale factors of the horizontal and vertical image axes, $k_1$ and $k_2$ the radial distortion coefficients, $r = \sqrt{x_r^2 + y_r^2}$ the distance from the pixel point to the center of the image plane, and $p_1$ and $p_2$ the tangential distortion coefficients.
With the camera focal length fixed, substituting the internal parameters and distortion coefficients obtained from camera calibration into the above formula yields the ideal coordinate position of each pixel in the corrected image. Because image pixel coordinates are integers in practice while the computed coordinates generally are not, bilinear interpolation is used to resample the distorted image during correction. After distortion correction, the stitched orthoimage is generated with PhotoScan, a post-processing and stitching package for UAV aerial photos.
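Since the patent reproduces its imaging model only as a figure, the sketch below assumes the usual radial (k1, k2) plus tangential (p1, p2) coefficient conventions, and writes out the bilinear resampling explicitly; all names are illustrative:

```python
import numpy as np

def distort(xi, yi, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Map an ideal image point to its distorted location under the
    assumed radial + tangential model."""
    r2 = xi * xi + yi * yi
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xr = xi * radial + 2.0 * p1 * xi * yi + p2 * (r2 + 2.0 * xi * xi)
    yr = yi * radial + p1 * (r2 + 2.0 * yi * yi) + 2.0 * p2 * xi * yi
    return xr, yr

def bilinear(img, x, y):
    """Sample img at the non-integer position the model produces."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * img[y0, x0] + fx * (1 - fy) * img[y0, x0 + 1] +
            (1 - fx) * fy * img[y0 + 1, x0] + fx * fy * img[y0 + 1, x0 + 1])

img = np.arange(16, dtype=float).reshape(4, 4)  # img[y, x] = 4*y + x
```

Correcting an image then loops over ideal pixel positions, maps each through `distort`, and fills the corrected image with `bilinear` samples from the raw photo.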
As shown in the point cloud orthographic projection model diagram of fig. 2, a point cloud elevation orthographic projection may be generated by orthographically projecting a point cloud and taking the elevation as a quantization object.
Both the stitched orthoimage and the point cloud orthographic projection image express the accurate geometric relationships of terrain and ground objects, so an affine transformation relation holds between the two images. In the experiment, the point cloud orthographic projection image A is taken as the floating image and the aerial orthographic stitching image B as the reference image, and an affine transformation between them is established; affine transformation realizes rotation, translation, scaling and inversion of the images. A pixel coordinate $(x_A, y_A)$ in the point cloud orthographic projection image A is mapped by the affine transformation M to the pixel coordinate $(x_B, y_B)$ in the aerial orthographic stitching image B. In homogeneous form the affine transformation model can be expressed as
$$\begin{bmatrix} x_B \\ y_B \\ 1 \end{bmatrix} = M \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} & b_1 \\ a_{21} & a_{22} & b_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_A \\ y_A \\ 1 \end{bmatrix}$$
and the transformation parameters are solved from the linear system stacked over the 3 homonymous point pairs $(x_{A_i}, y_{A_i}) \leftrightarrow (x_{B_i}, y_{B_i})$:

$$\begin{bmatrix} x_{B_i} \\ y_{B_i} \end{bmatrix} = \begin{bmatrix} x_{A_i} & y_{A_i} & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & x_{A_i} & y_{A_i} & 1 \end{bmatrix} \begin{bmatrix} a_{11} \\ a_{12} \\ b_1 \\ a_{21} \\ a_{22} \\ b_2 \end{bmatrix}, \qquad i = 1, 2, 3$$
Feature-based image registration generally comprises four steps: feature extraction, feature matching, transformation-model parameter estimation, and image transformation; the extraction accuracy of the features determines the accuracy of the registration. Depending on whether manual participation is needed, feature extraction and matching can be divided into manual matching and automatic matching. For images with many features, a semi-automatic registration method based on corner features can be adopted: first apply morphological noise reduction to the two images, then extract edges with the Sobel operator, then extract candidate homonymous points in the edge images with the Harris corner detection algorithm, and finally select manually among the candidates. For images without obvious feature ground objects, registration can be achieved by manually selecting feature points and computing the transformation parameters. In this embodiment, three pairs of homonymous feature points are extracted manually and the transformation parameters are solved from the formula.
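With three homonymous pairs in hand, the six affine parameters follow from one 6 x 6 linear solve; a minimal sketch with illustrative names (the pairs must be non-collinear or the system is singular):

```python
import numpy as np

def solve_affine(src, dst):
    """Six affine parameters a11, a12, b1, a21, a22, b2 from three
    non-collinear homonymous point pairs src[i] -> dst[i]."""
    A, b = [], []
    for (xa, ya), (xb, yb) in zip(src, dst):
        A.append([xa, ya, 1.0, 0.0, 0.0, 0.0]); b.append(xb)  # x_B row
        A.append([0.0, 0.0, 0.0, xa, ya, 1.0]); b.append(yb)  # y_B row
    a11, a12, b1, a21, a22, b2 = np.linalg.solve(np.array(A), np.array(b))
    return np.array([[a11, a12, b1], [a21, a22, b2], [0.0, 0.0, 1.0]])

# Recover a known transform from its action on three points.
M_true = np.array([[2.0, 0.0, 1.0], [0.0, 3.0, -2.0], [0.0, 0.0, 1.0]])
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [tuple((M_true @ np.array([x, y, 1.0]))[:2]) for x, y in src]
M = solve_affine(src, dst)
```

With more than three pairs the same rows can be stacked and solved by least squares, which averages out small errors in the manually picked points.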
Indexes of point cloud coordinates and projected image pixel coordinates are established in the projection process of the three-dimensional point cloud, and the mapping relation from the projected image pixel coordinates to the aerial image pixel coordinates is obtained in the image registration process, so that the mapping relation from each three-dimensional point cloud coordinate to the aerial image pixel coordinates can be obtained. And finally, assigning the color value of the aerial image pixel to the corresponding point cloud, so as to realize the fusion of the point cloud and the image. The coordinate mapping relationship between the point cloud and the aerial image is shown in fig. 3.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (3)

1. An airborne laser point cloud and image registration fusion method based on image registration is characterized by comprising the following steps:
S1, performing orthographic projection on the three-dimensional laser point cloud, generating a point cloud orthographic projection image A and establishing a point cloud-pixel index, wherein the process of generating the point cloud orthographic projection image A comprises:
S11, a sampling process, comprising:
S111, selecting a projection plane, establishing a pixel coordinate system and calculating the minimum bounding rectangle of all projected points in the projection plane: taking the O-XY plane of the north-east-down coordinate system as the projection plane, the upper-left corner of the bounding rectangle as the coordinate-system origin O, the due-south direction as the V axis and the due-east direction as the U axis, and establishing the pixel coordinate system O-UV;
S112, calculating the variation ranges of the X and Y coordinates of the point cloud, where $Y_{\max}$ and $Y_{\min}$ denote the maximum and minimum Y-axis coordinates and $X_{\max}$ and $X_{\min}$ the maximum and minimum X-axis coordinates, and, with the projection image width W set manually, calculating the pixel size S and the image height H as:
$$S = \frac{Y_{\max} - Y_{\min}}{W}, \qquad H = \left\lceil \frac{X_{\max} - X_{\min}}{S} \right\rceil$$
S113, the corresponding pixel coordinate of each point cloud under the projection image is (u, v), and the calculation formula is as follows:
$$u = \frac{Y - Y_{\min}}{S}, \qquad v = \frac{X_{\max} - X}{S}$$
S114, rounding the pixel coordinates with a nearest-neighbor method, assigning each point to its nearest integer pixel (u, v), and at the same time recording the correspondence between point cloud coordinates and pixel coordinates to generate a point cloud-pixel index;
S12, a quantization process, which specifically comprises:
filling the value of each pixel by quantization, selecting elevation for quantization to generate an elevation projection image, or selecting laser reflection intensity for quantization to generate an intensity projection image;
S13, an interpolation process, which specifically comprises:
for a blank pixel in the image, interpolating from the surrounding pixels with a high-order interpolation method, interpolating the point cloud at the same time, and adding the correspondence to the point cloud-pixel index, wherein the high-order interpolation function is:
$$f(x) = \begin{cases} 1 - 2|x|^2 + |x|^3, & 0 \le |x| < 1 \\ 4 - 8|x| + 5|x|^2 - |x|^3, & 1 \le |x| < 2 \\ 0, & |x| \ge 2 \end{cases}$$
wherein | x | is a distance value from a surrounding pixel to (u, v);
s2, distortion correction is carried out on the aerial image shot by the camera, and orthoscopic and mosaic processing is carried out on the corrected image to generate an aerial orthoscopic mosaic image B;
s3, taking the point cloud orthographic projection image A as a floating image, taking the aerial orthographic stitching image B as a reference image, carrying out image registration operation on the point cloud orthographic projection image A and the aerial orthographic stitching image B, solving transformation parameters by using an image registration algorithm, and transforming the pixel coordinate of the point cloud orthographic projection image A into the coordinate system of the aerial orthographic stitching image B;
S4, back-projecting the pixels of the point cloud orthographic projection image A through the point-cloud-to-pixel index established during projection, finding the pixel coordinates in the aerial orthographic mosaic image B corresponding to each point of the cloud, and assigning the color values of B to the corresponding points to generate a true-color point cloud.
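Step S4 (back-projection and coloring) can be sketched as follows; the helper below is hypothetical and assumes the registration result is available as a 3x3 homogeneous matrix mapping pixel coordinates of A into B, and that image B is an H x W x 3 RGB array:

```python
import numpy as np

def colorize_point_cloud(points, index, affine, image_b):
    """Assign RGB values from ortho-mosaic B to each point (step S4).

    points:  N x 3 array (X, Y, Z)
    index:   point-cloud-to-pixel index {point_id: (u, v)} from projection
    affine:  3x3 homogeneous transform from A's pixel frame into B's
    image_b: H x W x 3 RGB array (the aerial orthographic mosaic)
    """
    colored = np.zeros((len(points), 6))   # columns: X, Y, Z, R, G, B
    colored[:, :3] = points[:, :3]
    h, w = image_b.shape[:2]
    for i, (u, v) in index.items():
        xb, yb, _ = affine @ np.array([u, v, 1.0])  # map A's pixel into B
        col, row = int(round(xb)), int(round(yb))
        if 0 <= row < h and 0 <= col < w:           # skip points outside B
            colored[i, 3:] = image_b[row, col]
    return colored
```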
2. The image registration-based airborne laser point cloud and image registration fusion method of claim 1, wherein the transformation model used for image registration in the image registration algorithm of step S3 is an affine transformation model, and the homogeneous equation of the affine transformation model is expressed as follows:
[xB]   [a11  a12  b1] [xA]
[yB] = [a21  a22  b2] [yA]
[1 ]   [0    0    1 ] [1 ]
wherein (xA, yA) are the pixel coordinates in the point cloud orthographic projection image A, (xB, yB) are the pixel coordinates in the aerial orthographic mosaic image B, and M is the affine transformation from A to B, in which the submatrix

[a11  a12]
[a21  a22]

is a composite matrix of rotation, scaling and inversion, and [b1, b2]^T is the translation vector.
The affine transformation model has 6 unknown parameters, which can be solved by finding 3 pairs of non-collinear homonymous (same-name) feature points.
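Solving the 6 affine parameters from 3 (or more) non-collinear point pairs reduces to a linear least-squares problem; the following sketch is illustrative (the linear-system layout and the use of NumPy are assumptions, not the patent's prescribed solver):

```python
import numpy as np

def solve_affine(src, dst):
    """Solve the 6 affine parameters from >= 3 non-collinear point pairs.

    Each pair contributes two equations:
        xB = a11*xA + a12*yA + b1
        yB = a21*xA + a22*yA + b2
    Returns the 3x3 homogeneous matrix M.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    A = np.zeros((2 * n, 6))        # unknowns: a11 a12 b1 a21 a22 b2
    A[0::2, 0:2] = src
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src
    A[1::2, 5] = 1.0
    b = dst.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([[params[0], params[1], params[2]],
                     [params[3], params[4], params[5]],
                     [0.0, 0.0, 1.0]])
```

With exactly 3 non-collinear pairs the system is square and the least-squares solution is exact; extra pairs are averaged in the least-squares sense.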
3. The image registration-based airborne laser point cloud and image registration fusion method of claim 2, wherein the affine transformation model is simplified as follows:
the unknown parameters are constrained by considering only rotation, translation and scaling: let
a11 = a22 = k*cos(theta), a12 = -a21 = k*sin(theta), b1 = k*c1, b2 = k*c2. The affine transformation model then simplifies to the RST transformation model, expressed as follows:
[xB]       [ cos(theta)  sin(theta)] [xA]       [c1]
[yB] = k * [-sin(theta)  cos(theta)] [yA] + k * [c2]
where k denotes the scale factor, theta the rotation angle between the images, and [c1, c2]^T the translation vector before scaling; the simplified RST transformation model has 4 unknown parameters, which can be solved by finding 2 pairs of homonymous feature points.
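Recovering the 4 RST parameters from 2 point pairs is likewise a linear problem if one first solves for p = k*cos(theta), q = k*sin(theta) and the scaled translation t = k*[c1, c2]^T; the sketch below uses those substitutions (an illustrative solver, not one prescribed by the patent):

```python
import math
import numpy as np

def solve_rst(src, dst):
    """Recover (k, theta, t1, t2) of the RST model from 2 point pairs.

    Substituting p = k*cos(theta), q = k*sin(theta), t = k*[c1, c2] makes
    the model linear:
        xB = p*xA + q*yA + t1
        yB = -q*xA + p*yA + t2
    """
    (x1, y1), (x2, y2) = src
    (X1, Y1), (X2, Y2) = dst
    A = np.array([[x1,  y1, 1, 0],     # unknowns: p, q, t1, t2
                  [y1, -x1, 0, 1],
                  [x2,  y2, 1, 0],
                  [y2, -x2, 0, 1]], float)
    b = np.array([X1, Y1, X2, Y2], float)
    p, q, t1, t2 = np.linalg.solve(A, b)
    k = math.hypot(p, q)               # k = sqrt(p^2 + q^2)
    theta = math.atan2(q, p)
    return k, theta, t1, t2
```

The returned t1, t2 equal k*c1 and k*c2, matching the substitution b1 = k*c1, b2 = k*c2 in the claim.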
CN201710422435.9A 2017-06-07 2017-06-07 Airborne laser point cloud and image registration fusion method based on image registration Active CN107316325B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710422435.9A CN107316325B (en) 2017-06-07 2017-06-07 Airborne laser point cloud and image registration fusion method based on image registration


Publications (2)

Publication Number Publication Date
CN107316325A CN107316325A (en) 2017-11-03
CN107316325B true CN107316325B (en) 2020-09-22

Family

ID=60182199

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710422435.9A Active CN107316325B (en) 2017-06-07 2017-06-07 Airborne laser point cloud and image registration fusion method based on image registration

Country Status (1)

Country Link
CN (1) CN107316325B (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109813335B (en) * 2017-11-21 2021-02-09 武汉四维图新科技有限公司 Calibration method, device and system of data acquisition system and storage medium
CN109076173A (en) * 2017-11-21 2018-12-21 深圳市大疆创新科技有限公司 Image output generation method, equipment and unmanned plane
CN108317953A (en) * 2018-01-19 2018-07-24 东北电力大学 A kind of binocular vision target surface 3D detection methods and system based on unmanned plane
CN108198223B (en) * 2018-01-29 2020-04-07 清华大学 Method for quickly and accurately calibrating mapping relation between laser point cloud and visual image
CN110136178B (en) * 2018-02-08 2021-06-25 中国人民解放军战略支援部队信息工程大学 Three-dimensional laser point cloud registration method and device based on endpoint fitting
CN108596837B (en) * 2018-05-09 2022-06-10 北京玖瑞科技有限公司 Image splicing method, device, equipment and computer medium
CN108682028A (en) * 2018-05-16 2018-10-19 陈年康 Laser point cloud based on radiation correcting and optical image automatic matching method
CN110148196B (en) * 2018-09-12 2022-03-25 腾讯大地通途(北京)科技有限公司 Image processing method and device and related equipment
CN109472752B (en) * 2018-10-30 2022-05-03 北京工业大学 Multi-exposure fusion system based on aerial images
US10353073B1 (en) * 2019-01-11 2019-07-16 Nurulize, Inc. Point cloud colorization system with real-time 3D visualization
CN111819602A (en) * 2019-02-02 2020-10-23 深圳市大疆创新科技有限公司 Method for increasing point cloud sampling density, point cloud scanning system and readable storage medium
CN109993696B (en) * 2019-03-15 2022-11-25 广州愿托科技有限公司 Multi-viewpoint image-based correction and splicing method for structural object surface panoramic image
CN109945853B (en) * 2019-03-26 2023-08-15 西安因诺航空科技有限公司 Geographic coordinate positioning system and method based on 3D point cloud aerial image
CN109993793B (en) * 2019-03-29 2021-09-07 北京易达图灵科技有限公司 Visual positioning method and device
CN110111414B (en) * 2019-04-10 2023-01-06 北京建筑大学 Orthographic image generation method based on three-dimensional laser point cloud
CN110120013B (en) * 2019-05-15 2023-10-20 深圳市凌云视迅科技有限责任公司 Point cloud splicing method and device
CN110378199B (en) * 2019-06-03 2021-08-06 北京北科安地科技发展有限公司 Rock-soil body displacement monitoring method based on multi-period images of unmanned aerial vehicle
CN110223389B (en) * 2019-06-11 2021-05-04 中国科学院自动化研究所 Scene modeling method, system and device fusing image and laser data
CN110502839B (en) * 2019-08-23 2023-05-02 中铁第六勘察设计院集团有限公司 GIS (geographic information System) coordinate and CAD (computer aided design) coordinate conversion method based on BIM (building information modeling) platform
CN110596729A (en) * 2019-09-12 2019-12-20 北京京东乾石科技有限公司 Laser scanner and autopilot car
CN110827202A (en) * 2019-11-07 2020-02-21 上海眼控科技股份有限公司 Target detection method, target detection device, computer equipment and storage medium
CN110880202B (en) * 2019-12-02 2023-03-21 中电科特种飞机系统工程有限公司 Three-dimensional terrain model creating method, device, equipment and storage medium
CN111091595B (en) * 2019-12-23 2023-06-02 吉林省广播电视研究所(吉林省广播电视局科技信息中心) Strabismus three-dimensional mapping method and system
CN111561949B (en) * 2020-06-06 2023-05-05 北京依锐思遥感技术有限公司 Coordinate matching method of airborne laser radar and hyperspectral imager integrated machine
CN111798402B (en) * 2020-06-09 2024-02-27 同济大学 Power equipment temperature measurement data visualization method and system based on three-dimensional point cloud model
CN111968161A (en) * 2020-07-28 2020-11-20 北京恒通智控机器人科技有限公司 Registration method, device and equipment for three-dimensional laser point cloud and panoramic image
US20220051422A1 (en) * 2020-08-12 2022-02-17 Faro Technologies, Inc. Laser scanner with ultrawide-angle lens camera for registration
CN111784585B (en) * 2020-09-07 2020-12-15 成都纵横自动化技术股份有限公司 Image splicing method and device, electronic equipment and computer readable storage medium
CN114326775B (en) * 2020-09-29 2024-05-28 北京机械设备研究所 Unmanned aerial vehicle system based on thing networking
CN112130151B (en) * 2020-10-16 2022-07-08 中国有色金属长沙勘察设计研究院有限公司 Arc synthetic aperture ground radar coordinate projection rapid calculation method
CN112581505B (en) * 2020-12-24 2022-06-03 天津师范大学 Simple automatic registration method for laser radar point cloud and optical image
CN113793370B (en) * 2021-01-13 2024-04-19 北京京东叁佰陆拾度电子商务有限公司 Three-dimensional point cloud registration method and device, electronic equipment and readable medium
CN112802073B (en) * 2021-04-08 2021-07-06 之江实验室 Fusion registration method based on image data and point cloud data
CN113192182A (en) * 2021-04-29 2021-07-30 山东产研信息与人工智能融合研究院有限公司 Multi-sensor-based live-action reconstruction method and system
CN113643338A (en) * 2021-08-13 2021-11-12 亿嘉和科技股份有限公司 Texture image target positioning method based on fusion affine transformation
CN113688900A (en) * 2021-08-23 2021-11-23 阿波罗智联(北京)科技有限公司 Radar and visual data fusion processing method, road side equipment and intelligent traffic system
CN114677454B (en) * 2022-03-25 2022-10-04 杭州睿影科技有限公司 Image generation method and device
CN115810078A (en) * 2022-11-22 2023-03-17 武汉际上导航科技有限公司 Method for coloring laser point cloud based on POS data and airborne visible light image
CN116758006B (en) * 2023-05-18 2024-02-06 广州广检建设工程检测中心有限公司 Scaffold quality detection method and device
CN118348015A (en) * 2024-05-16 2024-07-16 山东大学 Self-moving subway tunnel structure apparent defect detection equipment and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102411778A (en) * 2011-07-28 2012-04-11 武汉大学 Automatic registration method of airborne laser point cloud and aerial image
CN103017739A (en) * 2012-11-20 2013-04-03 武汉大学 Manufacturing method of true digital ortho map (TDOM) based on light detection and ranging (LiDAR) point cloud and aerial image
CN104268935A (en) * 2014-09-18 2015-01-07 华南理工大学 Feature-based airborne laser point cloud and image data fusion system and method
CN105423915A (en) * 2015-11-16 2016-03-23 天津师范大学 Accurate positioning method of planar target for ground laser scanning data registration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201022708A (en) * 2008-12-11 2010-06-16 Univ Nat Central Method of change detection for building models



Similar Documents

Publication Publication Date Title
CN107316325B (en) Airborne laser point cloud and image registration fusion method based on image registration
KR100912715B1 (en) Method and apparatus of digital photogrammetry by integrated modeling for different types of sensors
Teller et al. Calibrated, registered images of an extended urban area
KR101533182B1 (en) 3d streets
CN110246221B (en) Method and device for obtaining true shot image
KR100671529B1 (en) Drawing method of three-dimensional cubic map using multi aerial photograph image
CN110319772B (en) Visual large-span distance measurement method based on unmanned aerial vehicle
KR102200299B1 (en) A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof
JP3776787B2 (en) 3D database generation system
JP2002157576A (en) Device and method for processing stereo image and recording medium for recording stereo image processing program
KR20100087034A (en) Method and apparatus of taking aerial surveys
CN111693025B (en) Remote sensing image data generation method, system and equipment
CN113192193A (en) High-voltage transmission line corridor three-dimensional reconstruction method based on Cesium three-dimensional earth frame
JP2023505891A (en) Methods for measuring environmental topography
CN112461204B (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
WO2024098428A1 (en) Registration method and system
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
CN110986888A (en) Aerial photography integrated method
KR100671504B1 (en) Method for correcting of aerial photograph image using multi photograph image
Sai et al. Geometric accuracy assessments of orthophoto production from uav aerial images
Burkhard et al. Stereovision mobile mapping: System design and performance evaluation
CN114544006B (en) Low-altitude remote sensing image correction system and method based on ambient illumination condition
CN107941241B (en) Resolution board for aerial photogrammetry quality evaluation and use method thereof
Abdullah et al. Camera Calibration Performance on Different Non-metric Cameras.
Deng et al. Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant