CN109727278B - Automatic registration method for airborne LiDAR point cloud data and aerial image

Info

Publication number
CN109727278B
CN109727278B (application CN201811651300.0A)
Authority
CN
China
Prior art keywords
point
image
points
current processing
reliable
Prior art date
Legal status
Active
Application number
CN201811651300.0A
Other languages
Chinese (zh)
Other versions
CN109727278A (en)
Inventor
梁菲
王慧芳
王铮尧
Current Assignee
Aerial Photogrammetry and Remote Sensing Co Ltd
Original Assignee
Aerial Photogrammetry and Remote Sensing Co Ltd
Priority date
Filing date
Publication date
Application filed by Aerial Photogrammetry and Remote Sensing Co Ltd filed Critical Aerial Photogrammetry and Remote Sensing Co Ltd
Priority to CN201811651300.0A
Publication of CN109727278A
Application granted
Publication of CN109727278B

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an automatic registration method for airborne LiDAR point cloud data and aerial images, which comprises the following steps: first, point cloud data and aerial image acquisition and field measurement of control points; second, image preprocessing; third, image matching; fourth, acquisition of control point photo coordinates; fifth, initial beam method area network adjustment (bundle block adjustment) and exterior orientation element update; sixth, matching of the point cloud data with aerial image feature points; seventh, calculation of the feature point photo coordinates of the images to be processed; eighth, image matching; ninth, set update; tenth, beam method area network adjustment and exterior orientation element update; and eleventh, automatic registration end judgment. The method matches feature points extracted from the aerial images against the point cloud data to achieve automatic registration of airborne LiDAR point cloud data with aerial images; it effectively improves registration accuracy with a small amount of computation, and, by incorporating the field-measured three-dimensional coordinates of a plurality of ground control points into the registration, it can be effectively applied to the automatic registration of point cloud data and aerial images over different terrains.

Description

Automatic registration method for airborne LiDAR point cloud data and aerial image
Technical Field
The invention belongs to the technical field of aerial photogrammetry, and particularly relates to an automatic registration method for airborne LiDAR point cloud data and aerial images.
Background
Airborne LiDAR (Light Detection and Ranging) technology can quickly acquire a high-precision digital surface model (DSM), which after data processing is widely applied in fields such as electric power, forestry, and digital cities. The acquired data consist mainly of discrete laser point clouds (also called LiDAR data, airborne LiDAR data, LiDAR point cloud data, or point cloud data), which carry high-precision spatial geometric information but lack spectral information, so the spectral properties of ground objects cannot be obtained from them. Conversely, traditional image data provide rich semantic information. Using LiDAR data and images together can therefore effectively improve the accuracy of remote sensing data products and expand the depth and breadth of remote sensing applications, making this an important research direction in the current remote sensing field.
The key step in jointly exploiting images and point cloud data is registering them with high precision, and the core of registration is image matching. Image matching means identifying homonymous (conjugate) points between two sets of two-dimensional data with a matching algorithm; the identified homonymous points are also called image connection points. Image matching is used in many fields, such as photogrammetric processing, digital image processing, medical image processing, and remote sensing image processing, and different applications place different requirements on it. Registration of point cloud data with images, however, is registration between two-dimensional and three-dimensional data, which differs from traditional two-dimensional matching: because the image is two-dimensional and the point cloud is three-dimensional, the point cloud must be interpolated during registration, which lowers matching accuracy. To achieve a high-precision registration result, the matching points must be accurate to within 1 pixel. In practice, matching algorithms are affected by image noise, rotation, and deformation; in aerial photography especially, topographic relief, the central-projection imaging mode, and the flight environment cause large image rotation and deformation, so existing registration methods easily yield low accuracy during automatic processing, and a high-accuracy registration model is needed. Existing 3D-3D registration models additionally suffer from complicated mathematical expression and computation and from a lack of rigor.
Disclosure of Invention
The technical problem to be solved by the invention is to provide, in view of the above deficiencies in the prior art, an automatic registration method for airborne LiDAR point cloud data and aerial images. The method has simple steps, a reasonable design, convenient implementation, and good practical effect: feature points extracted from the aerial images are matched against the point cloud data to register airborne LiDAR point cloud data with aerial images automatically, which effectively improves registration accuracy with a small amount of computation; and because registration incorporates the field-measured three-dimensional coordinates of a plurality of ground control points, the method can be effectively applied to the automatic registration of point cloud data and aerial images over different terrains.
In order to solve the technical problems, the invention adopts the technical scheme that: an automatic registration method for airborne LiDAR point cloud data and aerial images is characterized by comprising the following steps:
the method comprises the following steps of firstly, point cloud data and aerial image acquisition and control point field actual measurement: acquiring point cloud data of an area to be measured by adopting an airborne LiDAR measuring system, and transmitting the acquired point cloud data to data processing equipment; the point cloud data comprises a plurality of measuring points of a region to be measured and three-dimensional coordinates of each measuring point;
meanwhile, a plurality of ground control points are distributed in a measured area, and field actual measurement is carried out on the three-dimensional coordinates of each ground control point to obtain the actual measurement three-dimensional coordinates of each ground control point; then, carrying out aerial photogrammetry on the measured area by utilizing the distributed ground control points, shooting a plurality of aerial photographic images of the measured area, and synchronously transmitting the obtained plurality of aerial photographic images to the data processing equipment; each aerial photographic image is a digital image and is a two-dimensional image;
when the aerial photogrammetry is carried out on the measured area, the external orientation element of each aerial photographic image is obtained, and the obtained external orientation element of each aerial photographic image is synchronously transmitted to the data processing equipment; in the step, the exterior orientation elements of the aerial photographic images are the initial exterior orientation elements of the aerial photographic images;
step two, image preprocessing: denoising and filtering the plurality of aerial photographic images in the step one by adopting the data processing equipment to obtain a plurality of preprocessed aerial photographic images;
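As an illustration of step two, the sketch below shows one plausible preprocessing pass. The patent does not name specific denoising or filtering operators; OpenCV with a Gaussian blur followed by a median filter is assumed here purely for demonstration.

```python
import cv2

def preprocess_image(path):
    """Denoise and filter one aerial photographic image (step two sketch).

    The concrete filters are not specified by the patent; Gaussian plus
    median filtering is an assumed, illustrative choice.
    """
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.GaussianBlur(img, (5, 5), 1.0)  # suppress Gaussian sensor noise
    img = cv2.medianBlur(img, 3)              # remove isolated salt-and-pepper noise
    return img
```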
step three, image matching: adopting the data processing equipment and calling an image matching module to perform image matching on the plurality of aerial photographic images preprocessed in the step two to obtain all feature points matched with each other among the plurality of aerial photographic images; the obtained characteristic points are image connection points which are matched with the plurality of aerial photographic images;
in the step, all the feature points on each acquired aerial photographic image form a feature point set of the aerial photographic image, and the feature point set of each aerial photographic image comprises the image coordinates of all the feature points on the aerial photographic image obtained by matching in the step;
step four, acquiring coordinates of the control point photo: according to the actually measured three-dimensional coordinates of the ground control points in the first step, respectively calculating the photo coordinates of the aerial photographic images preprocessed by the ground control points in the second step by adopting the data processing equipment and calling a photo coordinate calculation module to obtain the photo coordinates of the ground control points on the aerial photographic images; adding the photo coordinates of the plurality of ground control points on each aerial photographic image into the feature point set of the aerial photographic image in the third step to obtain a complete feature point set of each aerial photographic image;
step five, initial beam method area network adjustment and exterior orientation element update: according to the initial exterior orientation elements of each aerial photographic image from step one, the complete feature point set of each aerial photographic image from step four, and the measured three-dimensional coordinates of the plurality of ground control points from step one, the data processing equipment calls a beam method area network adjustment module to carry out the beam method area network adjustment, obtaining the adjusted exterior orientation elements of each aerial photographic image; a data updating module is then called to update the exterior orientation elements of each aerial photographic image to the adjusted exterior orientation elements obtained at this time;
in the first step, the measured three-dimensional coordinates of a plurality of ground control points form a control point set;
step six, matching the point cloud data with the aerial image feature points: using the data processing equipment, the point cloud data from step one are matched with any one of the aerial photographic images preprocessed in step two; this aerial photographic image is the reference image, and all remaining aerial photographic images preprocessed in step two, other than the reference image, are the images to be processed;
when the point cloud data and the reference image are matched, the process is as follows:
step 601, constructing a triangulation network: calling a triangulation network construction module to construct a triangulation network according to the point cloud data in the first step; the constructed triangulation network is a point cloud triangulation network;
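A minimal sketch of the step 601 triangulation, assuming SciPy's Delaunay triangulation applied to the horizontal coordinates of the LiDAR measuring points (the patent does not prescribe a particular triangulation algorithm):

```python
import numpy as np
from scipy.spatial import Delaunay

def build_point_cloud_tin(points_xyz):
    """Construct the point cloud triangulation network of step 601.

    points_xyz: (N, 3) array of measuring points (X, Y, Z). The network is
    built in the horizontal plane, as is usual for a terrain TIN, and the
    elevations are kept for the later interpolation steps.
    """
    points_xyz = np.asarray(points_xyz, dtype=float)
    tin = Delaunay(points_xyz[:, :2])
    return tin, points_xyz

# tin.find_simplex((x, y)) later returns the index of the triangle that
# contains a ground point, or -1 if the point falls outside the network.
```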
step 602, extracting a Harris corner: calling a Harris angular point detection module to extract characteristic points of the aerial photographic image, and recording the photo coordinates of the extracted characteristic points; the extracted characteristic points are Harris angular points;
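Step 602 can be realized with any standard Harris detector; the sketch below assumes OpenCV, using goodFeaturesToTrack with the Harris response so the returned pixel positions stand in for the recorded photo coordinates:

```python
import cv2
import numpy as np

def extract_harris_corners(gray, max_corners=2000):
    """Extract Harris corner feature points of a preprocessed image (step 602).

    Parameter values (quality level, minimum distance, k) are illustrative
    assumptions, not values given by the patent.
    """
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=max_corners, qualityLevel=0.01,
        minDistance=10, useHarrisDetector=True, k=0.04)
    if corners is None:
        return np.empty((0, 2))
    return corners.reshape(-1, 2)  # (x, y) photo/pixel coordinates
```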
step 603, determining three-dimensional coordinates of an image Harris corner point and a ground control point based on a triangulation network: respectively determining three-dimensional coordinates of all Harris angular points of the aerial photographic image in the step 602 and a plurality of ground control points in the step four according to the point cloud triangulation network in the step 601;
when three-dimensional coordinates of all Harris angular points of the aerial photographic image are determined, the three-dimensional coordinates of all Harris angular points are respectively determined according to the photo coordinates of all Harris angular points on the aerial photographic image, and three-dimensional coordinates of a plurality of reliable Harris angular points are obtained;
when the three-dimensional coordinates of any Harris angular point on the aerial photographic image are determined, the Harris angular point is a current processing point, and the process is as follows:
step A1, effective judgment of processing points: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes in the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering the step A2; otherwise, discarding the current processed point;
TL is a preset triangle side length judgment threshold; TH is a preset triangle top point height difference judgment threshold; the effective processing point is the reliable Harris angular point;
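The triangle test of steps A1 and B1 can be written directly from the two thresholds; a minimal sketch (side lengths taken in the horizontal plane, an assumption, since the patent does not state whether slant or horizontal lengths are meant):

```python
import numpy as np

def is_reliable_triangle(v, TL=3.0, TH=1.0):
    """Steps A1/B1 triangle judgment: every side shorter than TL and every
    pairwise vertex elevation difference smaller than TH.

    v: (3, 3) array of the triangle's vertices (X, Y, Z). TL = 3 m and
    TH = 1 m are the preferred values stated later in the patent.
    """
    for i in range(3):
        a, b = v[i], v[(i + 1) % 3]
        if np.hypot(a[0] - b[0], a[1] - b[1]) >= TL:  # side length check
            return False
        if abs(a[2] - b[2]) >= TH:                    # vertex elevation difference
            return False
    return True
```

A point whose enclosing triangle fails this test is discarded rather than interpolated, which keeps unreliable interpolation (for example across building edges or vegetation gaps) out of the registration.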
step A2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step A1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step A1 to obtain a three-dimensional coordinate of the current processing point;
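Step A2 interpolates the elevation from the enclosing triangle; linear (barycentric) interpolation is assumed here, as the patent does not name the interpolation scheme:

```python
def interpolate_elevation(x, y, tri_vertices):
    """Step A2/B2 sketch: linear interpolation of Z inside a TIN triangle.

    tri_vertices: sequence of the triangle's three (X, Y, Z) vertices.
    """
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri_vertices
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det  # barycentric weights
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * z1 + w2 * z2 + w3 * z3  # interpolated elevation value
```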
step A3, repeating the steps A1 to A2 one or more times, and respectively determining the three-dimensional coordinates of all Harris angular points of the aerial photographic image to obtain the three-dimensional coordinates of a plurality of reliable Harris angular points of the aerial photographic image;
when the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined, the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined according to the photo coordinates of the plurality of ground control points in the fourth step on the aerial photographic image, and the three-dimensional coordinates of the plurality of reliable control points are obtained;
when the three-dimensional coordinates of any one of the ground control points are determined, the ground control point is a current processing point, and the process is as follows:
step B1, judging the validity of the processing point: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes of the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering step B2; otherwise, discarding the current processed point;
in this step, the effective processing point is the reliable control point;
step B2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step B1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step B1 to obtain a three-dimensional coordinate of the current processing point;
step B3, repeating the steps B1 to B2 one or more times, and respectively determining the three-dimensional coordinates of the plurality of ground control points in the step four to obtain the three-dimensional coordinates of the plurality of reliable control points;
step 604, calculating a coordinate transformation matrix: calling a control point searching module, and searching the actual measurement three-dimensional coordinates of the reliable control points from the actual measurement three-dimensional coordinates of the ground control points in the step one; then, a coordinate transformation matrix calculation module is called to calculate a coordinate transformation matrix of the three-dimensional coordinates of the reliable control points obtained in the step B3 and the actual measurement three-dimensional coordinates of the reliable control points;
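The patent does not state the transformation model used in step 604; a 3D rigid transform estimated by the SVD-based (Kabsch/Horn) method over the corresponding control point pairs is one natural choice and is sketched below:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Step 604 sketch: transformation taking TIN-derived reliable control
    point coordinates (src) onto their field-measured coordinates (dst).

    src, dst: (N, 3) arrays of corresponding points, N >= 3.
    Returns R (3x3 rotation) and t (translation) with dst ≈ src @ R.T + t.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                # proper rotation, det(R) = +1
    t = cd - R @ cs
    return R, t
```

Step 605 then applies the estimated transform to every reliable Harris corner point, e.g. transformed = corners @ R.T + t.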
step 605, determining three-dimensional coordinates of Harris corner points of the image based on the coordinate transformation matrix: according to the coordinate transformation matrix calculated in the step 604, calling a coordinate transformation module to respectively perform coordinate transformation on the three-dimensional coordinates of the reliable Harris angular points of the aerial photographic image in the step A3, and calculating to obtain the three-dimensional coordinates of the reliable Harris angular points after coordinate transformation;
step 606, correcting three-dimensional coordinates of Harris corner points of the image based on the triangulation network: respectively correcting the three-dimensional coordinates of the reliable Harris angular points obtained in the step 605 according to the point cloud triangulation network constructed in the step 601;
when the three-dimensional coordinates of any one of the reliable Harris corner points obtained in step 605 are corrected, the reliable Harris corner point is a current correction point, and the process is as follows:
step C1, correction and judgment: finding out the triangle of the current correction point in the point cloud triangulation network constructed in the step 601 according to the three-dimensional coordinates of the current correction point obtained in the step 605, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes in the three vertexes is smaller than TH, judging that the three-dimensional coordinate of the current correction point needs to be corrected, and entering the step C2; otherwise, the three-dimensional coordinate of the current correction point does not need to be corrected, and the three-dimensional coordinate of the current correction point is the three-dimensional coordinate of the current correction point obtained in the step 605;
step C2, coordinate correction: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step C1 to obtain an elevation value of the current correction point, and replacing the elevation value in the three-dimensional coordinate of the current correction point obtained in the step 604 with the elevation value of the current correction point obtained by interpolation at the moment to obtain the three-dimensional coordinate of the current correction point after correction;
step C3, repeating the steps C1 to C2 one or more times, and respectively correcting the three-dimensional coordinates of the reliable Harris corner points to obtain corrected three-dimensional coordinates of the reliable Harris corner points;
the three-dimensional coordinates of the plurality of reliable Harris corner points corrected in the step C3 are the three-dimensional coordinates of the plurality of reliable Harris corner points of the reference image;
step seven, calculating the feature point photo coordinates of the images to be processed: using the data processing equipment, the feature point photo coordinates of each image to be processed from step six are calculated separately, obtaining the photo coordinates of the plurality of corresponding Harris corner points of each image to be processed;
the characteristic point photo coordinate calculation methods of all the images to be processed are the same;
when calculating the feature point photo coordinates of any image to be processed, calling a photo coordinate calculation module to calculate the photo coordinates of the reliable Harris corner points on the image to be processed respectively according to the three-dimensional coordinates of the reliable Harris corner points of the reference image obtained in step C3 and by combining the external orientation element of the image to be processed at this time, so as to obtain the photo coordinates of the corresponding Harris corner points of the image to be processed;
each corresponding Harris angular point is a pixel point of one reliable Harris angular point on the image to be processed, and the pixel coordinate of each reliable Harris angular point on the image to be processed is the coordinate of the pixel point of the reliable Harris angular point on the image to be processed;
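The projection in step seven follows the collinearity equations; a sketch, assuming the omega-phi-kappa rotation convention (axis conventions differ between systems) and a known principal distance f:

```python
import numpy as np

def ground_to_photo(P, eo, f):
    """Step seven sketch: photo coordinates of ground point P = (X, Y, Z)
    on an image with exterior orientation eo = (Xs, Ys, Zs, omega, phi,
    kappa), angles in radians, via the collinearity equations.
    """
    Xs, Ys, Zs, om, ph, kp = eo
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(om), -np.sin(om)],
                   [0, np.sin(om),  np.cos(om)]])
    Ry = np.array([[ np.cos(ph), 0, np.sin(ph)],
                   [0, 1, 0],
                   [-np.sin(ph), 0, np.cos(ph)]])
    Rz = np.array([[np.cos(kp), -np.sin(kp), 0],
                   [np.sin(kp),  np.cos(kp), 0],
                   [0, 0, 1]])
    R = Rx @ Ry @ Rz                                  # image-to-object rotation
    u, v, w = R.T @ (np.asarray(P, float) - np.array([Xs, Ys, Zs]))
    return -f * u / w, -f * v / w                     # photo coordinates (x, y)
```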
step eight, image matching: using the data processing equipment and calling a feature point finding module, the photo coordinates of the plurality of reliable Harris corner points from step A3 are found among the photo coordinates of the feature points extracted in step 602; the found photo coordinates are the photo coordinates of the plurality of reliable Harris corner points of the reference image. Then, according to these photo coordinates of the reference image and the photo coordinates of the corresponding Harris corner points of each image to be processed obtained in step seven, the data processing equipment calls an image matching module to perform image matching, obtaining all Harris corner points matched between the reference image and each image to be processed;
all Harris angular points matched between the obtained reference image and each image to be processed are matched control points;
step nine, set updating: adding the photo coordinates of all the matching control points obtained in the step eight on each aerial photographic image into the complete characteristic point set of the aerial photographic image in the step four by adopting the data processing equipment to obtain the updated complete characteristic point set of each aerial photographic image; meanwhile, adding the three-dimensional coordinates of the plurality of reliable Harris angular points corrected in the sixth step into the control point set at the moment to obtain an updated control point set;
step ten, beam method area network adjustment and exterior orientation element update: according to the exterior orientation elements of each aerial photographic image, the complete feature point set of each aerial photographic image from step nine, and the current control point set, the data processing equipment calls the beam method area network adjustment module to carry out the beam method area network adjustment, obtaining the adjusted exterior orientation elements of each aerial photographic image; a data updating module is then called to update the exterior orientation elements of each aerial photographic image to the adjusted values, completing one automatic registration pass of the point cloud data and the aerial photographic images;
eleventh, automatic registration end judgment: adopting the data processing equipment and calling a numerical value comparison module to respectively judge the correction values of three angle elements in the external orientation elements of each aerial photographic image after updating in the step ten: when the correction values of three angle elements in the external orientation elements of each aerial photographic image after updating are all smaller than a preset limit difference value, judging that the automatic registration process of the point cloud data and the plurality of aerial photographic images is completed, and outputting an automatic registration result by adopting the data processing equipment, wherein the automatic registration result is the external orientation element of each aerial photographic image after updating in the step ten; otherwise, adopting the data processing equipment to judge the automatic registration times;
when the data processing equipment is adopted to judge the automatic registration times, the data processing equipment is adopted to judge whether the automatic registration times completed at the moment reach the preset maximum registration times: when the number of times of automatic registration completed at this time reaches the preset maximum number of times of registration, judging that the automatic registration fails, and outputting an automatic registration result by the data processing equipment at this time, wherein the automatic registration result is the automatic registration failure; and otherwise, returning to the step six, and carrying out automatic registration on the point cloud data and the aerial photographic image for the next time.
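A sketch of the overall iteration control described in steps ten and eleven; register_once, max_iters, and limit are illustrative names, with register_once() standing for one pass of steps six through ten that returns the (Δω, Δφ, Δκ) corrections per image:

```python
def run_auto_registration(register_once, max_iters, limit):
    """Step eleven sketch: iterate the registration pass until the angle
    element corrections of every image fall below the preset limit, or
    declare failure once the maximum number of passes is reached.
    """
    for n in range(1, max_iters + 1):
        corrections = register_once()  # one pass of steps six to ten
        if all(abs(c) < limit for trio in corrections for c in trio):
            return True, n             # automatic registration completed
    return False, max_iters            # automatic registration failed
```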
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: in step A1, TL is 3 m and TH is 1 m.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: in step eleven, when the point cloud data and the aerial photographic images are automatically registered for the next time, the automatic registration is carried out according to the method of step six to step ten, after which step eleven is entered again for the automatic registration end judgment.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: before image preprocessing in the second step, setting the automatic registration times N by adopting the data processing equipment; at this time, N is 0;
after completing the one-time automatic registration process of the point cloud data and the aerial photographic image in the step ten, adding 1 to the automatic registration times N by adopting the data processing equipment;
the maximum registration times preset in the step eleven are recorded as Nmax(ii) a Wherein N ismaxIs a positive integer and Nmax≥3;
Step eleven, judging whether the number of times of the automatic registration completed at the moment reaches the preset maximum registration number by adopting the data processing equipment, and comparing the N and the N at the momentmaxAnd (3) comparing difference values: when N is more than or equal to NmaxJudging that the number of times of automatic registration completed at the moment reaches the preset maximum registration number; otherwise, judging that the number of times of the completed automatic registration does not reach the preset maximum registration number.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: in step five and step ten, when the beam method area network adjustment is carried out, the data processing equipment calls the beam method area network adjustment module to perform POS-assisted beam method area network adjustment.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: when the aerial photogrammetry is carried out on the measured area in the first step, the aerial photogrammetry is carried out by adopting a POS system for the aerial photogrammetry;
in the first step, the exterior orientation elements of the aerial photographic images are all the exterior orientation elements obtained by the POS system when the aerial photographic measurement is carried out on the measured area.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: before the feature point photo coordinates of the images to be processed are calculated in step seven, the data processing equipment performs corner validity judgment on each of the plurality of reliable Harris corner points of the reference image obtained in step C3;
the corner validity judgment methods of all reliable Harris corners are the same;
when the corner validity judgment is performed on any reliable Harris corner point of the reference image, a photo coordinate calculation module is called to calculate the photo coordinates of that reliable Harris corner point on the reference image, according to its three-dimensional coordinates obtained in step C3 combined with the current exterior orientation elements of the reference image; the calculated photo coordinates are recorded as (x*, y*). The data processing equipment then finds the photo coordinates of that reliable Harris corner point of the reference image among the photo coordinates found in step eight, recorded as (x^, y^). A numerical calculation module is then called to calculate, according to the formula
Δr = √(Δx² + Δy²),
the photo coordinate deviation Δr of the reliable Harris corner point, wherein Δx = x* − x^ and Δy = y* − y^. A difference comparison module is then called to judge whether Δr < Δt: when Δr < Δt, the reliable Harris corner point is judged to be a valid corner; otherwise, it is judged to be an invalid corner;
wherein, Δ t is a preset photo coordinate deviation judgment threshold;
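The deviation test reduces to a few lines; a minimal sketch:

```python
import math

def is_valid_corner(x_star, y_star, x_hat, y_hat, dt):
    """Corner validity judgment: reprojected photo coordinates (x*, y*)
    versus the matched photo coordinates (x^, y^) of the same reliable
    Harris corner point; keep the corner only if Δr < Δt.
    """
    dr = math.hypot(x_star - x_hat, y_star - y_hat)  # Δr = sqrt(Δx² + Δy²)
    return dr < dt
```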
removing all invalid corner points in the reliable Harris corner points of the reference image obtained in the step C3 by using the data processing equipment before calculating the feature point photo coordinates of any image to be processed;
seventhly, when the feature point photo coordinates of any image to be processed are calculated, all the reliable Harris angular points of the reference image are the effective angular points;
each Harris angular point is a pixel point of one effective angular point on the image to be processed;
the reliable Harris corner points added to the control point set at this time in step nine are all the effective corner points.
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: when the data processing equipment is adopted and an image matching module is called to perform image matching in the step eight, correlation coefficient calculation is performed on the plurality of reliable Harris angular points of the reference image and the plurality of corresponding Harris angular points of each image to be processed obtained in the step seven respectively, and all matching control points matched between the reference image and each image to be processed are found out according to the correlation coefficient calculation result;
when correlation coefficient calculation is performed on the plurality of reliable Harris corners of the reference image and the plurality of corresponding Harris corners of any image to be processed obtained in the seventh step, correlation coefficient calculation is performed on the plurality of reliable Harris corners of the reference image and the plurality of corresponding Harris corners of the image to be processed respectively, and all Harris corners matched between the reference image and the image to be processed are found out according to the calculation result of the correlation coefficients;
and after the plurality of reliable Harris angular points of the reference image and the plurality of corresponding Harris angular points of each image to be processed obtained in the seventh step are all subjected to correlation coefficient calculation, obtaining all Harris angular points matched between the reference image and each image to be processed, wherein the found Harris angular points are matching control points.
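A sketch of the correlation coefficient between two grey-value windows centred on candidate corners; the window size and acceptance threshold are not specified by the patent and are assumptions here:

```python
import numpy as np

def correlation_coefficient(patch_a, patch_b):
    """Normalized cross-correlation of two equal-sized image windows, the
    similarity measure for the step-eight corner matching.
    """
    a = patch_a.astype(float).ravel(); a -= a.mean()
    b = patch_b.astype(float).ravel(); b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

# A corner pair is accepted as a matching control point when, for example,
# correlation_coefficient(win_ref, win_tgt) > 0.8 (threshold assumed).
```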
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: after the elevation coordinate calculation module is called in step A2 and the elevation value of the current processing point is obtained by interpolation from the triangle found in step A1, an elevation value correction module is also called to correct the interpolated elevation value, and the process is as follows:
step A21, finding the nearest measuring point in the triangulation network: according to the interpolation, the elevation value of the current processing point and the ground coordinate of the current processing point obtained through conversion in the step A1 are obtained, a triangle where the current processing point is located is found out in the point cloud triangulation network constructed in the step 601, a measuring point closest to the current processing point is found out in the found triangle, and the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step A22, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step A21 with its elevation value before the correction; when the absolute value of their difference is smaller than ΔH, the elevation value correction is judged to be finished, and the elevation value corrected at this time is taken as the reliable elevation value of the current processing point; otherwise, entering step A23 for the next correction;
wherein, Δ H is a preset elevation difference judgment threshold;
step A23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinate of the current processing point converted in the step A1, finding out a triangle where the current processing point is located from the point cloud triangulation network constructed in the step 601, and finding out a measuring point closest to the current processing point from the found triangle, wherein the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step A24, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step A23 with its elevation value from the previous correction; when the absolute value of their difference is smaller than 0.001, the elevation value correction is judged to be finished, and the elevation value corrected at this time is taken as the reliable elevation value of the current processing point; otherwise, returning to step A23 for the next correction;
accordingly, in step A2, when the ground coordinates converted in step A1 are combined to obtain the three-dimensional coordinates of the current processing point, the three-dimensional coordinates of the current processing point are formed from the ground coordinates converted in step A1 together with the reliable elevation value of the current processing point.
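A sketch of the A21-A24 refinement loop; find_triangle is an assumed helper returning the (3, 3) vertex array of the TIN triangle containing the point at the current trial elevation:

```python
import numpy as np

def refine_elevation(x, y, z0, find_triangle, dH, eps=0.001):
    """Steps A21-A24 sketch: repeatedly snap the elevation to the nearest
    measuring point (triangle vertex) until consecutive values agree,
    within ΔH on the first comparison and within 0.001 thereafter.
    """
    z_prev, tol = z0, dH
    while True:
        verts = find_triangle(x, y, z_prev)
        d = np.hypot(verts[:, 0] - x, verts[:, 1] - y)  # horizontal distances
        z_new = verts[int(np.argmin(d))][2]             # nearest vertex elevation
        if abs(z_new - z_prev) < tol:
            return z_new                                # reliable elevation value
        z_prev, tol = z_new, eps
```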
The automatic registration method of the airborne LiDAR point cloud data and the aerial image is characterized by comprising the following steps: in step B2, an elevation coordinate calculation module is called, and after the elevation value of the current processing point is obtained by interpolating the triangle found in step B1, an elevation value correction module is also called to correct the interpolated elevation value, and the process is as follows:
step B21, finding the nearest measuring point in the triangulation network: according to the interpolation, the elevation value of the current processing point and the ground coordinate of the current processing point obtained through conversion in the step B1 are obtained, a triangle where the current processing point is located is found out in the point cloud triangulation network constructed in the step 601, a measuring point closest to the current processing point is found out in the found triangle, and the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step B22, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step B21 with its elevation value before the correction; when the absolute value of their difference is smaller than ΔH, the elevation value correction is judged to be finished, and the elevation value corrected at this time is taken as the reliable elevation value of the current processing point; otherwise, entering step B23 for the next correction;
step B23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinate of the current processing point converted in the step B1, finding out a triangle where the current processing point is located from the point cloud triangulation network constructed in the step 601, and finding out a measuring point closest to the current processing point from the found triangle, wherein the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step B24, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step B23 with its elevation value from the previous correction; when the absolute value of their difference is smaller than 0.001, the elevation value correction is judged to be finished, and the elevation value corrected at this time is taken as the reliable elevation value of the current processing point; otherwise, returning to step B23 for the next correction;
accordingly, in step B2, when the ground coordinates converted in step B1 are combined to obtain the three-dimensional coordinates of the current processing point, the three-dimensional coordinates of the current processing point are formed from the ground coordinates converted in step B1 together with the reliable elevation value of the current processing point.
Compared with the prior art, the invention has the following advantages:
1. the method has simple steps, convenient realization and lower input cost.
2. The degree of automation is high, and the registration process is easy to control.
3. The method works well and has high practical value. The precision of the image connection points is high: feature points extracted from the aerial images are matched directly against the point cloud data, so the accuracy of the resulting connection points is guaranteed, with point positions accurate to within 1 pixel and evenly distributed, meeting the connection point requirements for registering point cloud data with aerial images and thereby effectively improving registration accuracy. Because the field-measured three-dimensional coordinates of a plurality of ground control points are also used, field control points assist the registration, making the method suitable for different terrains and giving it a wide application range. In addition, the automatic registration of the point cloud data with the aerial images requires no complex registration model and little computation, so the method can be effectively applied to the registration of massive remote sensing data, further widening its applicability, with significant economic and social benefits.
In conclusion, the method has the advantages of simple steps, reasonable design, convenience in implementation and good use effect, the extracted characteristic points of the aerial image are matched with the point cloud data to achieve automatic registration of airborne LiDAR point cloud data and the aerial image, the registration precision can be effectively improved, the calculated amount is small, meanwhile, registration is performed by combining field actual measurement three-dimensional coordinates of a plurality of ground control points, and the method can be effectively applied to automatic registration processes of point cloud data and aerial images of different terrains. On the premise of not needing to interpolate all point cloud data, the method for registering the point cloud data and the image is good in robustness and high in accuracy.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a block diagram of the process flow of the present invention.
Detailed Description
As shown in FIG. 1, a method for automatically registering airborne LiDAR point cloud data with aerial images includes the following steps:
the method comprises the following steps of firstly, point cloud data and aerial image acquisition and control point field actual measurement: acquiring point cloud data of an area to be measured by adopting an airborne LiDAR measuring system, and transmitting the acquired point cloud data to data processing equipment; the point cloud data comprises a plurality of measuring points of a region to be measured and three-dimensional coordinates of each measuring point;
meanwhile, a plurality of ground control points are distributed in a measured area, and field actual measurement is carried out on the three-dimensional coordinates of each ground control point to obtain the actual measurement three-dimensional coordinates of each ground control point; then, carrying out aerial photogrammetry on the measured area by utilizing the distributed ground control points, shooting a plurality of aerial photographic images of the measured area, and synchronously transmitting the obtained plurality of aerial photographic images to the data processing equipment; each aerial photographic image is a digital image and is a two-dimensional image;
when the aerial photogrammetry is carried out on the measured area, the external orientation element of each aerial photographic image is obtained, and the obtained external orientation element of each aerial photographic image is synchronously transmitted to the data processing equipment; in the step, the exterior orientation elements of the aerial photographic images are the initial exterior orientation elements of the aerial photographic images;
step two, image preprocessing: denoising and filtering the plurality of aerial photographic images in the step one by adopting the data processing equipment to obtain a plurality of preprocessed aerial photographic images;
step three, image matching: adopting the data processing equipment and calling an image matching module to perform image matching on the plurality of aerial photographic images preprocessed in the step two to obtain all feature points matched with each other among the plurality of aerial photographic images; the obtained characteristic points are image connection points which are matched with the plurality of aerial photographic images;
in the step, all the feature points on each acquired aerial photographic image form a feature point set of the aerial photographic image, and the feature point set of each aerial photographic image comprises the image coordinates of all the feature points on the aerial photographic image obtained by matching in the step;
step four, acquiring coordinates of the control point photo: according to the actually measured three-dimensional coordinates of the ground control points in the first step, respectively calculating the photo coordinates of the aerial photographic images preprocessed by the ground control points in the second step by adopting the data processing equipment and calling a photo coordinate calculation module to obtain the photo coordinates of the ground control points on the aerial photographic images; adding the photo coordinates of the plurality of ground control points on each aerial photographic image into the feature point set of the aerial photographic image in the third step to obtain a complete feature point set of each aerial photographic image;
step five, initial beam method area network adjustment and exterior orientation element update: according to the initial exterior orientation elements of each aerial photographic image from step one, the complete feature point set of each aerial photographic image from step four, and the measured three-dimensional coordinates of the plurality of ground control points from step one, the data processing equipment calls a beam method area network adjustment module to carry out the beam method area network adjustment, obtaining the adjusted exterior orientation elements of each aerial photographic image; a data updating module is then called to update the exterior orientation elements of each aerial photographic image to the adjusted exterior orientation elements obtained at this time;
in the first step, the measured three-dimensional coordinates of a plurality of ground control points form a control point set;
step six, matching the point cloud data with the aerial image feature points: using the data processing equipment, the point cloud data from step one are matched with any one of the aerial photographic images preprocessed in step two; this aerial photographic image is the reference image, and all remaining aerial photographic images preprocessed in step two, other than the reference image, are the images to be processed;
when the point cloud data and the reference image are matched, the process is as follows:
step 601, constructing a triangulation network: calling a triangulation network construction module to construct a triangulation network according to the point cloud data in the first step; the constructed triangulation network is a point cloud triangulation network;
step 602, extracting a Harris corner: calling a Harris angular point detection module to extract characteristic points of the aerial photographic image, and recording the photo coordinates of the extracted characteristic points; the extracted characteristic points are Harris angular points;
step 603, determining three-dimensional coordinates of an image Harris corner point and a ground control point based on a triangulation network: respectively determining three-dimensional coordinates of all Harris angular points of the aerial photographic image in the step 602 and a plurality of ground control points in the step four according to the point cloud triangulation network in the step 601;
when three-dimensional coordinates of all Harris angular points of the aerial photographic image are determined, the three-dimensional coordinates of all Harris angular points are respectively determined according to the photo coordinates of all Harris angular points on the aerial photographic image, and three-dimensional coordinates of a plurality of reliable Harris angular points are obtained;
when the three-dimensional coordinates of any Harris angular point on the aerial photographic image are determined, the Harris angular point is a current processing point, and the process is as follows:
step A1, effective judgment of processing points: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes in the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering the step A2; otherwise, discarding the current processed point;
TL is a preset triangle side length judgment threshold; TH is a preset triangle top point height difference judgment threshold; the effective processing point is the reliable Harris angular point;
step A2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step A1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step A1 to obtain a three-dimensional coordinate of the current processing point;
step A3, repeating the steps A1 to A2 one or more times, and respectively determining the three-dimensional coordinates of all Harris angular points of the aerial photographic image to obtain the three-dimensional coordinates of a plurality of reliable Harris angular points of the aerial photographic image;
when the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined, the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined according to the photo coordinates of the plurality of ground control points in the fourth step on the aerial photographic image, and the three-dimensional coordinates of the plurality of reliable control points are obtained;
when the three-dimensional coordinates of any one of the ground control points are determined, the ground control point is a current processing point, and the process is as follows:
step B1, judging the validity of the processing point: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes of the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering step B2; otherwise, discarding the current processed point;
in this step, the effective processing point is the reliable control point;
step B2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step B1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step B1 to obtain a three-dimensional coordinate of the current processing point;
step B3, repeating the steps B1 to B2 one or more times, and respectively determining the three-dimensional coordinates of the plurality of ground control points in the step four to obtain the three-dimensional coordinates of the plurality of reliable control points;
step 604, calculating a coordinate transformation matrix: calling a control point searching module, and searching the actual measurement three-dimensional coordinates of the reliable control points from the actual measurement three-dimensional coordinates of the ground control points in the step one; then, a coordinate transformation matrix calculation module is called to calculate a coordinate transformation matrix of the three-dimensional coordinates of the reliable control points obtained in the step B3 and the actual measurement three-dimensional coordinates of the reliable control points;
step 605, determining three-dimensional coordinates of Harris corner points of the image based on the coordinate transformation matrix: according to the coordinate transformation matrix calculated in the step 604, calling a coordinate transformation module to respectively perform coordinate transformation on the three-dimensional coordinates of the reliable Harris angular points of the aerial photographic image in the step A3, and calculating to obtain the three-dimensional coordinates of the reliable Harris angular points after coordinate transformation;
step 606, correcting three-dimensional coordinates of Harris corner points of the image based on the triangulation network: respectively correcting the three-dimensional coordinates of the reliable Harris angular points obtained in the step 605 according to the point cloud triangulation network constructed in the step 601;
when the three-dimensional coordinates of any one of the reliable Harris corner points obtained in step 605 are corrected, the reliable Harris corner point is a current correction point, and the process is as follows:
step C1, correction and judgment: finding out the triangle of the current correction point in the point cloud triangulation network constructed in the step 601 according to the three-dimensional coordinates of the current correction point obtained in the step 605, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes in the three vertexes is smaller than TH, judging that the three-dimensional coordinate of the current correction point needs to be corrected, and entering the step C2; otherwise, the three-dimensional coordinate of the current correction point does not need to be corrected, and the three-dimensional coordinate of the current correction point is the three-dimensional coordinate of the current correction point obtained in the step 605;
step C2, coordinate correction: calling an elevation coordinate calculation module, interpolating with the triangle found in step C1 to obtain an elevation value of the current correction point, and replacing the elevation value in the three-dimensional coordinates of the current correction point obtained in step 605 with this interpolated elevation value, thereby obtaining the corrected three-dimensional coordinates of the current correction point;
step C3, repeating the steps C1 to C2 one or more times, and respectively correcting the three-dimensional coordinates of the reliable Harris corner points to obtain corrected three-dimensional coordinates of the reliable Harris corner points;
the three-dimensional coordinates of the plurality of reliable Harris corner points corrected in the step C3 are the three-dimensional coordinates of the plurality of reliable Harris corner points of the reference image;
step seven, calculating the feature point photo coordinates of the images to be processed: respectively calculating the feature point photo coordinates of each image to be processed in step six by adopting the data processing equipment, so as to obtain the photo coordinates of a plurality of corresponding Harris corners of each image to be processed;
the characteristic point photo coordinate calculation methods of all the images to be processed are the same;
when calculating the feature point photo coordinates of any image to be processed, calling a photo coordinate calculation module to calculate the photo coordinates of the reliable Harris corner points on the image to be processed respectively according to the three-dimensional coordinates of the reliable Harris corner points of the reference image obtained in step C3 and by combining the external orientation element of the image to be processed at this time, so as to obtain the photo coordinates of the corresponding Harris corner points of the image to be processed;
each corresponding Harris angular point is a pixel point of one reliable Harris angular point on the image to be processed, and the pixel coordinate of each reliable Harris angular point on the image to be processed is the coordinate of the pixel point of the reliable Harris angular point on the image to be processed;
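For illustration, a minimal Python sketch of the forward projection that the photo coordinate calculation module of step seven performs, using the standard collinearity equations with the rotation-matrix convention given later in this description; the function name is an assumption.

```python
import numpy as np

def ground_to_photo(P, exposure_centre, R, f):
    """Project a ground point P = (X, Y, Z) to photo coordinates (x, y)
    with the standard collinearity equations, given the exposure centre
    (Xs, Ys, Zs), the rotation matrix R and the focal length f."""
    d = np.asarray(P, float) - np.asarray(exposure_centre, float)
    u = np.asarray(R, float).T @ d            # image-space direction vector
    return -f * u[0] / u[2], -f * u[1] / u[2]
```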
step eight, image matching: finding photo coordinates of the plurality of reliable Harris corner points in the step a3 from the photo coordinates of each feature point extracted in the step 602 by using the data processing equipment and calling a feature point finding module, wherein the found photo coordinates of the plurality of reliable Harris corner points are the photo coordinates of the plurality of reliable Harris corner points of the reference image; then, according to the photo coordinates of the reliable Harris angular points of the reference image and the photo coordinates of the corresponding Harris angular points of the images to be processed obtained in the seventh step, image matching is carried out by adopting the data processing equipment and calling an image matching module, and all Harris angular points matched between the reference image and the images to be processed are obtained;
all Harris angular points matched between the obtained reference image and each image to be processed are matched control points;
step nine, set updating: adding the photo coordinates of all the matching control points obtained in the step eight on each aerial photographic image into the complete characteristic point set of the aerial photographic image in the step four by adopting the data processing equipment to obtain the updated complete characteristic point set of each aerial photographic image; meanwhile, adding the three-dimensional coordinates of the plurality of reliable Harris angular points corrected in the sixth step into the control point set at the moment to obtain an updated control point set;
step ten, updating the adjustment of the area network and the exterior orientation elements by using a beam method: according to the exterior orientation elements of the aerial photographic images, the complete characteristic point set of the aerial photographic images in the ninth step and the control point set at the moment, carrying out beam method area network adjustment by adopting the data processing equipment and calling a beam method area network adjustment module to obtain exterior orientation elements of the aerial photographic images after adjustment; then calling a data updating module, updating the external orientation element of each aerial photographic image into the external orientation element of the obtained aerial photographic image after adjustment, obtaining the external orientation element of each updated aerial photographic image, and finishing the one-time automatic registration process of the point cloud data and the aerial photographic image;
eleventh, automatic registration end judgment: adopting the data processing equipment and calling a numerical value comparison module to respectively judge the correction values of three angle elements in the external orientation elements of each aerial photographic image after updating in the step ten: when the correction values of three angle elements in the external orientation elements of each aerial photographic image after updating are all smaller than a preset limit difference value, judging that the automatic registration process of the point cloud data and the plurality of aerial photographic images is completed, and outputting an automatic registration result by adopting the data processing equipment, wherein the automatic registration result is the external orientation element of each aerial photographic image after updating in the step ten; otherwise, adopting the data processing equipment to judge the automatic registration times;
when the data processing equipment is adopted to judge the number of automatic registrations, the data processing equipment judges whether the number of automatic registrations completed at that moment has reached the preset maximum number of registrations: when it has, the automatic registration is judged to have failed, and the data processing equipment outputs an automatic registration result indicating that the automatic registration failed; otherwise, the process returns to step six for the next round of automatic registration of the point cloud data and the aerial photographic images. The tolerance used to judge the correction value of each angle element can be set according to specific requirements, so the method is flexible in use; this tolerance is the conventional tolerance for angle element correction values.
When it is judged in step eleven that the automatic registration process of the point cloud data and the plurality of aerial photographic images is completed, the residual errors between the control points on the aerial photographic images and the corresponding control points in the point cloud data must also meet the tolerance requirement, so that the automatic registration requirement of the point cloud data and the aerial photographic images is satisfied.
The correction values of the three angle elements in the updated exterior orientation elements of each aerial photographic image in step ten are judged by a conventional judgment method, and the value of the tolerance can be determined with reference to GB/T 23236-2009, the specification for aerial triangulation in digital aerial photogrammetry.
In this embodiment, when the ground coordinate conversion module is called in step A1 and step B1 to convert the photo coordinates of the current processing point into ground coordinates according to the exterior orientation elements of the aerial photographic image at that moment, the conversion is performed according to the inverse collinearity (monoplotting) equations

$$X = X_S + (Z - Z_S)\,\frac{a_1 x + a_2 y - a_3 f}{c_1 x + c_2 y - c_3 f}, \qquad Y = Y_S + (Z - Z_S)\,\frac{b_1 x + b_2 y - b_3 f}{c_1 x + c_2 y - c_3 f}$$

which yield the ground coordinates (X, Y) of the current processing point on the aerial photographic image;

wherein (X_S, Y_S, Z_S) are the line elements of the exterior orientation elements of the aerial photographic image; f, a parameter of the interior orientation elements of the aerial photographic image, is the focal length of the aerial camera used for the aerial photogrammetry of the measured area in step one; (x, y) are the photo coordinates of the current processing point on the aerial photographic image in this step; Z is the mean ground height of the measured area;

and a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the elements of the rotation matrix of the aerial photographic image

$$R = \begin{pmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{pmatrix},$$

composed of the three angle elements of the exterior orientation elements.
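A matching Python sketch of this photo-to-ground conversion (monoplotting at the mean ground height Z), consistent with the equations above; the function name is illustrative.

```python
import numpy as np

def photo_to_ground(xy, exposure_centre, R, f, Z):
    """Invert the collinearity equations at a fixed elevation Z: map photo
    coordinates (x, y) to ground coordinates (X, Y), per the formula above."""
    x, y = xy
    Xs, Ys, Zs = exposure_centre
    v = np.asarray(R, float) @ np.array([x, y, -f])  # (a·p, b·p, c·p)
    scale = (Z - Zs) / v[2]
    return Xs + scale * v[0], Ys + scale * v[1]
```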
At present, when an aerial image (i.e., an aerial photographic image) and point cloud data are registered, the conventional method matches them directly; however, the point cloud data are discrete while the aerial image is continuous, so homonymous points or homonymous lines are difficult to find, and even when found, their accuracy is not high.
In addition, when the conventional method registers an aerial image (i.e., an aerial photographic image) with point cloud data, the image connection points used are only the photo coordinates of control points; when the control points are matched, the errors of the exterior orientation elements, the errors of the point cloud data, the errors of the registration model, and similar influences make the accuracy of the photo connection points (also called image connection points) low and their distribution non-uniform, which reduces the registration accuracy. A control point here is a ground control point arranged in advance in the measured area.
In the invention, the matching control point found in the step eight is an image connection point.
When the point cloud data and the aerial photographic images are automatically registered according to the method of steps six to ten, the image connection points directly adopt the feature points (i.e., Harris corners) extracted from the aerial photographic images, and the extracted feature points are fused and matched with the point cloud data according to the method of steps six to eight, so the accuracy of the obtained connection points is guaranteed. Because the image connection points use the matched feature points of the aerial photographic images, the connection point accuracy is high, the point position accuracy is within 1 pixel, the points are uniformly distributed, and the connection point requirement for registering the point cloud data with the aerial images is met.
In addition, when the extracted feature points and the point cloud data are subjected to fusion matching in the sixth step to the eighth step, the actually measured three-dimensional coordinates of the plurality of ground control points in the first step are utilized, so that errors can be further reduced, and the registration accuracy is further improved.
Therefore, when the point cloud data and the aerial photographic images are automatically registered according to the method of steps six to ten, steps six to ten form a matching and adjustment method that runs automatically, with good robustness and high registration accuracy; it thoroughly changes the existing registration methods for aerial images (i.e., aerial photographic images) and point cloud data and can effectively solve the various problems existing in those methods.
On the other hand, the registration model used in the existing registration method of point cloud data and aerial images is only suitable for urban areas; its control points are not uniformly distributed, so the registration accuracy is reduced.
When the point cloud data and the aerial photography image are automatically registered according to the method in the sixth step to the tenth step, the actual measurement three-dimensional coordinates of the plurality of ground control points are used for registration, so that field measurement control points assist in registration, and the method is suitable for different terrains and has a wider application range.
In addition, when the point cloud data and the aerial photography image are automatically registered, a complex registration model does not need to be established, and the calculation amount is small, so that the method can be effectively suitable for registering mass remote sensing data, and the application range is further widened.
The existing registration method of point cloud data and aerial images uses a complex registration model, has large calculation amount, and can only perform registration of a small amount of data.
The registration method adopted by the invention has the advantages of very high registration precision, good robustness and wide application range.
In this embodiment, TL is 3 m and TH is 1 m in step A1.
In actual use, the values of TL and TH can each be adjusted accordingly according to specific requirements.
In this embodiment, when the next automatic registration is performed on the point cloud data and the aerial photographic image in the eleventh step, the point cloud data and the aerial photographic image are automatically registered according to the method described in the sixth step to the tenth step, and then the eleventh step is performed to perform the judgment of the end of the automatic registration.
For convenience of operation, before the image preprocessing in step two, the data processing equipment is adopted to set an automatic registration counter N, initialized to N = 0;
after completing the one-time automatic registration process of the point cloud data and the aerial photographic image in the step ten, adding 1 to the automatic registration times N by adopting the data processing equipment;
the maximum number of registrations preset in step eleven is recorded as N_max, wherein N_max is a positive integer and N_max ≥ 3;
in step eleven, when the data processing equipment judges whether the number of automatic registrations completed at that moment has reached the preset maximum number of registrations, N is compared with N_max: when N ≥ N_max, it is judged that the number of completed automatic registrations has reached the preset maximum number; otherwise, it is judged that it has not.
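The overall iteration of steps six to eleven can be summarized by the sketch below; run_one_registration_pass is a hypothetical stand-in for steps six to ten, and the tolerance and N_max values are placeholders, not values prescribed by this description.

```python
import numpy as np

N_MAX = 5      # preset maximum number of registrations (N_max >= 3); 5 is arbitrary
TOL = 1e-4     # tolerance for angle-element corrections (placeholder value, radians)

def run_one_registration_pass(eo):
    """Hypothetical stand-in for steps six to ten; here it merely perturbs
    the angle elements so the loop structure can be demonstrated."""
    return eo + np.random.normal(scale=5e-5, size=eo.shape)

eo = np.zeros((10, 3))          # three angle elements for each of 10 images (dummy data)
for N in range(1, N_MAX + 1):
    eo_new = run_one_registration_pass(eo)
    if np.all(np.abs(eo_new - eo) < TOL):   # every angle-element correction within tolerance
        print(f"automatic registration finished after {N} pass(es)")
        break
    eo = eo_new
else:
    print("automatic registration failed: N reached N_max")
```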
When actually performing registration, both in step five and in step ten, the data processing equipment performs the adjustment according to the conventional beam method area network adjustment method, and the exterior orientation elements of each aerial photographic image after adjustment are obtained through the adjustment solution.
In this embodiment, when the beam method area network adjustment is performed in step five and step ten, POS-assisted beam method area network adjustment is performed by adopting the data processing equipment and calling the beam method area network adjustment module. The adjustment method actually adopted is the conventional POS-assisted beam method area network adjustment method.
In the embodiment, when the aerial photogrammetry is performed on the measured area in the first step, the aerial photogrammetry is performed by adopting a POS system for the aerial photogrammetry;
in the first step, the exterior orientation elements of the aerial photographic images are all the exterior orientation elements obtained by the POS system when the aerial photographic measurement is carried out on the measured area.
The POS system is a conventional POS system for aerial photogrammetry: an aerial photographic navigation system that directly measures the exterior orientation elements of an aerial photo using a Global Positioning System (GPS) and an Inertial Measurement Unit (IMU). POS is the abbreviation of Position and Orientation System.
In order to further improve the registration accuracy, before the image coordinate calculation of the feature points of the image to be processed is performed in the seventh step, the data processing equipment is adopted to respectively perform corner validity judgment on the plurality of reliable Harris corners of the reference image obtained in the step C3;
the corner validity judgment methods of all reliable Harris corners are the same;
when the corner validity judgment is performed on any one of the reliable Harris corners of the reference image, according to the three-dimensional coordinates of that reliable Harris corner of the reference image obtained in step C3 and in combination with the exterior orientation elements of the reference image at that moment, the photo coordinate calculation module is called to calculate the photo coordinates of the reliable Harris corner on the reference image, and the calculated photo coordinates are recorded as (x*, y*); then the data processing equipment is adopted to find the photo coordinates of that reliable Harris corner from the photo coordinates of the reliable Harris corners of the reference image in step eight, and the found photo coordinates are recorded as (x^, y^); then the numerical calculation module is called and, according to the formula

$$\Delta r = \sqrt{\Delta x^{2} + \Delta y^{2}}$$

the photo coordinate deviation value Δr of the reliable Harris corner is calculated, wherein Δx = x* − x^ and Δy = y* − y^; the difference comparison module is then called to judge whether Δr is smaller than Δt: when Δr < Δt, the reliable Harris corner is judged to be a valid corner; otherwise, it is judged to be an invalid corner;
wherein, Δ t is a preset photo coordinate deviation judgment threshold;
removing all invalid corner points in the reliable Harris corner points of the reference image obtained in the step C3 by using the data processing equipment before calculating the feature point photo coordinates of any image to be processed;
seventhly, when the feature point photo coordinates of any image to be processed are calculated, all the reliable Harris angular points of the reference image are the effective angular points;
each Harris angular point is a pixel point of one effective angular point on the image to be processed;
the reliable Harris corner points added to the control point set at this time in step nine are all the effective corner points.
Here Δt ≤ 5 pixels; Δt may be 1, 2, 3, 4 or 5 pixels. In this embodiment, Δt is 3 pixels. In actual use, the value of Δt can be adjusted accordingly according to specific requirements.
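A short sketch of the corner-validity screening just described, assuming the back-projected and extracted photo coordinates are collected into arrays; names are illustrative.

```python
import numpy as np

DELTA_T = 3.0   # photo coordinate deviation threshold in pixels (this embodiment)

def valid_corner_mask(projected_xy, extracted_xy, delta_t=DELTA_T):
    """Keep a reliable Harris corner only when the deviation between its
    back-projected photo coordinates (x*, y*) and its extracted photo
    coordinates (x^, y^) satisfies delta_r = sqrt(dx^2 + dy^2) < delta_t."""
    diff = np.asarray(projected_xy, float) - np.asarray(extracted_xy, float)
    delta_r = np.hypot(diff[:, 0], diff[:, 1])
    return delta_r < delta_t    # boolean mask: True = valid corner
```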
In this embodiment, when the data processing device is used in the step eight and an image matching module is called to perform image matching, correlation coefficient calculation is performed on the plurality of reliable Harris angular points of the reference image and the plurality of corresponding Harris angular points of each image to be processed obtained in the step seven, and all matching control points matched between the reference image and each image to be processed are found according to a correlation coefficient calculation result;
when correlation coefficient calculation is performed on the plurality of reliable Harris corners of the reference image and the plurality of corresponding Harris corners of any image to be processed obtained in the seventh step, correlation coefficient calculation is performed on the plurality of reliable Harris corners of the reference image and the plurality of corresponding Harris corners of the image to be processed respectively, and all Harris corners matched between the reference image and the image to be processed are found out according to the calculation result of the correlation coefficients;
and after the plurality of reliable Harris angular points of the reference image and the plurality of corresponding Harris angular points of each image to be processed obtained in the seventh step are all subjected to correlation coefficient calculation, obtaining all Harris angular points matched between the reference image and each image to be processed, wherein the found Harris angular points are matching control points.
Therefore, when the data processing equipment is adopted in step eight and the image matching module is called for image matching, matching with the correlation coefficient method can simply and quickly find all matched Harris corners between the reference image and each image to be processed. The implementation is also simple and convenient: only the Harris corner matching between the reference image and each image to be processed needs to be completed, rather than between every pair of images to be processed among all the aerial photographic images, which not only ensures the matching accuracy but also effectively reduces the computational complexity.
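As an illustration of the correlation coefficient matching, the sketch below computes the normalized correlation coefficient between image windows centred on two candidate corners; the window half-size is an assumption.

```python
import numpy as np

def correlation_coefficient(img_a, pt_a, img_b, pt_b, half=7):
    """Normalized correlation coefficient (in [-1, 1]) between the
    (2*half+1) x (2*half+1) windows centred on pt_a in img_a and pt_b in img_b."""
    (ra, ca), (rb, cb) = pt_a, pt_b
    wa = img_a[ra - half:ra + half + 1, ca - half:ca + half + 1].astype(float)
    wb = img_b[rb - half:rb + half + 1, cb - half:cb + half + 1].astype(float)
    wa -= wa.mean(); wb -= wb.mean()
    denom = np.sqrt((wa * wa).sum() * (wb * wb).sum())
    return (wa * wb).sum() / denom if denom > 0 else 0.0
```

A pair of corners would then be accepted as a matching control point when the coefficient exceeds a chosen threshold, in the spirit of the 0.9 threshold used for the pyramid matching later in this description.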
Moreover, when the data processing equipment is adopted in step eight and the image matching module is called for image matching, the data processing equipment performs least-squares correction of the remaining images to be processed against the reference image, so that when the images are matched in step eight, all matching control points between the reference image and each image to be processed are found according to the correlation coefficient calculation results, realizing least-squares correction of the plurality of aerial photographic images.
In this embodiment, in step 604, the coordinate transformation matrix calculation module is called to calculate the coordinate transformation matrix between the three-dimensional coordinates of the plurality of reliable control points obtained in step B3 and their found actually measured three-dimensional coordinates, and the coordinate transformation matrix is calculated using the Iterative Closest Point (ICP) algorithm.
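Since the correspondence between each reliable control point and its field-measured coordinates is already known here, the core least-squares solve of each ICP iteration can be written directly; a sketch assuming a rotation-plus-translation model without scale:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with R @ src + t ≈ dst
    (the solve performed inside each ICP iteration); src, dst: (n, 3)
    arrays of corresponding points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```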
In actual use, other methods can be adopted to calculate the coordinate transformation matrix.
The exterior orientation elements are the parameters that determine the spatial position and attitude of the photographic beam at the moment of exposure, on the basis of the restoration of the interior orientation elements (i.e., the restoration of the photographic beam). The exterior orientation elements of one photo comprise six parameters: three are line elements, describing the spatial coordinates of the exposure center; the other three are angle elements, describing the spatial attitude of the photo.
Among the correction values of the three angle elements in the updated exterior orientation elements of each aerial photographic image in step ten, the correction value of each angle element is the absolute value of the difference between the value of that angle element in the exterior orientation elements of the aerial photographic image before the update in step ten and its value after the update in step ten.
In this embodiment, after the elevation coordinate calculation module is called in step a2 and the elevation value of the current processing point is obtained by interpolating the triangle found in step a1, the elevation value correction module needs to be called to correct the interpolated elevation value, and the process is as follows:
step A21, finding the nearest measuring point in the triangulation network: according to the interpolation, the elevation value of the current processing point and the ground coordinate of the current processing point obtained through conversion in the step A1 are obtained, a triangle where the current processing point is located is found out in the point cloud triangulation network constructed in the step 601, a measuring point closest to the current processing point is found out in the found triangle, and the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step A22, elevation value correction finishing judgment: comparing the elevation value of the current processing point corrected in step A21 with its elevation value before the correction in step A21; when the absolute value of their difference is smaller than ΔH, the elevation value correction is judged to be finished, and the elevation value of the current processing point corrected at this moment is taken as the reliable elevation value of the current processing point; otherwise, proceed to step A23 for the next correction;
wherein, Δ H is a preset elevation difference judgment threshold;
step A23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinate of the current processing point converted in the step A1, finding out a triangle where the current processing point is located from the point cloud triangulation network constructed in the step 601, and finding out a measuring point closest to the current processing point from the found triangle, wherein the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step A24, elevation value correction finishing judgment: comparing the elevation value of the current processing point corrected in step A23 with its elevation value from the previous correction; when the absolute value of their difference is less than 0.001, the elevation value correction is judged to be finished, and the elevation value of the current processing point corrected at this moment is taken as the reliable elevation value of the current processing point; otherwise, return to step A23 for the next correction;
the "obtaining the three-dimensional coordinates of the current processing point by combining the ground coordinates converted in step A1" in step A2 means obtaining the three-dimensional coordinates of the current processing point from the ground coordinates of the current processing point converted in step A1 together with the reliable elevation value of the current processing point.
In this example, Δ H is 0.001.
In actual use, the value of Δ H can be adjusted accordingly according to specific needs.
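The iteration of steps A21 to A24 (and B21 to B24 below) is essentially a fixed-point refinement of the elevation. The following simplified sketch searches for the nearest measured point over the whole cloud with a kd tree instead of within the located TIN triangle; that shortcut, and the function names, are assumptions of the sketch.

```python
import numpy as np
from scipy.spatial import cKDTree

DELTA_H = 0.001  # elevation-difference threshold ΔH of this embodiment

def refine_elevation(xy, z0, cloud, delta_h=DELTA_H, max_iter=50):
    """Iterative correction of steps A21-A24: repeatedly replace the current
    elevation with that of the nearest measured point until the change falls
    below delta_h."""
    cloud = np.asarray(cloud, float)         # (n, 3) measured points
    tree = cKDTree(cloud)
    z = float(z0)                            # interpolated elevation from step A2
    for _ in range(max_iter):
        _, idx = tree.query([xy[0], xy[1], z])
        z_new = cloud[idx, 2]                # elevation of the nearest measured point
        if abs(z_new - z) < delta_h:
            return z_new                     # reliable elevation value
        z = z_new
    return z
```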
Correspondingly, in order to further improve the accuracy of the three-dimensional coordinates of the feature points and further improve the connection precision and the registration precision, in this embodiment, after the elevation coordinate calculation module is called in step B2 and the elevation value of the current processing point is obtained by interpolating the triangle found in step B1, an elevation value correction module needs to be called to correct the interpolated elevation value, and the process is as follows:
step B21, finding the nearest measuring point in the triangulation network: according to the interpolation, the elevation value of the current processing point and the ground coordinate of the current processing point obtained through conversion in the step B1 are obtained, a triangle where the current processing point is located is found out in the point cloud triangulation network constructed in the step 601, a measuring point closest to the current processing point is found out in the found triangle, and the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step B22, elevation value correction finishing judgment: comparing the elevation value of the current processing point corrected in step B21 with its elevation value before the correction in step B21; when the absolute value of their difference is smaller than ΔH, the elevation value correction is judged to be finished, and the elevation value of the current processing point corrected at this moment is taken as the reliable elevation value of the current processing point; otherwise, proceed to step B23 for the next correction;
step B23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinate of the current processing point converted in the step B1, finding out a triangle where the current processing point is located from the point cloud triangulation network constructed in the step 601, and finding out a measuring point closest to the current processing point from the found triangle, wherein the found measuring point is one measuring point in the measuring point cloud data; then taking the elevation value of the found measuring point as the corrected elevation value of the current processing point;
step B24, elevation value correction finishing judgment: comparing the elevation value of the current processing point corrected in step B23 with its elevation value from the previous correction; when the absolute value of their difference is less than 0.001, the elevation value correction is judged to be finished, and the elevation value of the current processing point corrected at this moment is taken as the reliable elevation value of the current processing point; otherwise, return to step B23 for the next correction;
the "obtaining the three-dimensional coordinates of the current processing point by combining the ground coordinates converted in step B1" in step B2 means obtaining the three-dimensional coordinates of the current processing point from the ground coordinates of the current processing point converted in step B1 together with the reliable elevation value of the current processing point.
In this embodiment, in the first step, an airborne LiDAR measurement system is used to acquire point cloud data of an area to be measured, and the point cloud data acquisition method is a conventional point cloud data acquisition method. The onboard LiDAR measurement system is in bidirectional communication with the data processing device, and the POS system is in bidirectional communication with the data processing device.
The point cloud data in the first step is obtained by eliminating gross errors of the point cloud data of the area to be measured, which is obtained by adopting an onboard LiDAR measuring system, and the adopted processing method for eliminating the gross errors is a conventional processing method for eliminating the gross errors.
In the first step, the resolution ratios of a plurality of aerial photographic images are consistent, and the image sizes are the same; the plurality of aerial photographic images and the point cloud data can be acquired synchronously or asynchronously.
When the aerial photogrammetry is carried out on the measured area in step one, the aerial camera parameters (including the initial exterior orientation elements of each aerial photographic image), the ground spatial resolution (GSD) of the aerial photography, the designed flight height, and the reference surface height of the measured area (i.e., the mean ground height of the measured area) are obtained synchronously.
In this embodiment, the number of the ground control points in the first step is not less than 5.
Since the cameras used in aerial photogrammetry are divided into metric cameras and non-metric cameras, images from a non-metric camera contain radial and tangential objective-lens distortion, which must be eliminated before the images are used; distortion correction is therefore carried out on the aerial images (i.e., the aerial photographic images).
Before the image preprocessing in step two, the data processing equipment needs to perform distortion correction on the plurality of aerial photographic images. The distortion correction method employed is a conventional one, and the distortion correction is carried out on the images according to the acquired aerial camera parameter data.
When the image preprocessing is carried out in step two, the plurality of aerial photographic images in step one are denoised and filtered according to a conventional denoising method and a conventional filtering method.
In this embodiment, when the plurality of aerial photographic images in step one are denoised respectively, the aerial photographic images are subjected to Gaussian smoothing because the image noise may be considerable. Furthermore, the Gaussian smoothing is performed using a window of w × w (w ≥ 3).
When filtering the plurality of aerial photographic images in step one, Wallis filtering is performed on the aerial photographic images after Gaussian smoothing.
The Wallis filtering process is a special filter, can enhance the contrast of an original image and suppress noise at the same time, and particularly can greatly enhance image texture modes of different scales in the image, so that the quantity and the precision of point features can be improved when the point features in the image are extracted, and the reliability and the precision of a matching result are improved.
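A possible Wallis filter implementation consistent with this description, driving the local mean and standard deviation of the image towards target values; the window size and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wallis_filter(img, win=31, target_mean=127.0, target_std=50.0, b=0.8, c=0.9):
    """Wallis filtering: push each pixel's local mean/std towards target
    values, enhancing image texture at different scales while limiting noise.
    b is a brightness coefficient and c a contrast expansion constant."""
    img = img.astype(float)
    m = uniform_filter(img, win)                                           # local mean
    s = np.sqrt(np.maximum(uniform_filter(img * img, win) - m * m, 1e-6))  # local std
    gain = c * target_std / (c * s + (1.0 - c) * target_std)
    out = (img - m) * gain + b * target_mean + (1.0 - b) * m
    return np.clip(out, 0.0, 255.0)
```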
In step three, when image matching is carried out, the data processing equipment is adopted and the image pyramid generation module is called to generate image pyramid data of each aerial photographic image; the image pyramid data of each aerial photographic image comprises L layers of pyramid images, wherein L is a positive integer and L ≥ 3.
When the images are matched in step three, image matching is performed layer by layer on the L layers of pyramid images of the plurality of aerial photographic images, and the image matching method adopted for the layer-by-layer matching is a conventional one.
In this embodiment, when image matching is performed in step three, the data processing device is used for matching, and the process is as follows:
step 301, SIFT feature extraction: adopting the data processing equipment to respectively extract SIFT features of the L-layer pyramid images of the aerial photographic images to obtain all feature points of the L-layer pyramid images of the aerial photographic images and SIFT feature descriptors of the feature points;
step 302, matching the image data of the L-th layer pyramid: performing feature matching on the L-th layer pyramid image of the plurality of aerial photographic images;
when the characteristics of the L-th pyramid images of the plurality of aerial photographic images are matched, constructing a kd tree by using the characteristic points of each aerial photographic image and performing bidirectional search with the rest aerial photographic images to obtain mutually matched characteristic points among the L-th pyramid images of the plurality of aerial photographic images; the L-th layer pyramid image is an image to be matched;
when a kd tree is constructed from the feature points of each image to be matched and a bidirectional search is performed against the remaining images to be matched, the feature points of the image to be matched are transferred to the other images to be matched using its initial exterior orientation elements; if a transferred point falls within the photographic range of another image to be matched, a bidirectional search is performed between those two images to be matched, thereby obtaining reliable matching points.
When any two images to be matched are searched in a two-way mode, the two images to be matched are respectively a first image and a second image, and the process is as follows:
step D, first matching, comprising the following steps:
d1, constructing a kd tree: taking the second image as a reference image to be matched, and constructing a kd tree by using SIFT feature descriptors of all feature points of the second image;
step D2, feature matching: performing feature matching on the SIFT feature descriptors of all the feature points of the first image extracted in the second step by using the kd tree constructed in the step D1 and adopting a nearest neighbor algorithm, and finding out all the feature points in the second image, which are matched with the first image, wherein the feature points in the second image, which are matched with the first image, are matching points;
and E, matching for the second time, wherein the method comprises the following steps:
step E1, constructing a kd-tree: taking the first image as a reference image to be matched, and constructing a kd tree by using SIFT feature descriptors of all feature points of the first image;
step E2, feature matching: performing feature matching on all the matching points obtained in step D2 against the kd tree constructed in step E1 using the nearest neighbor algorithm, and finding all feature points in the first image that match the second image; the feature points in the first image that match the second image are reliable matching points;
completing the bidirectional search between each image to be matched and the rest of images to be matched according to the methods in the steps D to E, obtaining reliable matching points between each image to be matched and the rest of images to be matched, and obtaining all reliable matching points of the L-th layer pyramid image;
then, by adopting the data processing equipment and a RANSAC algorithm, gross error elimination is performed on all reliable matching points of the L-th layer pyramid image obtained by matching, yielding all reliable matching points of the L-th layer pyramid image after elimination and ensuring the correctness of the final matching point result;
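The bidirectional search of steps D and E amounts to a mutual nearest-neighbour cross-check on SIFT descriptors; a compact sketch using a kd tree, with the RANSAC screening left to a subsequent step as described above. The function name is an assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def bidirectional_match(desc1, desc2):
    """A pair (i, j) is kept as a reliable matching point only if j is the
    nearest neighbour of i in image 2 AND i is the nearest neighbour of j
    back in image 1. desc1, desc2: (n, 128) SIFT descriptor arrays."""
    _, fwd = cKDTree(desc2).query(desc1)   # first matching: image 1 -> image 2
    _, bwd = cKDTree(desc1).query(desc2)   # second matching: image 2 -> image 1
    return [(i, j) for i, j in enumerate(fwd) if bwd[j] == i]
```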
step 303, matching the (L-1)-th layer pyramid image: transferring all reliable matching points of the L-th layer pyramid image retained after the elimination in step 302 to the (L-1)-th layer pyramid images of the plurality of aerial photographic images for correlation coefficient calculation, and eliminating all reliable matching points whose correlation coefficient is not greater than 0.9, so as to obtain all reliable matching points of the (L-1)-th layer pyramid image after elimination;
step 304, next-layer pyramid image matching: transferring all reliable matching points of the previous layer's pyramid image after elimination to the current layer's pyramid images of the plurality of aerial photographic images for correlation coefficient calculation, and eliminating all reliable matching points whose correlation coefficient is not greater than 0.9, so as to obtain all reliable matching points of the current layer's pyramid image after elimination;
step 305, repeating step 304 one or more times until the image matching of the original image layer (i.e., the layer-0 pyramid image) of the plurality of aerial photographic images is completed, obtaining all reliable matching points of the original images of the plurality of aerial photographic images after elimination;
all the reliable matching points of the original images of the plurality of aerial photographic images retained after elimination are the image connection points matched with each other among the plurality of aerial photographic images in step three.
In this embodiment, L is a positive integer and L < 10.
In actual use, the value of L can be adjusted correspondingly according to specific requirements.
In this embodiment, after the photo coordinates of the plurality of ground control points on each aerial photographic image are obtained in step four, the data processing equipment is further required to adjust the photo coordinates of each ground control point on each aerial photographic image according to the point record file of that ground control point, so as to improve the accuracy of the photo coordinates of each ground control point on each aerial photographic image.
In this embodiment, when the Harris corners are extracted in step 602, the image is divided into rectangular blocks of m × m pixels (0 < m < 100), and Harris corners are extracted within each rectangular block.
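One way to realize such block-wise extraction, using OpenCV's Harris response and keeping the strongest corner per block; the block size m = 64 and the Harris parameters are illustrative choices within the stated 0 < m < 100.

```python
import numpy as np
import cv2

def blockwise_harris(img, m=64, k=0.04):
    """Divide a grayscale image into m x m blocks and keep the strongest
    Harris response in each block, giving an even corner distribution."""
    response = cv2.cornerHarris(img.astype(np.float32), blockSize=2, ksize=3, k=k)
    corners = []
    h, w = img.shape
    for r0 in range(0, h, m):
        for c0 in range(0, w, m):
            block = response[r0:r0 + m, c0:c0 + m]
            r, c = np.unravel_index(np.argmax(block), block.shape)
            if block[r, c] > 0:                 # keep only positive responses
                corners.append((r0 + r, c0 + c))
    return corners
```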
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and all simple modifications, changes and equivalent structural changes made to the above embodiment according to the technical spirit of the present invention still fall within the protection scope of the technical solution of the present invention.

Claims (10)

1. An automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that: the method comprises the following steps:
the method comprises the following steps of firstly, point cloud data and aerial image acquisition and control point field actual measurement: acquiring point cloud data of an area to be measured by adopting an airborne LiDAR measuring system, and transmitting the acquired point cloud data to data processing equipment; the point cloud data comprises a plurality of measuring points of a region to be measured and three-dimensional coordinates of each measuring point;
meanwhile, a plurality of ground control points are distributed in a measured area, and field actual measurement is carried out on the three-dimensional coordinates of each ground control point to obtain the actual measurement three-dimensional coordinates of each ground control point; then, carrying out aerial photogrammetry on the measured area by utilizing the distributed ground control points, shooting a plurality of aerial photographic images of the measured area, and synchronously transmitting the obtained plurality of aerial photographic images to the data processing equipment; each aerial photographic image is a digital image and is a two-dimensional image;
when the aerial photogrammetry is carried out on the measured area, the external orientation element of each aerial photographic image is obtained, and the obtained external orientation element of each aerial photographic image is synchronously transmitted to the data processing equipment; in the step, the exterior orientation elements of the aerial photographic images are the initial exterior orientation elements of the aerial photographic images;
step two, image preprocessing: denoising and filtering the plurality of aerial photographic images in the step one by adopting the data processing equipment to obtain a plurality of preprocessed aerial photographic images;
step three, image matching: adopting the data processing equipment and calling an image matching module to perform image matching on the plurality of aerial photographic images preprocessed in the step two to obtain all feature points matched with each other among the plurality of aerial photographic images; the obtained characteristic points are image connection points which are matched with the plurality of aerial photographic images;
in the step, all the feature points on each acquired aerial photographic image form a feature point set of the aerial photographic image, and the feature point set of each aerial photographic image comprises the image coordinates of all the feature points on the aerial photographic image obtained by matching in the step;
step four, acquiring coordinates of the control point photo: according to the actually measured three-dimensional coordinates of the ground control points in the first step, respectively calculating the photo coordinates of the aerial photographic images preprocessed by the ground control points in the second step by adopting the data processing equipment and calling a photo coordinate calculation module to obtain the photo coordinates of the ground control points on the aerial photographic images; adding the photo coordinates of the plurality of ground control points on each aerial photographic image into the feature point set of the aerial photographic image in the third step to obtain a complete feature point set of each aerial photographic image;
step five, updating the adjustment of the area network and the external orientation elements by the primary beam method: according to the initial exterior orientation element of each aerial photographic image in the step one, the complete characteristic point set of each aerial photographic image in the step four and the measured three-dimensional coordinates of the plurality of ground control points in the step one, carrying out beam method area network adjustment by adopting the data processing equipment and calling a beam method area network adjustment module to obtain the exterior orientation element of each aerial photographic image after adjustment; then calling a data updating module to update the external orientation element of each aerial photographic image into the external orientation element of the aerial photographic image after adjustment obtained at the moment;
in the first step, the measured three-dimensional coordinates of a plurality of ground control points form a control point set;
step six, matching the point cloud data with the aviation image feature points: matching the point cloud data in the step one with any one of the aerial photographic images preprocessed in the step two by adopting the data processing equipment, wherein the aerial photographic image is a reference image; all the rest aerial photographic images except the reference image in the plurality of the aerial photographic images preprocessed in the step two are to-be-processed images;
when the point cloud data and the reference image are matched, the process is as follows:
step 601, constructing a triangulation network: calling a triangulation network construction module to construct a triangulation network according to the point cloud data in the first step; the constructed triangulation network is a point cloud triangulation network;
step 602, extracting a Harris corner: calling a Harris angular point detection module to extract characteristic points of the aerial photographic image, and recording the photo coordinates of the extracted characteristic points; the extracted characteristic points are Harris angular points;
step 603, determining three-dimensional coordinates of an image Harris corner point and a ground control point based on a triangulation network: respectively determining three-dimensional coordinates of all Harris angular points of the aerial photographic image in the step 602 and a plurality of ground control points in the step four according to the point cloud triangulation network in the step 601;
when three-dimensional coordinates of all Harris angular points of the aerial photographic image are determined, the three-dimensional coordinates of all Harris angular points are respectively determined according to the photo coordinates of all Harris angular points on the aerial photographic image, and three-dimensional coordinates of a plurality of reliable Harris angular points are obtained;
when the three-dimensional coordinates of any Harris angular point on the aerial photographic image are determined, the Harris angular point is a current processing point, and the process is as follows:
step A1, effective judgment of processing points: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes in the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering the step A2; otherwise, discarding the current processed point;
TL is a preset triangle side length judgment threshold; TH is a preset triangle top point height difference judgment threshold; the effective processing point is the reliable Harris angular point;
step A2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step A1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step A1 to obtain a three-dimensional coordinate of the current processing point;
step A3, repeating the steps A1 to A2 one or more times, and respectively determining the three-dimensional coordinates of all Harris angular points of the aerial photographic image to obtain the three-dimensional coordinates of a plurality of reliable Harris angular points of the aerial photographic image;
when the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined, the three-dimensional coordinates of the plurality of ground control points in the fourth step are respectively determined according to the photo coordinates of the plurality of ground control points in the fourth step on the aerial photographic image, and the three-dimensional coordinates of the plurality of reliable control points are obtained;
when the three-dimensional coordinates of any one of the ground control points are determined, the ground control point is a current processing point, and the process is as follows:
step B1, judging the validity of the processing point: calling a ground coordinate conversion module to convert to obtain the ground coordinate of the current processing point according to the exterior orientation element of the aerial photographic image and the photo coordinate of the current processing point on the aerial photographic image; then, according to the ground coordinates of the current processing point obtained by conversion, finding out the triangle where the current processing point is located in the point cloud triangulation network constructed in the step 601, and calling a triangle judgment module to judge the found triangle: when the lengths of the three sides of the found triangle are all smaller than TL and the elevation difference between any two vertexes of the three vertexes is smaller than TH, judging that the current processing point is an effective processing point, and entering step B2; otherwise, discarding the current processed point;
in this step, the effective processing point is the reliable control point;
step B2, three-dimensional coordinate determination: calling an elevation coordinate calculation module, interpolating by using the triangle found in the step B1 to obtain an elevation value of the current processing point, and obtaining a ground coordinate of the current processing point by combining the ground coordinate converted in the step B1 to obtain a three-dimensional coordinate of the current processing point;
step B3, repeating the steps B1 to B2 one or more times, and respectively determining the three-dimensional coordinates of the plurality of ground control points in the step four to obtain the three-dimensional coordinates of the plurality of reliable control points;
step 604, calculating a coordinate transformation matrix: calling a control point searching module, and searching the actual measurement three-dimensional coordinates of the reliable control points from the actual measurement three-dimensional coordinates of the ground control points in the step one; then, a coordinate transformation matrix calculation module is called to calculate a coordinate transformation matrix of the three-dimensional coordinates of the reliable control points obtained in the step B3 and the actual measurement three-dimensional coordinates of the reliable control points;
step 605, determining three-dimensional coordinates of Harris corner points of the image based on the coordinate transformation matrix: according to the coordinate transformation matrix calculated in the step 604, calling a coordinate transformation module to respectively perform coordinate transformation on the three-dimensional coordinates of the reliable Harris angular points of the aerial photographic image in the step A3, and calculating to obtain the three-dimensional coordinates of the reliable Harris angular points after coordinate transformation;
step 606, correcting the three-dimensional coordinates of the Harris corner points of the image based on the triangulation network: correcting the three-dimensional coordinates of the reliable Harris corner points obtained in step 605 respectively, according to the point cloud triangulation network constructed in step 601;
when the three-dimensional coordinates of any one reliable Harris corner point obtained in step 605 are corrected, that corner point is the current correction point, and the process is as follows:
step C1, correction judgment: according to the three-dimensional coordinates of the current correction point obtained in step 605, finding the triangle containing the current correction point in the point cloud triangulation network constructed in step 601, and calling the triangle judgment module to judge the found triangle: when the lengths of all three sides of the found triangle are smaller than TL and the elevation difference between any two of its three vertexes is smaller than TH, it is judged that the three-dimensional coordinates of the current correction point need correction, and step C2 is entered; otherwise, no correction is needed and the three-dimensional coordinates of the current correction point remain those obtained in step 605;
step C2, coordinate correction: calling the elevation coordinate calculation module and interpolating within the triangle found in step C1 to obtain the elevation value of the current correction point; the elevation value in the three-dimensional coordinates of the current correction point obtained in step 605 is then replaced with this interpolated value, yielding the corrected three-dimensional coordinates of the current correction point (see the sketch after step C3);
step C3, repeating steps C1 to C2 one or more times to correct the three-dimensional coordinates of each reliable Harris corner point, obtaining the corrected three-dimensional coordinates of the plurality of reliable Harris corner points;
the three-dimensional coordinates of the plurality of reliable Harris corner points corrected in step C3 are the three-dimensional coordinates of the plurality of reliable Harris corner points of the reference image;
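Steps C1 to C3 reduce to one pass over the transformed corner points, reusing triangle_is_valid and interpolate_elevation from the sketch after step B3; find_triangle is a hypothetical TIN lookup that returns the containing triangle, or None when the point falls outside the network.

```python
def correct_corner(point, find_triangle):
    """point: (x, y, z) from step 605; returns the (possibly corrected) point."""
    x, y, z = point
    tri = find_triangle(x, y)
    if tri is not None and triangle_is_valid(tri):  # step C1 test
        z = interpolate_elevation(tri, x, y)        # step C2: replace elevation
    return (x, y, z)                                # else keep step-605 value
```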
step seven, calculating the feature point photo coordinates of the images to be processed: using the data processing equipment to calculate the feature point photo coordinates of each image to be processed in step six respectively, obtaining the photo coordinates of the plurality of corresponding Harris corner points of each image to be processed;
the feature point photo coordinates of all images to be processed are calculated by the same method;
when calculating the feature point photo coordinates of any image to be processed, a photo coordinate calculation module is called to calculate the photo coordinates of each reliable Harris corner point on that image, according to the three-dimensional coordinates of the plurality of reliable Harris corner points of the reference image obtained in step C3 and the current exterior orientation elements of the image to be processed, thereby obtaining the photo coordinates of the plurality of corresponding Harris corner points of the image (the collinearity sketch below illustrates one way to do this);
each corresponding Harris corner point is the image point of one reliable Harris corner point on the image to be processed, and the photo coordinates of each reliable Harris corner point on the image to be processed are the coordinates of its image point on that image;
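The photo coordinate calculation module is not spelled out in the claim; the standard photogrammetric route is the collinearity equations, sketched below. The omega-phi-kappa rotation order, the parameter names, and the focal-length sign convention are assumptions.

```python
import numpy as np

def rotation_opk(omega, phi, kappa):
    """Rotation matrix from exterior orientation angles (omega-phi-kappa order)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def ground_to_photo(P, eo, f):
    """Project ground point P = (X, Y, Z) into the image whose exterior
    orientation is eo = (Xs, Ys, Zs, omega, phi, kappa), focal length f."""
    Xs, Ys, Zs, omega, phi, kappa = eo
    d = rotation_opk(omega, phi, kappa).T @ (np.asarray(P, float) - [Xs, Ys, Zs])
    return -f * d[0] / d[2], -f * d[1] / d[2]   # collinearity equations
```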
step eight, image matching: using the data processing equipment and calling a feature point finding module to find the photo coordinates of the plurality of reliable Harris corner points of step A3 among the photo coordinates of the feature points extracted in step 602; the found photo coordinates are the photo coordinates of the plurality of reliable Harris corner points of the reference image; then, according to these photo coordinates and the photo coordinates of the corresponding Harris corner points of each image to be processed obtained in step seven, the data processing equipment calls an image matching module to perform image matching, obtaining all Harris corner points matched between the reference image and each image to be processed;
all Harris corner points matched between the reference image and each image to be processed are the matching control points;
step nine, set updating: using the data processing equipment, the photo coordinates of all matching control points obtained in step eight on each aerial photographic image are added to the complete feature point set of that image from step four, giving the updated complete feature point set of each aerial photographic image; meanwhile, the three-dimensional coordinates of the plurality of reliable Harris corner points corrected in step six are added to the current control point set, giving the updated control point set;
step ten, updating the area network adjustment and the exterior orientation elements by the beam method (i.e., bundle block adjustment): according to the exterior orientation elements of the aerial photographic images, the complete feature point sets of the aerial photographic images from step nine, and the current control point set, the data processing equipment calls the beam method area network adjustment module to perform the adjustment, obtaining the adjusted exterior orientation elements of each aerial photographic image; a data updating module is then called to update the exterior orientation elements of each aerial photographic image to the adjusted values, obtaining the updated exterior orientation elements of each image and completing one automatic registration pass of the point cloud data and the aerial photographic images;
step eleven, automatic registration end judgment: using the data processing equipment and calling a numerical comparison module, the correction values of the three angle elements in the updated exterior orientation elements of each aerial photographic image from step ten are judged respectively: when the correction values of the three angle elements of every updated image are all smaller than a preset tolerance, the automatic registration of the point cloud data and the plurality of aerial photographic images is judged complete, and the data processing equipment outputs the automatic registration result, namely the updated exterior orientation elements of each aerial photographic image from step ten; otherwise, the data processing equipment judges the number of automatic registrations (a schematic sketch of this loop follows);
when judging the number of automatic registrations, the data processing equipment judges whether the number of automatic registrations completed so far has reached the preset maximum number of registrations: if so, automatic registration is judged to have failed and the data processing equipment outputs the automatic registration result, namely that automatic registration failed; otherwise, the method returns to step six for the next automatic registration of the point cloud data and the aerial photographic images.
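Steps six to eleven form an outer loop that repeats until the angle-element corrections converge or the pass counter reaches the preset maximum (claim 4 suggests Nmax ≥ 3). A schematic sketch, with run_pass standing in for one full pass of steps six to ten and angle_tol for the preset tolerance, which the claims leave open:

```python
def auto_register(run_pass, n_max=3, angle_tol=1e-5):
    """run_pass() performs steps six to ten and returns the corrections to the
    three angle elements of every image from the latest bundle adjustment."""
    for n in range(1, n_max + 1):
        corrections = run_pass()                       # one registration pass
        if all(abs(c) < angle_tol for c in corrections):
            return True, n                             # step eleven: converged
    return False, n_max                                # registration failed
```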
2. The method of claim 1 for automatic registration of airborne LiDAR point cloud data with aerial images, wherein: in step A1, TL = 3 m and TH = 1 m.
3. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: in step eleven, when the point cloud data and the aerial photographic images are automatically registered for the next time, the registration is performed according to the methods of steps six to ten, after which step eleven is entered again to judge whether automatic registration has ended.
4. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: before the image preprocessing in step two, the number of automatic registrations N is initialised by the data processing equipment; at this time N = 0;
after one automatic registration pass of the point cloud data and the aerial photographic images is completed in step ten, the data processing equipment increments N by 1;
the maximum number of registrations preset in step eleven is recorded as Nmax, wherein Nmax is a positive integer and Nmax ≥ 3;
in step eleven, when judging whether the number of automatic registrations completed so far has reached the preset maximum, the data processing equipment compares N with Nmax: when N ≥ Nmax, it is judged that the preset maximum number of registrations has been reached; otherwise, it is judged that it has not.
5. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: in steps five and ten, when performing the beam method area network adjustment, the data processing equipment calls the beam method area network adjustment module to perform POS-assisted beam method area network adjustment.
6. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: when aerial photogrammetry is performed on the measured area in step one, a POS system for aerial photogrammetry is used;
in step one, the exterior orientation elements of the aerial photographic images are all exterior orientation elements obtained by the POS system during the aerial photogrammetry of the measured area.
7. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: before the feature point photo coordinates of the images to be processed are calculated in step seven, corner validity judgment is performed by the data processing equipment on each of the plurality of reliable Harris corner points of the reference image obtained in step C3;
the corner validity of all reliable Harris corner points is judged by the same method;
when corner validity judgment is performed on any reliable Harris corner point of the reference image, a photo coordinate calculation module is called to calculate the photo coordinates of that corner point on the reference image, according to its three-dimensional coordinates obtained in step C3 and the current exterior orientation elements of the reference image; the calculated photo coordinates are recorded as (x, y); the data processing equipment then finds the photo coordinates of that reliable Harris corner point among the photo coordinates of the plurality of reliable Harris corner points of the reference image, and the found photo coordinates are recorded as (x′, y′); a numerical calculation module is then called to compute the photo coordinate deviation of the reliable Harris corner point according to the formula
Δr = √(Δx² + Δy²),
where Δx = x − x′ and Δy = y − y′; a difference comparison module then judges whether Δr is smaller than Δt: when Δr < Δt, the reliable Harris corner point is judged to be an effective corner point; otherwise, it is judged to be an invalid corner point (a sketch of this test follows this claim);
wherein, Δ t is a preset photo coordinate deviation judgment threshold;
removing all invalid corner points in the reliable Harris corner points of the reference image obtained in the step C3 by using the data processing equipment before calculating the feature point photo coordinates of any image to be processed;
seventhly, when the feature point photo coordinates of any image to be processed are calculated, all the reliable Harris angular points of the reference image are the effective angular points;
each Harris angular point is a pixel point of one effective angular point on the image to be processed;
the reliable Harris corner points added to the control point set at this time in step nine are all the effective corner points.
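The validity test of this claim is a simple residual check between the back-projected and the extracted photo coordinates; a minimal sketch, with the projected coordinates obtainable from ground_to_photo in the collinearity sketch after step seven and Δt left as a caller-supplied threshold:

```python
import math

def corner_is_valid(projected_xy, extracted_xy, delta_t):
    """Effective corner iff Δr = sqrt(Δx² + Δy²) < Δt."""
    dx = projected_xy[0] - extracted_xy[0]
    dy = projected_xy[1] - extracted_xy[1]
    return math.hypot(dx, dy) < delta_t
```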
8. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: when the data processing equipment calls the image matching module in step eight, correlation coefficients are calculated between the plurality of reliable Harris corner points of the reference image and the plurality of corresponding Harris corner points of each image to be processed obtained in step seven, and all matching control points between the reference image and each image to be processed are found according to the calculation results;
when the correlation coefficients are calculated between the reliable Harris corner points of the reference image and the corresponding Harris corner points of any one image to be processed, all Harris corner points matched between the reference image and that image are found according to the correlation coefficient results;
after the correlation coefficients have been calculated for every image to be processed, all Harris corner points matched between the reference image and each image to be processed are obtained, and the found Harris corner points are the matching control points (one common realisation is sketched below).
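The claim calls only for "correlation coefficient calculation"; a common concrete form is normalized cross-correlation (NCC) of grey-level windows centred on the corner points, sketched below. The window half-size, the acceptance threshold, and the one-best-match policy are assumptions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized grey-level windows."""
    if a.shape != b.shape or a.size == 0:
        return 0.0                      # rejects windows cut off at the border
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_corners(ref_img, tgt_img, ref_pts, tgt_pts, half=7, thresh=0.8):
    """Return (i, j) index pairs whose windows correlate above thresh."""
    def window(img, pt):
        c, r = int(round(pt[0])), int(round(pt[1]))
        return img[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    matches = []
    for i, p in enumerate(ref_pts):
        scores = [ncc(window(ref_img, p), window(tgt_img, q)) for q in tgt_pts]
        j = int(np.argmax(scores))
        if scores[j] >= thresh:
            matches.append((i, j))
    return matches
```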
9. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: after the elevation coordinate calculation module is called in step A2 and the elevation value of the current processing point is interpolated from the triangle found in step A1, an elevation value correction module is also called to correct the interpolated elevation value (a sketch of this loop follows claim 10), as follows:
step A21, finding the nearest measuring point in the triangulation network: according to the interpolated elevation value of the current processing point and the ground coordinates of the current processing point converted in step A1, finding the triangle containing the current processing point in the point cloud triangulation network constructed in step 601, and finding the measuring point in the found triangle that is closest to the current processing point, the found measuring point being one of the measuring points in the point cloud data; the elevation value of the found measuring point is then taken as the corrected elevation value of the current processing point;
step A22, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step A21 with its elevation value before the correction in step A21; when the absolute value of the difference between the two is smaller than ΔH, the correction is judged complete and the corrected elevation value is taken as the reliable elevation value of the current processing point; otherwise, step A23 is entered for the next correction;
wherein ΔH is a preset elevation difference judgment threshold;
step A23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinates of the current processing point converted in step A1, finding the triangle containing the current processing point in the point cloud triangulation network constructed in step 601, and finding the measuring point in the found triangle that is closest to the current processing point, the found measuring point being one of the measuring points in the point cloud data; the elevation value of the found measuring point is then taken as the corrected elevation value of the current processing point;
step A24, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step A23 with its previously corrected elevation value; when the absolute value of the difference between the two is smaller than 0.001, the correction is judged complete and the elevation value corrected this time is taken as the reliable elevation value of the current processing point; otherwise, returning to step A23 for the next correction;
in step A2, "combining the ground coordinates of the current processing point converted in step A1 to obtain the three-dimensional coordinates of the current processing point" means obtaining the three-dimensional coordinates of the current processing point from the ground coordinates converted in step A1 together with the reliable elevation value of the current processing point.
10. The method of automatic registration of airborne LiDAR point cloud data with aerial images of claim 1 or 2, wherein: after the elevation coordinate calculation module is called in step B2 and the elevation value of the current processing point is interpolated from the triangle found in step B1, an elevation value correction module is also called to correct the interpolated elevation value (the sketch after this claim applies here as well), as follows:
step B21, finding the nearest measuring point in the triangulation network: according to the interpolated elevation value of the current processing point and the ground coordinates of the current processing point converted in step B1, finding the triangle containing the current processing point in the point cloud triangulation network constructed in step 601, and finding the measuring point in the found triangle that is closest to the current processing point, the found measuring point being one of the measuring points in the point cloud data; the elevation value of the found measuring point is then taken as the corrected elevation value of the current processing point;
step B22, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step B21 with its elevation value before the correction in step B21; when the absolute value of the difference between the two is smaller than ΔH, the correction is judged complete and the corrected elevation value is taken as the reliable elevation value of the current processing point; otherwise, step B23 is entered for the next correction;
step B23, finding the nearest measuring point in the triangulation network: according to the corrected elevation value of the current processing point and the ground coordinates of the current processing point converted in step B1, finding the triangle containing the current processing point in the point cloud triangulation network constructed in step 601, and finding the measuring point in the found triangle that is closest to the current processing point, the found measuring point being one of the measuring points in the point cloud data; the elevation value of the found measuring point is then taken as the corrected elevation value of the current processing point;
step B24, elevation value correction end judgment: comparing the elevation value of the current processing point corrected in step B23 with its previously corrected elevation value; when the absolute value of the difference between the two is smaller than 0.001, the correction is judged complete and the elevation value corrected this time is taken as the reliable elevation value of the current processing point; otherwise, returning to step B23 for the next correction;
in step B2, "combining the ground coordinates of the current processing point converted in step B1 to obtain the three-dimensional coordinates of the current processing point" means obtaining the three-dimensional coordinates of the current processing point from the ground coordinates converted in step B1 together with the reliable elevation value of the current processing point.
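Claims 9 and 10 describe the same fixed-point refinement, differing only in which step supplies the initial interpolated elevation. A minimal sketch of the loop, with find_triangle and nearest_vertex_z as hypothetical TIN helpers, delta_h standing for ΔH, and max_iter added as a safety bound not present in the claims:

```python
def refine_elevation(x, y, z0, find_triangle, nearest_vertex_z,
                     delta_h, eps=0.001, max_iter=50):
    """Iterate steps A21/A24 (or B21/B24): snap to the elevation of the nearest
    measured point in the containing triangle until successive values agree,
    first within delta_h (ΔH) and on later passes within eps (0.001)."""
    z_prev, tol = z0, delta_h
    for _ in range(max_iter):
        tri = find_triangle(x, y, z_prev)       # triangle containing the point
        z_new = nearest_vertex_z(tri, x, y)     # nearest measuring point's z
        if abs(z_new - z_prev) < tol:
            return z_new                        # reliable elevation value
        z_prev, tol = z_new, eps
    return z_prev
```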
CN201811651300.0A 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image Active CN109727278B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811651300.0A CN109727278B (en) 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image

Publications (2)

Publication Number Publication Date
CN109727278A CN109727278A (en) 2019-05-07
CN109727278B true CN109727278B (en) 2020-12-18

Family

ID=66298534

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811651300.0A Active CN109727278B (en) 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image

Country Status (1)

Country Link
CN (1) CN109727278B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767460A (en) * 2020-12-31 2021-05-07 武汉大学 Spatial fingerprint image registration element feature description and matching method

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110264502B (en) * 2019-05-17 2021-05-18 华为技术有限公司 Point cloud registration method and device
CN111060910B (en) * 2019-12-11 2023-08-29 西安电子科技大学 InSAR carrier reverse positioning based on topography-image matching
CN112002007B (en) * 2020-08-31 2024-01-19 胡翰 Model acquisition method and device based on air-ground image, equipment and storage medium
CN112381941B (en) * 2021-01-15 2021-03-26 武汉鸿宇飞规划设计技术有限公司 Aviation flight image coordinate correction method
CN112927370B (en) * 2021-02-25 2024-09-27 苍穹数码技术股份有限公司 Three-dimensional building model construction method and device, electronic equipment and storage medium
CN113593023B (en) * 2021-07-14 2024-02-02 中国科学院空天信息创新研究院 Three-dimensional drawing method, device, equipment and storage medium
CN117036622B (en) * 2023-10-08 2024-02-23 海纳云物联科技有限公司 Three-dimensional reconstruction method, device and equipment for fusing aerial image and ground scanning
CN118640878B (en) * 2024-08-16 2024-10-22 南昌航空大学 Topography mapping method based on aviation mapping technology

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102411778A (en) * 2011-07-28 2012-04-11 武汉大学 Automatic registration method of airborne laser point cloud and aerial image
CN103093459A (en) * 2013-01-06 2013-05-08 中国人民解放军信息工程大学 Assisting image matching method by means of airborne lidar point cloud data
CN103106339A (en) * 2013-01-21 2013-05-15 武汉大学 Synchronous aerial image assisting airborne laser point cloud error correction method
CN103744086A (en) * 2013-12-23 2014-04-23 北京建筑大学 High-precision registration method for ground laser radar and close-range photography measurement data
US9466143B1 (en) * 2013-05-03 2016-10-11 Exelis, Inc. Geoaccurate three-dimensional reconstruction via image-based geometry
CN109087339A (en) * 2018-06-13 2018-12-25 武汉朗视软件有限公司 A kind of laser scanning point and Image registration method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7944547B2 (en) * 2006-05-20 2011-05-17 Zheng Wang Method and system of generating 3D images with airborne oblique/vertical imagery, GPS/IMU data, and LIDAR elevation data
CN102930540B (en) * 2012-10-26 2015-06-10 中国地质大学(武汉) Method and system for detecting contour of urban building
CN103017739B (en) * 2012-11-20 2015-04-29 武汉大学 Manufacturing method of true digital ortho map (TDOM) based on light detection and ranging (LiDAR) point cloud and aerial image
US10426372B2 (en) * 2014-07-23 2019-10-01 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
CN104123730B (en) * 2014-07-31 2016-09-14 武汉大学 Remote sensing image based on roadway characteristic and laser point cloud method for registering and system
CN104599272B (en) * 2015-01-22 2018-05-15 中国测绘科学研究院 Towards the airborne LiDAR point cloud and image association method for registering of removable target ball
US20180347978A1 (en) * 2017-06-01 2018-12-06 Michael David SÁNCHEZ System and method of photogrammetry

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Automatic registration of UAV-borne sequent images and LiDAR data; Bisheng Yang et al.; ISPRS Journal of Photogrammetry and Remote Sensing; 2015-03-31; pp. 262-274 *
Registration of digital images and laser point clouds and its application in 3D building modeling (in Chinese); Gu Bin; China Masters' Theses Full-text Database, Information Science and Technology; 2015-02-15; No. 02; p. I138-899 *
Automatic high-precision registration of aerial images and LiDAR data without ground control (in Chinese); Du Quanye; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2015-05-15; No. 05; p. I138-90 *
Accuracy analysis of automatic registration of airborne LiDAR point clouds and aerial images (in Chinese); Jia Jiao; China Masters' Theses Full-text Database, Basic Sciences; 2014-02-15; No. 02; p. A008-139 *

Also Published As

Publication number Publication date
CN109727278A (en) 2019-05-07

Similar Documents

Publication Publication Date Title
CN109727278B (en) Automatic registration method for airborne LiDAR point cloud data and aerial image
CN112102458B (en) Single-lens three-dimensional image reconstruction method based on laser radar point cloud data assistance
CN110969668B (en) Stereo calibration algorithm of long-focus binocular camera
CN114399554B (en) Calibration method and system of multi-camera system
CN105096329B (en) Method for accurately correcting image distortion of ultra-wide-angle camera
WO2021004416A1 (en) Method and apparatus for establishing beacon map on basis of visual beacons
CN104501779A (en) High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
CN110874854B (en) Camera binocular photogrammetry method based on small baseline condition
CN108759788B (en) Unmanned aerial vehicle image positioning and attitude determining method and unmanned aerial vehicle
CN107330927B (en) Airborne visible light image positioning method
CN105809706B (en) A kind of overall calibration method of the more camera systems of distribution
CN112270698B (en) Non-rigid geometric registration method based on nearest curved surface
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN117197333A (en) Space target reconstruction and pose estimation method and system based on multi-view vision
CN113947638B (en) Method for correcting orthographic image of fish-eye camera
WO2024098428A1 (en) Registration method and system
CN113298947A (en) Multi-source data fusion-based three-dimensional modeling method medium and system for transformer substation
CN108801225B (en) Unmanned aerial vehicle oblique image positioning method, system, medium and equipment
CN112929626A (en) Three-dimensional information extraction method based on smartphone image
CN108594255B (en) Laser ranging auxiliary optical image joint adjustment method and system
CN112767461A (en) Automatic registration method for laser point cloud and sequence panoramic image
CN116295279A (en) Unmanned aerial vehicle remote sensing-based building mapping method and unmanned aerial vehicle
CN113450334B (en) Overwater target detection method, electronic equipment and storage medium
CN107784666B (en) Three-dimensional change detection and updating method for terrain and ground features based on three-dimensional images
CN113008206B (en) Aerial triangulation mapping method and device, aircraft and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant