CN113324527B - Co-orbit laser altimetry point and three-line-array stereo image joint surveying and mapping processing method - Google Patents

Co-orbit laser altimetry point and three-line-array stereo image joint surveying and mapping processing method

Info

Publication number
CN113324527B
CN113324527B CN202110589868.XA CN202110589868A
Authority
CN
China
Prior art keywords
image
point
points
rfm
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110589868.XA
Other languages
Chinese (zh)
Other versions
CN113324527A (en)
Inventor
周平
唐新明
王霞
王懿哲
岳明宇
王艺颖
张恒
刘昌儒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ministry Of Natural Resources Land Satellite Remote Sensing Application Center
Original Assignee
Ministry Of Natural Resources Land Satellite Remote Sensing Application Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ministry Of Natural Resources Land Satellite Remote Sensing Application Center filed Critical Ministry Of Natural Resources Land Satellite Remote Sensing Application Center
Priority to CN202110589868.XA priority Critical patent/CN113324527B/en
Publication of CN113324527A publication Critical patent/CN113324527A/en
Application granted granted Critical
Publication of CN113324527B publication Critical patent/CN113324527B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a joint surveying and mapping processing method for co-orbit laser altimetry points and three-line-array stereo images. The laser altimetry points are first used as elevation control points: exploiting the small relative planimetric error between the laser altimetry points and the nadir image acquired on the same orbit, accurate image-space coordinates of laser altimetry points distributed over various terrain types (especially hilly and mountainous terrain) are obtained on the three-line-array stereo images. Then, taking an orbit as the unit, an adjustment block is constructed for the whole-orbit three-line-array stereo images, and a joint block adjustment of the laser altimetry points and the stereo images is carried out. Finally, the imaging geometric model of the stereo images is updated according to the adjustment result. The method makes full use of the co-orbit acquisition of the laser altimetry points and the optical images and of the extremely high elevation accuracy of the laser points, effectively improves the elevation accuracy of three-line-array stereo images under conditions without ground control points, and provides technical support for building global high-accuracy geographic information resources from satellite imagery.

Description

Co-orbit laser altimetry point and three-line-array stereo image joint surveying and mapping processing method
Technical Field
The invention belongs to the technical field of remote sensing and photogrammetry, relates to a joint geometric processing method for laser altimetry data and stereo images, and particularly relates to a joint surveying and mapping processing method for laser altimetry points and three-line-array stereo images acquired on the same orbit.
Background
Stereo images from mapping satellites have become the main data source for producing multi-scale surveying and mapping geographic information products around the world. The biggest difficulty in surveying and mapping applications based on satellite stereo images is guaranteeing the geometric accuracy of the images, especially their elevation accuracy. Although a sufficient amount of high-accuracy ground control data can effectively improve the geometric accuracy of satellite images, global mapping applications are constrained by political, economic and other factors, and control data simply cannot be acquired in many areas. Moreover, reducing or eliminating ground control points is itself an urgent requirement for lowering mapping costs and improving mapping efficiency.
The ZiYuan-3 03 satellite (ZY3-03), launched on 25 July 2020, primarily serves global 1:50,000-scale stereo mapping. The satellite carries three panchromatic linear-array push-broom optical cameras and one multispectral camera with four spectral bands (near-infrared, red, green and blue). The three panchromatic cameras are a forward-view camera tilted forward 22 degrees along the track, a nadir camera pointing approximately perpendicular to the ground, and a backward-view camera tilted backward 22 degrees along the track; in a single imaging pass they acquire a three-line-array stereo image together with a four-band multispectral image. The ground resolution of the nadir camera is 2.1 meters, that of the forward- and backward-view cameras 2.5 meters, and that of the multispectral camera 5.8 meters; the swath width exceeds 51 kilometers. Practical experience with the imagery shows that the uncontrolled elevation RMS error of ZY3-03 three-line-array stereo images is about 6-10 meters and the planimetric RMS error 8-20 meters, so the elevation accuracy of the images cannot meet the accuracy requirement of 1:50,000-scale mapping at global scale. For this reason, ZY3-03 additionally carries a single-beam laser altimeter that synchronously acquires high-accuracy laser altimetry points within the stereo image coverage at a ground observation frequency of 2 Hz while the stereo images are being acquired; these laser altimetry points have no footprint image, and the diameter of the laser spot on the ground is about 50 meters. As a high-accuracy ranging instrument, the laser altimeter has good directivity and high ranging accuracy, so the surface elevation information it obtains is extremely accurate. According to the satellite design specifications and practical application results, the elevation RMS error of ZY3-03 laser altimetry points is better than 0.5 meter in plain areas, better than 0.8 meter in hilly areas and better than 1 meter in mountainous areas. ZY3-03 is the first mapping satellite in the world to carry both an operational laser altimeter and a three-line-array stereo camera.
Using laser altimetry points as elevation control points for the stereo images and carrying out joint surveying and mapping processing to improve the elevation accuracy of the stereo images has, as its core step, obtaining accurate image-space coordinates of the laser altimetry points on the stereo images. If the predicted image position of a laser altimetry point on the stereo image deviates substantially from its actual image position, then in hilly and mountainous terrain (i.e., where the surface relief is large) this is equivalent to introducing an extra elevation error into the laser altimetry point, which can seriously degrade the joint processing result. How to overcome the fact that ZY3-03 laser altimetry points have no footprint image and still locate the laser altimetry points precisely on the stereo images, and on that basis design an effective method for joint geometric processing of the laser altimetry points and the three-line-array stereo images, is therefore the key to greatly improving the elevation accuracy of the satellite stereo images. Solving this problem meets the urgent need to carry out 1:50,000-scale and larger-scale stereo mapping with ZY3-03 (and similar satellite) imagery across the globe (i.e., with sparse or no ground control), and fills the technical gap in joint processing of three-line-array stereo images and laser altimetry points.
Disclosure of Invention
The invention mainly solves the following technical problem: under conditions without ground control (for example, in overseas areas where ground control points cannot be obtained), the geometric accuracy of satellite images, and especially their elevation accuracy, cannot meet high-accuracy mapping requirements. Making full use of the high elevation accuracy of laser altimetry points and of the small relative planimetric error between laser altimetry points and the nadir image acquired on the same orbit, the invention provides a joint surveying and mapping processing method for co-orbit laser altimetry points and three-line-array stereo images, which can greatly improve the elevation accuracy of three-line-array stereo images over various terrain types.
In order to solve the above technical problems, the object of the present invention is achieved by the following technical solutions:
a co-orbit laser altimetry point and three-linear array three-dimensional image combined mapping processing method comprises the following steps:
step 1, taking a satellite imaging orbit as the unit, constructing a block adjustment network for the multi-scene three-line-array stereo images of the same orbit, and adopting a block adjustment model based on the rational function model (RFM) compensated by an image-space affine transformation;
step 2, densely distributing connection points (tie points) inside each three-line-array stereo image, and distributing sufficient common connection points between adjacent three-line-array stereo images;
step 3, exploiting the small relative planimetric error between the laser altimetry points and the nadir image acquired on the same orbit, computing the image point coordinates of the laser altimetry points on the nadir image through the RFM of the nadir image from the longitude, latitude and elevation values of the laser altimetry points;
step 4, matching the forward-view and backward-view images against the nadir image with high accuracy to obtain the conjugate points of the laser altimetry point image points on the forward-view and backward-view images, namely obtaining the image point coordinates of the laser altimetry points on the forward-view and backward-view images;
step 5, aiming at each laser height measurement point image point, taking an image space affine transformation parameter of the image RFM and a ground longitude and latitude coordinate corresponding to the laser height measurement point image point as unknowns, taking an elevation value of the laser height measurement point as a known quantity, and constructing an error equation point by point; aiming at each connection point, taking image space affine transformation parameters of the image RFM and ground three-dimensional coordinates corresponding to the connection points as unknowns, and constructing an error equation point by point;
step 6, forming the error equations into normal equations; carrying out the overall adjustment calculation according to the least-squares principle, and solving the affine transformation parameters of each image RFM, the corrections of the ground three-dimensional coordinates corresponding to the connection points, and the corrections of the ground longitude and latitude coordinates corresponding to the laser altimetry point image points;
step 7, after each adjustment calculation, judging whether the iterative convergence condition is met (namely, the translation parameters in the affine transformation parameters of all image RFMs are smaller than a threshold); if so, the adjustment is finished; otherwise, updating the image RFMs, the ground three-dimensional coordinates corresponding to the connection points and the ground longitude and latitude coordinates corresponding to the laser altimetry point image points with the corrections obtained from the adjustment calculation, repeating steps 5 to 6, carrying out the adjustment iteration again, and continuously updating the affine transformation parameters of the image RFMs, the ground three-dimensional coordinates corresponding to the connection points and the ground longitude and latitude coordinates corresponding to the laser altimetry point image points until the iterative convergence condition is met; finally, outputting the affine transformation parameters of each image RFM;
step 8, performing image-space affine transformation compensation on the RFM by using the affine transformation parameters of each image RFM obtained from the last adjustment iteration, and generating a new RFM for each image.
One or more embodiments of the present invention may have the following advantages over the prior art:
(1) By using the laser altimetry points as elevation control and carrying out block adjustment of the stereo images, the elevation accuracy of the three-line-array stereo images is greatly improved so that it meets the mapping accuracy requirement of the 1:50,000 scale or even larger scales, making high-accuracy mapping possible worldwide without ground control.
(2) The method makes full use of the small relative planimetric error between the laser altimetry points and the nadir image acquired on the same orbit, so that the obtained image point positions of the laser altimetry points on the stereo images are extremely accurate. This effectively avoids large deviations between the measured and actual image positions of the laser altimetry points on the stereo images, which would otherwise degrade the final block adjustment accuracy, and it ensures that the laser altimetry points can be used effectively in hilly and mountainous terrain.
(3) By exploiting the extremely high elevation accuracy of the laser altimetry points, high-accuracy ground elevation control points are replaced in mapping applications, field measurements are avoided, and the working cost is greatly reduced.
Drawings
FIG. 1 is a flow chart of the co-orbit laser altimetry point and three-line-array stereo image joint mapping processing method.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings.
As shown in fig. 1, the co-orbit laser altimetry point and three-line-array stereo image joint mapping processing method includes the following steps:
step 1, taking a satellite imaging orbit as the unit, a block adjustment network is constructed for the multi-scene three-line-array stereo images of the same orbit, and a block adjustment model based on the rational function model (RFM) compensated by an image-space affine transformation is adopted;
the RFM is a mathematical mapping relationship between the pixel coordinates of an image and the geodetic coordinates of a corresponding ground point, which is established by using a rational polynomial, and the basic equation is defined as follows:
rn = P1(Xn, Yn, Zn) / P2(Xn, Yn, Zn)
cn = P3(Xn, Yn, Zn) / P4(Xn, Yn, Zn)        (1)

rn = (r - R0) / Rs,  cn = (c - C0) / Cs
Xn = (X - X0) / Xs,  Yn = (Y - Y0) / Ys,  Zn = (Z - Z0) / Zs

In the formulas, (Xn, Yn, Zn) and (rn, cn) are the normalized coordinates of the ground point coordinates (X, Y, Z) and of the image coordinates (r, c) after translation and scaling, with values in [-1, 1]. The purpose of normalization is to avoid rounding errors caused by large differences in the magnitudes of the parameter values during the calculation, and thereby to improve the stability of the parameter solution. R0, C0 are the translation parameters for image coordinate normalization and X0, Y0, Z0 the translation parameters for ground point coordinate normalization; Rs, Cs are the scale parameters for image coordinate normalization and Xs, Ys, Zs the scale parameters for ground point coordinate normalization. Pi (i = 1, 2, 3, 4) are general polynomials in which the power of each variable Xn, Yn, Zn does not exceed 3 and the sum of the powers of all variables in any term does not exceed 3, of the form:

Pi(Xn, Yn, Zn) = ai0·m0 + ai1·m1 + ... + ai19·m19

where the mj (j = 0, 1, ..., 19) are the 20 monomials of total degree not exceeding 3 (1, Zn, Yn, Xn, ZnYn, ZnXn, YnXn, Zn^2, Yn^2, Xn^2, ZnYnXn, Zn^2Yn, Zn^2Xn, Yn^2Zn, Yn^2Xn, ZnXn^2, YnXn^2, Zn^3, Yn^3, Xn^3), and aij (i = 1, 2, 3, 4; j = 0, 1, ..., 19) are the rational polynomial coefficients;
the error compensation of the image RFM adopts an image space affine transformation supplementary model, and the relationship between the image point coordinates (r, c) and the ground point coordinates (X, Y, Z) described by the RFM in the formula (1) is corrected as follows:
Figure BDA0003088958760000053
where (Δ r, Δ c) is the system error image space compensation value for the image point coordinates (r, c), which is:
Figure BDA0003088958760000054
wherein (a)0,a1,a2,b0,b1,b2) Affine transformation parameters that compensate for RFM system errors are represented.
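To make the model above concrete, the following Python listing gives a minimal sketch of forward RFM evaluation with image-space affine compensation. It is an illustration only, not the patent's implementation: the rpc dictionary keys, the ordering of the 20 cubic monomials and the mapping of the ground coordinates (X, Y, Z) to latitude, longitude and ellipsoidal height are assumptions made for the example.

```python
# Illustrative sketch only; rpc keys and monomial ordering are assumptions.
import numpy as np

def cubic_terms(Xn, Yn, Zn):
    """The 20 monomials of a third-order RPC polynomial (assumed ordering)."""
    return np.array([
        1.0, Zn, Yn, Xn, Zn * Yn, Zn * Xn, Yn * Xn, Zn**2, Yn**2, Xn**2,
        Zn * Yn * Xn, Zn**2 * Yn, Zn**2 * Xn, Yn**2 * Zn, Yn**2 * Xn,
        Zn * Xn**2, Yn * Xn**2, Zn**3, Yn**3, Xn**3,
    ])

def rfm_project(lat, lon, h, rpc, affine=(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)):
    """Ground (lat, lon, h) -> image (row, col) via the RFM, then apply the
    image-space affine compensation (a0, a1, a2, b0, b1, b2)."""
    # Normalize ground coordinates with the translation/scale parameters.
    Xn = (lat - rpc["LAT_OFF"]) / rpc["LAT_SCALE"]
    Yn = (lon - rpc["LON_OFF"]) / rpc["LON_SCALE"]
    Zn = (h - rpc["H_OFF"]) / rpc["H_SCALE"]
    t = cubic_terms(Xn, Yn, Zn)
    rn = np.dot(rpc["NUM_R"], t) / np.dot(rpc["DEN_R"], t)   # P1 / P2
    cn = np.dot(rpc["NUM_C"], t) / np.dot(rpc["DEN_C"], t)   # P3 / P4
    # De-normalize to pixel units.
    r = rn * rpc["R_SCALE"] + rpc["R_OFF"]
    c = cn * rpc["C_SCALE"] + rpc["C_OFF"]
    # With r + dr = RFM(ground), the compensated image coordinate is
    # approximately RFM(ground) - dr, where dr = a0 + a1*r + a2*c (and dc
    # likewise) is evaluated at the projected coordinates.
    a0, a1, a2, b0, b1, b2 = affine
    return r - (a0 + a1 * r + a2 * c), c - (b0 + b1 * r + b2 * c)
```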
Step 2: dense connection points are distributed in each three-linear array stereoscopic image, and sufficient public connection points are distributed between adjacent three-linear array stereoscopic images.
The method further comprises the following steps:
Step 2.1: first, the tie points are matched automatically on the images. Common automatic image matching algorithms include image-space grey-level matching (such as least-squares matching), object-space matching, feature-based bridging matching, pyramid multi-level matching, SIFT, and the like. The connection points inside each three-line-array stereo model in the block should be uniformly distributed and number more than 25; the number of connection points between adjacent three-line-array stereo models should be more than 5; and for 90% of the connection points in the block, the dimension (i.e. the number of images on which the point is measured) should equal the total number of images covering the overlap area.
Step 2.2: when the automatically matched connection points are not uniformly distributed, or their number or dimension is insufficient, additional connection points are measured by manual interpretation/measurement; the manually interpreted/measured connection points should be located on fixed targets that are clearly imaged, have distinct features and strong contrast, and are easy to point at and measure.
Step 3: exploiting the small relative planimetric error between the laser altimetry points and the nadir image acquired on the same orbit, the image point coordinates of the laser altimetry points on the nadir image are computed through the RFM of the nadir image from the longitude, latitude and elevation values of the laser altimetry points.
Because the laser altimetry points and the nadir image are acquired at almost the same time, the types and magnitudes of the external errors they are subject to are essentially the same, and both the pointing angle of the laser altimeter and the main optical axis of the nadir camera are close to perpendicular to the ground, so the external errors affect the planimetric accuracy of the laser altimetry points and of the nadir image in a very similar way. For these reasons, the relative planimetric error between a laser altimetry point and the nadir image of the same orbit is small (necessarily smaller than the ground footprint radius of the laser altimetry point), so the ground longitude, latitude and elevation of the laser altimetry point can be substituted directly into the nadir image RFM (formula (1)) to compute the image point coordinates of the laser altimetry point on the nadir image.
Step 4: the forward-view and backward-view images are matched against the nadir image with high accuracy to obtain the conjugate points of the laser altimetry point image points on the forward-view and backward-view images, i.e. the image point coordinates of the laser altimetry points on the forward-view and backward-view images.
The method further comprises the following steps:
Step 4.1: the scale-invariant feature transform (SIFT) feature is computed at the image point coordinates (integer pixel position) of the laser altimetry point on the nadir image. SIFT is a local feature descriptor widely used in image processing; the description is scale-invariant to a certain extent and is fairly stable under scale change, rigid transformation, illumination change and partial occlusion of an object. The resulting SIFT feature descriptor is stored.
Step 4.2: from the ground longitude, latitude and elevation of the laser altimetry point, the approximate pixel position of the laser altimetry point on the forward-view image is computed with the RFM of the forward-view image, and a window of a certain pixel range centred on this approximate position is taken as the matching search range; the SIFT feature of every pixel inside the search range of the forward-view image is computed and its feature descriptor stored.
Step 4.3: using a fast approximate nearest neighbour search library (FLANN), the nearest neighbour of the SIFT descriptor of the laser altimetry point image point on the nadir image is searched among the descriptors obtained in step 4.2, and the image point coordinates on the forward-view image corresponding to that descriptor are obtained, i.e. the integer-pixel coordinates of the laser altimetry point image point on the forward-view image.
Step 4.4: taking the integer-pixel position of the laser altimetry point image point on the forward-view image as the centre, least-squares matching is performed with the laser altimetry point image point on the nadir image, and the accurate coordinates of the laser altimetry point image point on the forward-view image are obtained.
Step 4.5: steps 4.2 to 4.4 are repeated for the backward-view image to obtain the accurate coordinates of the laser altimetry point image point on the backward-view image. A sketch of this matching flow is given below.
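The matching flow of steps 4.1 to 4.3 can be sketched roughly as follows. This is a simplified illustration that assumes OpenCV (cv2) with SIFT support and 8-bit grayscale image arrays; the keypoint size, search radius and FLANN settings are arbitrary example values, and the sub-pixel least-squares matching refinement of step 4.4 is not shown.

```python
# Illustrative sketch only (steps 4.1-4.3); assumes OpenCV with SIFT support
# and 8-bit grayscale images. Step 4.4 (least-squares refinement) is omitted.
import cv2

def match_laser_point(nadir_img, fwd_img, pt_nadir, pt_fwd_pred, radius=20):
    """
    pt_nadir:    (col, row) of the laser point image point on the nadir image
    pt_fwd_pred: (col, row) predicted on the forward-view image via its RFM
    Returns the integer-pixel (col, row) match on the forward-view image.
    """
    sift = cv2.SIFT_create()

    # Step 4.1: SIFT descriptor at the laser image point on the nadir image.
    kp_n = [cv2.KeyPoint(float(pt_nadir[0]), float(pt_nadir[1]), 16)]
    kp_n, desc_n = sift.compute(nadir_img, kp_n)

    # Step 4.2: descriptors of every pixel inside the search window centred on
    # the RFM-predicted position on the forward-view image.
    cx, cy = int(round(pt_fwd_pred[0])), int(round(pt_fwd_pred[1]))
    kps_f = [cv2.KeyPoint(float(x), float(y), 16)
             for y in range(cy - radius, cy + radius + 1)
             for x in range(cx - radius, cx + radius + 1)]
    kps_f, desc_f = sift.compute(fwd_img, kps_f)

    # Step 4.3: FLANN nearest-neighbour search among the candidate descriptors.
    flann = cv2.FlannBasedMatcher({"algorithm": 1, "trees": 5}, {"checks": 64})
    best = flann.match(desc_n, desc_f)[0]
    return kps_f[best.trainIdx].pt   # refine afterwards by least-squares matching
```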
Step 5, aiming at each laser height measurement point image point, taking an image space affine transformation parameter of the image RFM and a ground longitude and latitude coordinate corresponding to the laser height measurement point image point as unknowns, taking an elevation value of the laser height measurement point as a known quantity, and constructing an error equation point by point; and aiming at each connection point, taking the image space affine transformation parameters of the image RFM and the ground three-dimensional coordinates corresponding to the connection points as unknowns, and constructing an error equation point by point.
The method further comprises the following steps:
Step 5.1: the initial values of the ground three-dimensional coordinates corresponding to the connection points, of the ground longitude and latitude coordinates corresponding to the laser altimetry point image points, and of the image RFM affine transformation parameters are determined.
Direct forward intersection of each connection point through the RFM models gives the ground point coordinates used as the initial ground three-dimensional coordinates of the connection point. Direct forward intersection of the laser altimetry point image points through the RFM models gives the ground longitude and latitude used as the initial ground longitude and latitude coordinates of the laser altimetry point image points.
In this embodiment, the initial values of the affine transformation parameters are set to a0 = a1 = a2 = b0 = b1 = b2 = 0.
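As an illustration of the forward intersection used to initialize the ground coordinates, the sketch below solves for (latitude, longitude, height) of one point by nonlinear least squares over all images observing it. It assumes the rfm_project helper from the earlier listing and SciPy; the choice of a generic Levenberg-Marquardt solver is an assumption of the example, not necessarily how the intersection is computed in practice.

```python
# Illustrative sketch of direct forward intersection through the RFMs;
# rfm_project is the assumed helper from the earlier listing.
from scipy.optimize import least_squares

def forward_intersect(obs, rpcs, x0):
    """
    obs:  list of (row, col) image measurements of one point, one per image
    rpcs: list of the corresponding image RFM parameter sets (same order)
    x0:   initial (lat, lon, h), e.g. from the nadir RFM at a mean height
    Returns the intersected (lat, lon, h) used as an initial value in step 5.1.
    """
    def residuals(x):
        lat, lon, h = x
        res = []
        for (r_obs, c_obs), rpc in zip(obs, rpcs):
            r_pred, c_pred = rfm_project(lat, lon, h, rpc)
            res.extend([r_pred - r_obs, c_pred - c_obs])
        return res

    return least_squares(residuals, x0, method="lm").x
```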
Step 5.2, setting an error equation of the connecting point and the laser height measuring point as follows:
V1 = A·t + B·x1 - l1,  weight P1   (connection points)
V2 = A·t + C·x2 - l2,  weight P2   (laser altimetry points)

In these equations, the first line is the error equation written for a connection point observed on a stereo image, with the affine transformation parameters of the image RFM and the ground three-dimensional coordinates of the connection point as unknowns; the second line is the error equation written for the image point coordinates of a laser altimetry point on a stereo image, with the affine transformation parameters of the image RFM and the ground longitude and latitude coordinates of the laser altimetry point image point as unknowns. V1 = [vR1 vC1]^T and V2 = [vR2 vC2]^T are the residual vectors of the row and column image coordinate observations of the connection points and of the laser altimetry points, respectively; t = [Δa0 Δa1 Δa2 Δb0 Δb1 Δb2]^T is the correction vector of the image-space affine compensation parameters of the image RFM; x1 = [ΔXtie ΔYtie ΔZtie]^T is the correction vector of the ground geodetic coordinates of the connection point; x2 = [ΔXlaser ΔYlaser]^T is the correction vector of the ground longitude and latitude coordinates of the laser altimetry point image point. A is the partial derivative coefficient matrix corresponding to the unknowns t; B and C are the partial derivative coefficient matrices corresponding to the unknowns x1 and x2, respectively. l1 and l2 are the constant terms obtained by substituting the current approximate values into the observation equations, l = [r - r0, c - c0]^T, where (r0, c0) are the row and column coordinates of the image point computed by the RFM from the approximate values of the unknowns. Pi (i = 1, 2) are the weight matrices of the observations; the weight of each observation is usually determined from its prior information, generally taken as 10 times the prior standard deviation of the observation, and is recalculated after each adjustment iteration.
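One way to form the error-equation rows of step 5.2 is sketched below using central-difference partial derivatives instead of an analytic linearization; the step sizes are illustrative assumptions, and rfm_project is again the assumed helper from the first listing.

```python
# Illustrative sketch: numerically linearized error-equation rows for one
# image observation of a connection point or a laser altimetry point.
import numpy as np

def error_equation_rows(r_obs, c_obs, lat, lon, h, rpc, affine, is_laser_point):
    """Returns (A, B, l): partials w.r.t. the six affine parameters t, partials
    w.r.t. the ground unknowns, and the constant term l = observed - computed."""
    def project(aff, geo):
        return np.array(rfm_project(geo[0], geo[1], geo[2], rpc, aff))

    aff = np.asarray(affine, dtype=float)
    geo = np.array([lat, lon, h], dtype=float)
    rc0 = project(aff, geo)

    # Partials w.r.t. the affine compensation parameters (vector t).
    d_aff = 1e-3
    A = np.zeros((2, 6))
    for j in range(6):
        d = np.zeros(6); d[j] = d_aff
        A[:, j] = (project(aff + d, geo) - project(aff - d, geo)) / (2 * d_aff)

    # Partials w.r.t. the ground unknowns: (lat, lon, h) for a connection point,
    # (lat, lon) only for a laser point, whose elevation is held fixed.
    steps = np.array([1e-6, 1e-6, 0.1])       # deg, deg, metres (illustrative)
    n_geo = 2 if is_laser_point else 3
    B = np.zeros((2, n_geo))
    for j in range(n_geo):
        d = np.zeros(3); d[j] = steps[j]
        B[:, j] = (project(aff, geo + d) - project(aff, geo - d)) / (2 * steps[j])

    l = np.array([r_obs, c_obs]) - rc0
    return A, B, l
```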
Step 6: the error equations are formed into normal equations; the overall adjustment is computed according to the least-squares principle, solving the affine transformation parameters of each image RFM, the corrections of the ground three-dimensional coordinates corresponding to the connection points, and the corrections of the ground longitude and latitude coordinates corresponding to the laser altimetry point image points.
The method further comprises the following steps:
Step 6.1: according to the least-squares adjustment principle, the error equations are formed into normal equations, obtained by accumulating the weighted products of the coefficient matrices and constant terms of all error equations; the result is recorded as:

[ N11  N12 ] [ t ]   [ L1 ]
[ N21  N22 ] [ x ] = [ L2 ]        (8)

where x collects the ground-coordinate correction vectors x1 of all connection points and x2 of all laser altimetry point image points.
step 6.2: because the number of the connecting points on the image is large, the unknown numbers to be solved in the equation are excessive, the error equation is transformed to eliminate X, only the correction number of affine transformation parameters in the error equation is solved, and then the ground three-dimensional coordinates corresponding to the connecting points are updated in a front intersection mode again, so that the resolving efficiency is improved, and the formula (8) is changed into the following form:
N·t = G        (9)

where

N = N11 - N12·N22^(-1)·N21,   G = L1 - N12·N22^(-1)·L2

The matrix N22 is block-diagonal, consisting of one small block per point (a 3 × 3 block for a connection point and a 2 × 2 block for a laser altimetry point image point), so N22^(-1) is obtained by inverting each small block individually during the calculation; this step can be processed in multi-thread parallel mode, which greatly reduces the solution time and improves computing efficiency.
Step 6.3: once N22^(-1) has been computed, N and G are obtained from the expressions above. Equation (9) is then solved iteratively with the conjugate gradient method; the iteration ends when the difference between the values of t obtained in two successive solutions is smaller than a set threshold or the number of iterations exceeds a set limit, and the final t, namely the corrections of the affine transformation parameters, is output.
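The elimination and solution of steps 6.2 and 6.3 can be illustrated by the sketch below, which accumulates the Schur complement point by point (each point contributing one small, independently inverted block of N22) and solves the reduced system with SciPy's conjugate gradient routine. The bookkeeping of how the stacked per-point matrices are assembled, and the weights, are simplified assumptions of the example.

```python
# Illustrative sketch of the reduced normal equations N t = G and their
# conjugate-gradient solution; per-point assembly is assumed done elsewhere.
import numpy as np
from scipy.sparse.linalg import cg

def reduced_normal_solve(point_eqs, n_t):
    """
    point_eqs: one tuple (A, B, l, P) per ground point, with the error-equation
               rows of all its image observations stacked; A maps to the full
               vector of affine parameters (6 per image, n_t in total), B to the
               point's own ground unknowns, l is the constant term, P the weights.
    Returns the correction vector t of the affine compensation parameters.
    """
    N = np.zeros((n_t, n_t))
    G = np.zeros(n_t)
    for A, B, l, P in point_eqs:
        N11 = A.T @ P @ A
        N12 = A.T @ P @ B
        N22 = B.T @ P @ B            # small per-point block, inverted on its own
        N22_inv = np.linalg.inv(N22)
        N += N11 - N12 @ N22_inv @ N12.T      # Schur complement accumulation
        G += A.T @ P @ l - N12 @ N22_inv @ (B.T @ P @ l)
    t, _ = cg(N, G, atol=1e-10)
    return t
```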
Step 7: after each adjustment calculation, it is judged whether the translation parameters a0, b0 in the affine transformation parameters of all image RFMs are smaller than a threshold (0.1 pixel in this embodiment, but not limited thereto). If so, the adjustment is finished; otherwise the image RFMs, the ground three-dimensional coordinates corresponding to the connection points and the ground longitude and latitude coordinates corresponding to the laser altimetry point image points are updated with the corrections obtained from the adjustment calculation, steps 5 to 6 are repeated, and the adjustment iteration is carried out again, continuously updating the affine transformation parameters of the RFMs, the ground three-dimensional coordinates of the connection points and the ground longitude and latitude coordinates of the laser altimetry point image points until the iterative convergence condition is met. Finally, the affine transformation parameters of each image RFM are output.
Step 8: using the RFM affine transformation parameters of each image obtained from the last adjustment iteration, image-space affine transformation compensation is applied to the RFM of each image, and a new RFM is generated for each image.
The method further comprises the following steps:
Step 8.1: the ground extent corresponding to the four corner points of the image is computed using the original RFM of the image and its affine transformation compensation parameters, and the maximum and minimum ellipsoidal heights of this area are obtained from the global 1 km resolution DEM (Global 30-arc-second Digital Elevation Model) provided by the U.S. Geological Survey. Virtual control points are then generated: in the elevation direction the height range is divided into layers at regular intervals (to prevent the design matrix from becoming ill-conditioned, the number of elevation layers is generally more than 2; this embodiment uses 10 layers, but is not limited thereto); in the plane, a regular ground grid of a certain size is established (in this embodiment the extent corresponding to the image is divided into a 15 × 15 grid, i.e. 16 × 16 grid points in total, but the invention is not limited thereto); finally, the image coordinates of the control points are calculated using the inverse transformation of the original image RFM and the affine transformation compensation parameters. The control grid and elevation layers are densified to establish independent check points.
Step 8.2: the RFM parameter is solved by using the least square adjustment principle, and firstly, the equation (formula 1) of the RFM is transformed into:
P1(Xn, Yn, Zn) - rn·P2(Xn, Yn, Zn) = 0
P3(Xn, Yn, Zn) - cn·P4(Xn, Yn, Zn) = 0        (10)

The linearized error equation is then obtained as:

V = B·x - l,  W        (11)

where B is the coefficient matrix built from the polynomial terms of the virtual control points, l is the constant vector built from their normalized image coordinates, x = [a1j a2j a3j a4j]^T (j = 0, 1, ..., 19) is the vector of rational polynomial coefficients to be solved, and W is the weight matrix.
According to the least squares adjustment principle, one can solve:
x = (B^T B)^(-1) B^T l        (12)
through the deformed RFM model form, the error equation of the adjustment is a linear model, so that an initial value is not needed in the process of solving the RFM parameters.
Step 8.3: the image coordinates corresponding to the check points are computed with the solved RPC parameters, and the accuracy of the solved RPC parameters is evaluated from the differences between these values and the check point image coordinates computed from the original image RFM with the affine transformation compensation parameters. The new RPC parameters are then output.
Although the embodiments of the present invention have been described above, the above descriptions are only for the convenience of understanding the present invention, and are not intended to limit the present invention. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (5)

1. A co-orbit laser altimetry point and three-linear array stereoscopic image combined mapping processing method is characterized by comprising the following steps:
step 1, taking a satellite imaging photography orbit as a unit, constructing an adjustment area network for a same-orbit multi-scene three-line array stereo image, and adopting an area network adjustment model based on an image space affine transformation compensation rational function model RFM;
step 2, arranging connection points inside each three-linear array stereoscopic image, and arranging common connection points between adjacent three-linear array stereoscopic images;
step 3, calculating the image point coordinates of the laser altimetry points on the nadir image acquired on the same orbit through the RFM of the nadir image, according to the longitude and latitude coordinates and elevation values of the laser altimetry points;
step 4, matching the forward-view and backward-view images against the nadir image with high accuracy to obtain the conjugate points of the laser altimetry point image points on the forward-view and backward-view images, namely obtaining the image point coordinates of the laser altimetry points on the forward-view and backward-view images;
step 5, aiming at each laser height measurement point image point, taking an image space affine transformation parameter of the image RFM and a ground longitude and latitude coordinate corresponding to the laser height measurement point image point as unknowns, taking an elevation value of the laser height measurement point as a known quantity, and constructing an error equation point by point; aiming at each connection point, taking image space affine transformation parameters of the image RFM and ground three-dimensional coordinates corresponding to the connection points as unknowns, and constructing an error equation point by point;
step 6, carrying out a normalization on the error equation to obtain a normal equation; adopting a least square principle to carry out integral adjustment calculation, and calculating affine transformation parameters of each image RFM, ground three-dimensional coordinate correction numbers corresponding to the connection points and ground longitude and latitude coordinate correction numbers corresponding to the laser height measurement image points;
7, judging whether the adjustment meets a convergence condition or not every time the adjustment calculation is finished, and if so, finishing the adjustment; otherwise, updating the image RFM, the ground three-dimensional coordinates corresponding to the connecting points and the ground longitude and latitude coordinates corresponding to the laser height measurement image points by using the correction obtained by adjustment calculation, repeating the step 5 to the step 6, and performing adjustment iterative calculation again to continuously update the affine transformation parameters of the image RFM, the ground three-dimensional coordinates corresponding to the connecting points and the ground longitude and latitude coordinates corresponding to the laser height measurement image points until the convergence condition is met; finally, outputting affine transformation parameters of each image RFM;
step 8, performing image space affine transformation compensation on the RFM model by using affine transformation parameters of each image RFM obtained by the last adjustment iterative computation to generate a new RFM for each image;
in the step 4, matching the laser altimetry point image point on the nadir image to its conjugate points on the forward-view and backward-view images comprises the following steps:
step 4.1, computing the scale-invariant feature transform (SIFT) feature at the image point coordinates of the laser altimetry point on the nadir image, and storing the feature descriptor;
step 4.2, from the ground longitude, latitude and elevation of the laser altimetry point, computing the approximate image point position of the laser altimetry point on the forward-view image with the RFM of the forward-view image, and taking a window of a certain pixel range centred on this approximate position as the matching search range; computing the SIFT feature of every pixel within the search range of the forward-view image and storing its feature descriptor;
step 4.3, using a fast approximate nearest neighbour search library, searching among the feature descriptors obtained in step 4.2 for the nearest neighbour of the SIFT feature descriptor of the laser altimetry point image point on the nadir image, and obtaining the image point coordinates on the forward-view image corresponding to that descriptor, namely the integer-pixel coordinates of the laser altimetry point image point on the forward-view image;
step 4.4, taking the integer-pixel coordinate position of the laser altimetry point image point on the forward-view image as the centre, performing least-squares matching with the laser altimetry point image point on the nadir image, and obtaining the accurate coordinates of the laser altimetry point image point on the forward-view image;
step 4.5, repeating steps 4.2 to 4.4 to obtain the accurate coordinates of the laser altimetry point image point on the backward-view image.
2. The method according to claim 1, wherein in step 1, RFM means a rational polynomial is used to establish a mathematical mapping relationship between the pixel coordinates of the image and the geodetic coordinates of the corresponding ground point, and the basic equation is defined as follows:
rn = P1(Xn, Yn, Zn) / P2(Xn, Yn, Zn)
cn = P3(Xn, Yn, Zn) / P4(Xn, Yn, Zn)        (1)

rn = (r - R0) / Rs,  cn = (c - C0) / Cs
Xn = (X - X0) / Xs,  Yn = (Y - Y0) / Ys,  Zn = (Z - Z0) / Zs

in the formulas, (Xn, Yn, Zn) and (rn, cn) are the normalized coordinates of the ground point coordinates (X, Y, Z) and of the image coordinates (r, c) after translation and scaling, with values in [-1, 1]; R0, C0 are the translation parameters for image coordinate normalization and X0, Y0, Z0 the translation parameters for ground point coordinate normalization; Rs, Cs are the scale parameters for image coordinate normalization and Xs, Ys, Zs the scale parameters for ground point coordinate normalization; Pi (i = 1, 2, 3, 4) are general polynomials in which the power of each variable Xn, Yn, Zn does not exceed three and the sum of the powers of all variables in any term does not exceed three, of the form:

Pi(Xn, Yn, Zn) = ai0·m0 + ai1·m1 + ... + ai19·m19

where the mj (j = 0, 1, ..., 19) are the 20 monomials of total degree not exceeding three (1, Zn, Yn, Xn, ZnYn, ZnXn, YnXn, Zn^2, Yn^2, Xn^2, ZnYnXn, Zn^2Yn, Zn^2Xn, Yn^2Zn, Yn^2Xn, ZnXn^2, YnXn^2, Zn^3, Yn^3, Xn^3), and aij (i = 1, 2, 3, 4; j = 0, 1, ..., 19) are the rational polynomial coefficients;
the error compensation of the image RFM adopts an image space affine transformation supplementary model, and the relationship between the image point coordinates (r, c) and the ground point coordinates (X, Y, Z) described by the RFM in the formula (1) is corrected as follows:
r + Δr = Rs · P1(Xn, Yn, Zn) / P2(Xn, Yn, Zn) + R0
c + Δc = Cs · P3(Xn, Yn, Zn) / P4(Xn, Yn, Zn) + C0

where (Δr, Δc) is the image-space compensation value for the systematic error of the image point coordinates (r, c), namely:

Δr = a0 + a1·r + a2·c
Δc = b0 + b1·r + b2·c

in which (a0, a1, a2, b0, b1, b2) are the affine transformation parameters that compensate the RFM systematic errors.
3. The co-orbit laser altimetry point and three-line-array stereo image joint mapping processing method of claim 1, wherein in the step 3 the ground longitude and latitude coordinates and the elevation value of the laser altimetry point are substituted into the RFM (formula (1)) of the nadir image acquired on the same orbit to calculate the image point coordinates of the laser altimetry point on the nadir image.
4. The co-orbit laser altimetry point and three-line-array stereo image joint mapping processing method of claim 1, wherein in the step 5 the error equations of the connection points and of the laser altimetry points are as follows:

V1 = A·t + B·x1 - l1,  weight P1   (connection points)
V2 = A·t + C·x2 - l2,  weight P2   (laser altimetry points)

in these equations, the first line is the error equation written for a connection point observed on a stereo image, with the affine transformation parameters of the image RFM and the ground three-dimensional coordinates of the connection point as unknowns; the second line is the error equation written for the image point coordinates of a laser altimetry point on a stereo image, with the affine transformation parameters of the image RFM and the ground longitude and latitude coordinates of the laser altimetry point as unknowns; V1 = [vR1 vC1]^T and V2 = [vR2 vC2]^T are the residual vectors of the row and column image coordinate observations of the connection points and of the laser altimetry points, respectively; t = [Δa0 Δa1 Δa2 Δb0 Δb1 Δb2]^T is the correction vector of the image-space affine compensation parameters of the image RFM; x1 = [ΔXtie ΔYtie ΔZtie]^T is the correction vector of the ground geodetic coordinates of the connection point; x2 = [ΔXlaser ΔYlaser]^T is the correction vector of the ground longitude and latitude of the laser altimetry point; A is the partial derivative coefficient matrix corresponding to the unknowns t; B and C are the partial derivative coefficient matrices corresponding to the unknowns x1 and x2, respectively; l1 and l2 are the constant terms obtained by substituting the approximate values into the observation equations, l = [r - r0, c - c0]^T, where (r0, c0) are the row and column coordinates of the image point computed by the RFM from the approximate values of the unknowns; and Pi (i = 1, 2) are the weight matrices of the observations, the weight of each observation usually being determined from its prior information, generally taken as 10 times the prior standard deviation of the observation, and recalculated after each adjustment iteration.
5. The co-orbit laser altimetry point and three-line-array stereo image joint mapping processing method according to claim 1, wherein the step 6 specifically comprises the following steps:
step 6.1, according to the least-squares adjustment principle, forming the error equations into normal equations, obtained by accumulating the weighted products of the coefficient matrices and constant terms of all error equations, and recorded as:

[ N11  N12 ] [ t ]   [ L1 ]
[ N21  N22 ] [ x ] = [ L2 ]        (8)

where x collects the ground-coordinate correction vectors x1 of all connection points and x2 of all laser altimetry point image points;
step 6.2: because the number of the connecting points on the image is large, the unknown numbers to be solved in the equation are excessive, the error equation is transformed to eliminate X, only the correction number of affine transformation parameters in the error equation is solved, and then the ground three-dimensional coordinates corresponding to the connecting points are updated in a front intersection mode again, so that the resolving efficiency is improved, and the formula (8) is changed into the following form:
N·t = G        (9)

where N = N11 - N12·N22^(-1)·N21 and G = L1 - N12·N22^(-1)·L2; the matrix N22 is block-diagonal, consisting of one small block per point, so N22^(-1) is obtained by inverting each small block individually during the calculation, and this step can be processed in multi-thread parallel mode, greatly reducing the solution time and improving computing efficiency;
step 6.3, once N22^(-1) has been computed, obtaining N and G from the expressions above, and solving equation (9) iteratively with the conjugate gradient method, the iteration ending when the difference between the values of t obtained in two successive solutions is smaller than a set threshold or the number of iterations exceeds a set limit, and outputting the final t, namely the corrections of the affine transformation parameters.
CN202110589868.XA 2021-05-28 2021-05-28 Co-rail laser height measurement point and three-linear array three-dimensional image combined surveying and mapping processing method Active CN113324527B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110589868.XA CN113324527B (en) 2021-05-28 2021-05-28 Co-rail laser height measurement point and three-linear array three-dimensional image combined surveying and mapping processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110589868.XA CN113324527B (en) 2021-05-28 2021-05-28 Co-rail laser height measurement point and three-linear array three-dimensional image combined surveying and mapping processing method

Publications (2)

Publication Number Publication Date
CN113324527A (en) 2021-08-31
CN113324527B (en) 2021-11-26

Family

ID=77421943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110589868.XA Active CN113324527B (en) 2021-05-28 2021-05-28 Co-rail laser height measurement point and three-linear array three-dimensional image combined surveying and mapping processing method

Country Status (1)

Country Link
CN (1) CN113324527B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113920046B (en) * 2021-09-30 2023-07-18 中国人民解放军战略支援部队信息工程大学 Multi-fragment satellite image stitching and geometric model construction method

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007021426A1 (en) * 2005-08-18 2007-02-22 Itt Manufacturing Enterprises, Inc. Multi-sensors and differential absorption lidar data fusion
DE102009001875A1 (en) * 2009-03-26 2010-09-30 Robert Bosch Gmbh Self-leveling multi-line 360 ° laser device
KR101003412B1 (en) * 2010-08-31 2010-12-23 (주)동광지엔티 Apparatus and method for airborne laser surveying using detection dilution critical value of precision
CN106960174A (en) * 2017-02-06 2017-07-18 中国测绘科学研究院 High score image laser radar vertical control point is extracted and its assisted location method
CN107504981A (en) * 2017-07-25 2017-12-22 国家测绘地理信息局卫星测绘应用中心 A kind of attitude of satellite error correcting method and equipment based on laser-measured height data
EP3355486A1 (en) * 2017-01-30 2018-08-01 Space Systems/Loral, LLC Adaptive communication system
WO2019041161A1 (en) * 2017-08-30 2019-03-07 深圳玩智商科技有限公司 Laser module optical path automatic adjusting device and method
CN110487241A (en) * 2019-08-15 2019-11-22 中国测绘科学研究院 Laser satellite surveys high extraction building area vertical control point method
EP3608633A1 (en) * 2018-08-07 2020-02-12 General Electric Company System and method for guiding a vehicle along a travel path
CN111156960A (en) * 2019-12-28 2020-05-15 同济大学 Satellite laser elevation control point screening method suitable for unstable ground surface area
CN111505608A (en) * 2020-05-06 2020-08-07 自然资源部国土卫星遥感应用中心 Laser pointing on-orbit calibration method based on satellite-borne laser single-chip footprint image
CN112099032A (en) * 2020-09-07 2020-12-18 自然资源部国土卫星遥感应用中心 Ice crack morphology analysis method and device based on laser height measurement satellite data

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7228230B2 (en) * 2004-11-12 2007-06-05 Mitsubishi Denki Kabushiki Kaisha System for autonomous vehicle navigation with carrier phase DGPS and laser-scanner augmentation
CN102735216B (en) * 2011-04-08 2016-01-27 中国科学院国家天文台 CCD stereoscopic camera three-line imagery data adjustment processing method
CN104931022B (en) * 2015-04-21 2018-03-16 国家测绘地理信息局卫星测绘应用中心 Satellite image stereoblock adjustment method based on spaceborne laser altimeter system data
CN111492403A (en) * 2017-10-19 2020-08-04 迪普迈普有限公司 Lidar to camera calibration for generating high definition maps
CN108226982B (en) * 2017-12-25 2020-12-25 航天天绘科技有限公司 Single linear array satellite laser combined high-precision positioning processing method
CN109977344B (en) * 2019-03-20 2022-11-25 武汉大学 Area network adjustment method for satellite-borne noctilucent remote sensing image
CN111174753B (en) * 2019-12-28 2021-05-11 同济大学 Optical image and laser height measurement data adjustment method based on rational function model

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007021426A1 (en) * 2005-08-18 2007-02-22 Itt Manufacturing Enterprises, Inc. Multi-sensors and differential absorption lidar data fusion
DE102009001875A1 (en) * 2009-03-26 2010-09-30 Robert Bosch Gmbh Self-leveling multi-line 360 ° laser device
KR101003412B1 (en) * 2010-08-31 2010-12-23 (주)동광지엔티 Apparatus and method for airborne laser surveying using detection dilution critical value of precision
EP3355486A1 (en) * 2017-01-30 2018-08-01 Space Systems/Loral, LLC Adaptive communication system
CN106960174A (en) * 2017-02-06 2017-07-18 中国测绘科学研究院 High score image laser radar vertical control point is extracted and its assisted location method
CN107504981A (en) * 2017-07-25 2017-12-22 国家测绘地理信息局卫星测绘应用中心 A kind of attitude of satellite error correcting method and equipment based on laser-measured height data
WO2019041161A1 (en) * 2017-08-30 2019-03-07 深圳玩智商科技有限公司 Laser module optical path automatic adjusting device and method
EP3608633A1 (en) * 2018-08-07 2020-02-12 General Electric Company System and method for guiding a vehicle along a travel path
CN110487241A (en) * 2019-08-15 2019-11-22 中国测绘科学研究院 Laser satellite surveys high extraction building area vertical control point method
CN111156960A (en) * 2019-12-28 2020-05-15 同济大学 Satellite laser elevation control point screening method suitable for unstable ground surface area
CN111505608A (en) * 2020-05-06 2020-08-07 自然资源部国土卫星遥感应用中心 Laser pointing on-orbit calibration method based on satellite-borne laser single-chip footprint image
CN112099032A (en) * 2020-09-07 2020-12-18 自然资源部国土卫星遥感应用中心 Ice crack morphology analysis method and device based on laser height measurement satellite data

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Xie, Junfeng et al. Composite geolocating of ZY-3-02 laser altimetry data and optical satellite stereo imagery. IEEE International Geoscience and Remote Sensing Symposium, 2019. *
Zhang, Guo et al. Integrating Stereo Images and Laser Altimeter Data of the ZY3-02 Satellite for Improved Earth Topographic Modeling. Remote Sensing, 2019, Vol. 11, No. 20. *
曹海翊 et al. Research on the implementation approach of the "GaoFen-7" high-accuracy optical stereo mapping satellite. Spacecraft Recovery & Remote Sensing, 2020, Vol. 41, No. 2. *
权学烽 et al. Applications and prospects of satellite laser altimetry data in polar regions. Geomatics & Spatial Information Technology, 2019, Vol. 42, No. 10. *
王晋 et al. ICESat laser altimetry point-assisted block adjustment of Mapping Satellite-1 stereo imagery. Acta Geodaetica et Cartographica Sinica, 2018, Vol. 47, No. 3. *

Also Published As

Publication number Publication date
CN113324527A (en) 2021-08-31

Similar Documents

Publication Publication Date Title
KR101965965B1 (en) A method of automatic geometric correction of digital elevation model made from satellite images and provided rpc
CN110388898B (en) Multisource multiple coverage remote sensing image adjustment method for constructing virtual control point constraint
Li et al. Rigorous photogrammetric processing of HiRISE stereo imagery for Mars topographic mapping
CN107705329B (en) High-resolution optical satellite staring image registration method based on geometric constraint
Wang et al. Geometric accuracy validation for ZY-3 satellite imagery
CN107341778B (en) SAR image orthorectification method based on satellite control point library and DEM
Yang et al. Large-scale block adjustment without use of ground control points based on the compensation of geometric calibration for ZY-3 images
CN113358091B (en) Method for producing digital elevation model DEM (digital elevation model) by using three-linear array three-dimensional satellite image
CN111003214B (en) Attitude and orbit refinement method for domestic land observation satellite based on cloud control
Zhang et al. Block adjustment for satellite imagery based on the strip constraint
CN113538595B (en) Method for improving geometric precision of remote sensing stereo image by using laser height measurement data in auxiliary manner
CN113514829A (en) InSAR-oriented initial DSM block adjustment method
Fraser et al. Precise georefrencing of long strips of ALOS imagery
CN113324527B (en) Co-rail laser height measurement point and three-linear array three-dimensional image combined surveying and mapping processing method
CN111508028A (en) Autonomous in-orbit geometric calibration method and system for optical stereo mapping satellite camera
CN109029379B (en) High-precision small-base-height-ratio three-dimensional mapping method
Yan et al. Topographic reconstruction of the “Tianwen-1” landing area on the Mars using high resolution imaging camera images
Hu et al. Planetary3D: A photogrammetric tool for 3D topographic mapping of planetary bodies
Dong et al. A novel RPC bias model for improving the positioning accuracy of satellite images
Chen et al. Large-Scale Block Bundle Adjustment of LROC NAC Images for Lunar South Pole Mapping Based on Topographic Constraint
KR102015817B1 (en) A method of automatic correction of provided rpc of stereo satellite images
Pi et al. Large-scale planar block adjustment of GaoFen1 WFV images covering most of mainland China
CN113514035B (en) Image block adjustment method constrained by global digital elevation model
CN116310901A (en) Debris flow material source dynamic migration identification method based on low-altitude remote sensing
CN116045920A (en) Method for extracting DEM based on GF7 stereopair

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant