CN102538764A - Combined type image pair three-dimensional location method - Google Patents


Info

Publication number
CN102538764A
Authority
CN
China
Prior art keywords
model
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011104487075A
Other languages
Chinese (zh)
Inventor
邢帅
徐青
孙伟
蓝朝桢
李建胜
何钰
周杨
郭海涛
马东洋
靳国旺
施群山
王栋
候一凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PLA Information Engineering University
Original Assignee
PLA Information Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PLA Information Engineering University
Priority to CN2011104487075A
Publication of CN102538764A
Legal status: Pending


Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to a combined-type image-pair three-dimensional location method comprising the following steps. Step 1: establish, for each type of sensor, the imaging model corresponding to its imaging method. Step 2: select two already-acquired images of the same or different types as a stereo image pair, and compute the model parameters of each image's imaging model from known control conditions, yielding known-parameter imaging models for both images of the pair. Step 3: select a target point, substitute its two sets of image coordinates on the pair into the corresponding known-parameter imaging models from Step 2, and jointly solve for the geodetic coordinates of the target point. Step 4: repeat Step 3 to solve the geodetic coordinates over the whole image area, thereby achieving location. The combined-type image-pair three-dimensional location method guarantees that positional information of key targets in a region of interest can be obtained accurately even under special or extreme conditions.

Description

A combined-type image-pair stereo localization method
Technical field
The invention belongs to the field of analytical aerial triangulation within photogrammetry and remote sensing, and in particular relates to a "combined-type" image-pair stereo localization method based on the imaging models of multiple sensors.
Background technology
Stereo localization, i.e. stereophotogrammetry, is a technique that determines the shape, size, spatial position and attributes of targets through observation and measurement of a stereo image pair. A stereo pair is a pair of photographs with overlapping coverage taken from different camera stations. In conventional photogrammetry and remote sensing, a stereo pair is assumed to be acquired by the same sensor; in fact, any two photographs, whether or not they come from the same sensor, can form a stereo pair as long as they are "acquired from different camera stations" and "share a certain image overlap". It is on this basis that the "combined-type" image pair of the present application realizes three-dimensional location.
Depending on the sensor type, the images that can be acquired include SAR images and optical images. SAR (Synthetic Aperture Radar) is a radar system that uses a moving radar carried on a satellite or aircraft to attain the accuracy of a large antenna. It measures by transmitting radar waves toward the ground and receiving the reflected waves, and has certain advantages over optical measurement, such as being unaffected by weather and time of day.
Scholars abroad proposed the idea of joint stereo mapping from SAR and optical images as early as the 1970s, but no mature results exist so far: positioning accuracy has been low, or the methods are suited only to images of the same type.
Relevant terminology. Aerial triangulation is the measurement method in stereophotogrammetry that densifies control indoors from a small number of field control points to obtain the elevations and planimetric positions of pass points. Analytical aerial triangulation, which arose with the development of computers, uses rigorous mathematical formulas and the least-squares principle to compute, on a digital computer, the planimetric coordinates and elevations of the points to be located from the image point coordinates measured on remote sensing photographs and a small number of ground control points.
Examples of spaceborne SAR systems include, for the United States: Seasat-1, SIR-A, SIR-B, SIR-C, LACROSSE SAR, LightSAR and Medsat SAR; for Europe: ERS-1, ERS-2, X-SAR and ASAR; for Canada: Radarsat-1 and Radarsat-2.
A DEM (Digital Elevation Model) is a data set of the planimetric coordinates (X, Y) and elevations (Z) of regular grid points over a certain area. It mainly describes the spatial distribution of the regional terrain, and is formed by data acquisition (sampling and measurement) from contour lines or similar stereo models, followed by interpolation.
Measurement adjustment: a computational method that uses the least-squares principle to distribute observation errors rationally and to evaluate the precision of the results. Its purpose is to eliminate the contradictions among the observations, so as to obtain the most reliable result together with an assessment of its precision.
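The least-squares principle behind measurement adjustment can be illustrated with a toy computation. The following is a minimal numpy sketch, not part of the patent; the observations and weights are invented for illustration:

```python
import numpy as np

# Toy adjustment: three contradictory observations of two unknowns,
# reconciled by the weighted least-squares normal equations
#   (A^T P A) x = A^T P l.
A = np.array([[1.0, 0.0],    # observation 1 measures x
              [0.0, 1.0],    # observation 2 measures y
              [1.0, 1.0]])   # observation 3 measures x + y
l = np.array([2.0, 3.0, 5.1])     # observed values (inconsistent by 0.1)
P = np.diag([1.0, 1.0, 0.5])      # weight matrix: observation 3 trusted less

x_hat = np.linalg.solve(A.T @ P @ A, A.T @ P @ l)
v = A @ x_hat - l                 # residuals absorb the contradiction
```

The solution x_hat = (2.025, 3.025) distributes the 0.1 discrepancy of the third observation among all three observations in proportion to the weights, which is exactly the "rational distribution of observation errors" described above.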
Summary of the invention
The purpose of the "combined-type" image-pair stereo localization method of the present invention is to provide a method that forms stereo pairs from remote sensing images of the same area, of the same or different types, acquired by multiple sensors at different times and positions, and computes the spatial coordinates of ground targets from them. When cloud cover, atmospheric pollution and similar causes degrade the quality of the optical imagery that can be acquired, and SAR stereo mapping is also difficult to carry out, the invention realizes positioning with a "combined-type" image pair formed from images acquired by different sensors.
To achieve this aim, the scheme of the invention is a combined-type image-pair stereo localization method with the following steps:
Step 1: according to the imaging modes of the various sensors, establish the imaging model corresponding to each sensor.
Step 2: select two already-acquired images of the same or different types as a stereo image pair; select from Step 1 the imaging model corresponding to each image of the pair; and compute, from known control conditions, the model parameters of each image's imaging model, thereby obtaining known-parameter imaging models for the pair.
Step 3: select a target point, substitute its two sets of image coordinates on the pair into the corresponding known-parameter imaging models of Step 2, and jointly solve for the geodetic coordinates of the target point.
Step 4: repeat Step 3 to solve the geodetic coordinates over the whole image area, thereby achieving location.
The imaging models of Step 1 comprise: the frame central projection model for optical frame imagery; the line central projection model for optical linear-array CCD imagery; the F.Leberl model for SAR imagery; and the rational function model, applicable to all imagery. In Step 3, the geodetic coordinates of the target point are computed by joint adjustment.
The present invention is a "combined-type" image-pair stereo localization method based on the imaging models of multiple sensors. In Step 3 of the scheme, since the images can be combined pairwise in many ways, the two known-parameter imaging models of the pair are combined into one joint imaging model of the combined-type pair. Because the model parameters have already been obtained from the control conditions, substituting the two sets of image coordinates of the same target point on the different images allows the corresponding geodetic coordinates to be solved by photogrammetric adjustment. By progressively increasing the number of target points, stereo localization of the target area is finally realized. The whole process is carried out on a computer platform.
The "combined-type" image-pair stereo localization method of the present invention can form and position stereo pairs from remote sensing images acquired by different types of sensors, and thus has wider applicability than traditional stereo localization methods. Building the "combined-type" stereo location model on the rigorous or approximately rigorous imaging models of the various sensors makes more efficient use of the available remote sensing image data, and guarantees accurate acquisition of the spatial information of key targets in a region of interest even when conventional stereo pair data are incomplete. More importantly, it guarantees accurate acquisition of the positional information of key targets in a region of interest under special or extreme conditions. In particular, when channels for acquiring remote sensing image data are blocked, "combined-type" stereo pairs can be formed from combinations of new and old images, of domestic and foreign images, or of images of different types. Under certain conditions, "combined-type" stereo mapping can even be carried out, enabling rapid updating of the topographic data of the region of interest.
Description of drawings
Fig. 1 is a schematic diagram of "combined-type" image-pair stereo localization with an optical sensor and a SAR sensor;
Fig. 2 is a flow chart of the "combined-type" image-pair stereo localization method;
Fig. 3 shows the object-image relation of a frame photograph;
Fig. 4 shows the object-image relation of a linear-array CCD;
Fig. 5 is a schematic diagram of the range condition;
Fig. 6 is a schematic diagram of the Doppler condition;
Fig. 7 shows the corresponding regions of the four satellite images and the distribution of the control points in embodiment 1;
Fig. 8 shows the checkpoint error statistics of the six "combined-type" pairs in embodiment 1 (◇ X-direction error, Y-direction error, △ Z-direction error);
Fig. 9 shows the spatial relationship of the four satellites in embodiment 1;
Fig. 10 shows the corresponding regions of the four images and the distribution of the checkpoints in embodiment 2;
Fig. 11 shows the checkpoint error statistics of the six "combined-type" pairs in embodiment 2 (◇ X-direction error, Y-direction error, △ Z-direction error);
Fig. 12 is a schematic diagram of the spatial relationship of the four images in embodiment 2.
Embodiment
We first introduce the imaging models used in the embodiments below: the frame central projection model for optical frame imagery; the line central projection model for optical linear-array CCD imagery; the F.Leberl model for SAR imagery; and the rational function model, applicable to all imagery.
1. Common sensor imaging models
(1) Frame central projection model
Frame imagery usually adopts the frame central projection model. Each image scene has a unique projection centre (the camera lens centre S), and the rays reflected from each point of the photographed ground area pass through the projection centre to form the image on the image plane, as shown in Fig. 3, where O-XYZ is the geodetic coordinate system, S-xyz is the photogrammetric coordinate system with origin at the camera station, o-xy is the photo coordinate system, and So is the principal optical axis direction.
The ground point P(X, Y, Z), the image point p(x, y) and the projection centre S are collinear, and the rigorous imaging model of frame imagery is:
$$x=-f\,\frac{a_1(X-X_S)+b_1(Y-Y_S)+c_1(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)},\qquad y=-f\,\frac{a_2(X-X_S)+b_2(Y-Y_S)+c_2(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)}\tag{1}$$
where $f$ is the focal length of the camera, $(x, y)$ are the photo-plane coordinates of the image point $p$ in o-xy, $(X, Y, Z)$ are the coordinates of the ground point $P$ in the geodetic system O-XYZ, $(X_S, Y_S, Z_S)$ are the coordinates of the camera station $S$ in O-XYZ, and $a_i, b_i, c_i$ $(i=1,2,3)$ are the elements of the rotation matrix determined by the three angular exterior orientation elements $\varphi, \omega, \kappa$.
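As a concrete check of the collinearity equations (1), the following is a small numerical sketch (not part of the patent): all camera parameters are invented, and the rotation convention shown is one common photogrammetric choice.

```python
import numpy as np

def rotation_matrix(phi, omega, kappa):
    """Rotation matrix from phi-omega-kappa angular exterior orientation
    elements (one common convention: R_Y(phi) R_X(omega) R_Z(kappa))."""
    Rphi = np.array([[np.cos(phi), 0.0, -np.sin(phi)],
                     [0.0, 1.0, 0.0],
                     [np.sin(phi), 0.0, np.cos(phi)]])
    Romega = np.array([[1.0, 0.0, 0.0],
                       [0.0, np.cos(omega), -np.sin(omega)],
                       [0.0, np.sin(omega), np.cos(omega)]])
    Rkappa = np.array([[np.cos(kappa), -np.sin(kappa), 0.0],
                       [np.sin(kappa), np.cos(kappa), 0.0],
                       [0.0, 0.0, 1.0]])
    return Rphi @ Romega @ Rkappa

def project(ground, station, angles, f):
    """Collinearity equations (1): ground point (X,Y,Z) -> image point (x,y)."""
    R = rotation_matrix(*angles)
    u, v, w = R.T @ (np.asarray(ground) - np.asarray(station))
    return -f * u / w, -f * v / w

# A vertical photo taken straight above the ground point must map it to
# the principal point (0, 0).
x, y = project((100.0, 200.0, 0.0), (100.0, 200.0, 1000.0), (0.0, 0.0, 0.0), 0.15)
```

With the camera station 1000 m above the point and f = 0.15 m, a point offset 100 m in X images at x = f·100/1000 = 0.015 m, which a second call to `project` confirms.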
(2) line central projection model
Linear-array CCD imagery usually adopts the line central projection model. Since the image is formed by a linear-array sensor push-brooming along the flight direction, a rigorous central projection relation holds between each scan-line image and the photographed object, and each scan line has its own exterior orientation elements. Because the platform attitude can be regarded as changing quite steadily during flight, the midpoint of the central scan line of each scene can be taken as the photo-coordinate origin and the exterior orientation elements of each scan line assumed to vary linearly with y; the exterior orientation elements can then be written as:
$$X_{Si}=X_{S0}+\dot X_S\,y,\quad Y_{Si}=Y_{S0}+\dot Y_S\,y,\quad Z_{Si}=Z_{S0}+\dot Z_S\,y,\quad \varphi_i=\varphi_0+\dot\varphi\,y,\quad \omega_i=\omega_0+\dot\omega\,y,\quad \kappa_i=\kappa_0+\dot\kappa\,y\tag{2}$$

where $(X_{Si}, Y_{Si}, Z_{Si}, \varphi_i, \omega_i, \kappa_i)$ are the exterior orientation elements of scan line $i$, $y$ is the photo-plane coordinate of this scan line along the flight direction, the quantities with subscript 0 are the exterior orientation elements of the central scan line, and the dotted quantities are the first-order rates of change of the exterior orientation elements.
As shown in Fig. 4, the central projection relation between an image point on scan line $i$ and the corresponding ground point is:
$$x_i=-f\,\frac{a_1(X-X_{Si})+b_1(Y-Y_{Si})+c_1(Z-Z_{Si})}{a_3(X-X_{Si})+b_3(Y-Y_{Si})+c_3(Z-Z_{Si})},\qquad 0=-f\,\frac{a_2(X-X_{Si})+b_2(Y-Y_{Si})+c_2(Z-Z_{Si})}{a_3(X-X_{Si})+b_3(Y-Y_{Si})+c_3(Z-Z_{Si})}\tag{3}$$

Written in matrix form:

$$\begin{bmatrix}x_i\\0\\-f\end{bmatrix}=\frac{1}{\lambda}M_i^T\begin{bmatrix}X-X_{Si}\\Y-Y_{Si}\\Z-Z_{Si}\end{bmatrix}=\frac{1}{\lambda}\begin{bmatrix}a_1&a_2&a_3\\b_1&b_2&b_3\\c_1&c_2&c_3\end{bmatrix}^T\begin{bmatrix}X-X_{Si}\\Y-Y_{Si}\\Z-Z_{Si}\end{bmatrix}\tag{4}$$

where $(x_i, 0)$ are the photo-plane coordinates of an image point on line $i$, $(X, Y, Z)$ are the ground coordinates of the corresponding ground point, $(X_{Si}, Y_{Si}, Z_{Si})$ are the camera station coordinates of line $i$, $\lambda$ is a scale factor, and $a_i, b_i, c_i$ $(i=1,2,3)$ are the nine elements of the rotation matrix $M_i$ determined by the angular exterior orientation elements $\varphi_i, \omega_i, \kappa_i$ of line $i$.
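The linear drift of the exterior orientation elements in formula (2) can be sketched in a few lines of Python. This is an illustration only; all element names and numbers are invented:

```python
# Formula (2): exterior orientation of scan line i modelled as the
# central-line elements plus a first-order rate of change times the
# along-track coordinate y.
def eo_of_scanline(eo_centre, eo_rate, y):
    """Each of the six elements varies linearly with y."""
    return {k: eo_centre[k] + eo_rate[k] * y for k in eo_centre}

# Invented central-line elements and rates for a push-broom scene.
eo_centre = {"Xs": 5.0e5, "Ys": 4.0e6, "Zs": 7.0e5,
             "phi": 0.0, "omega": 0.0, "kappa": 0.0}
eo_rate = {"Xs": 7.0, "Ys": 0.1, "Zs": 0.0,
           "phi": 1e-6, "omega": 0.0, "kappa": 0.0}

eo_10 = eo_of_scanline(eo_centre, eo_rate, 10.0)   # scan line at y = 10
```

Ten lines away from the centre, the camera station has drifted 70 units in Xs and the attitude angle phi has changed by 1e-5, exactly the linear behaviour formula (2) assumes.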
(3) F.Leberl model
From the SAR imaging principle, the distance from a target to the radar antenna can be determined from the round-trip time of the target's reflected echo, which fixes the image-plane position of the target's image point in the range direction; and from the Doppler characteristics of the target echo, the image-plane position of the image point in the azimuth direction can be determined through azimuth compression processing. The F.Leberl imaging model is the mathematical model expressing these two aspects of radar image formation.
1. The range condition
As shown in Fig. 5, $D_{S0}$ is the near-range scan delay, $V_S$ is the velocity along the flight direction, $R_S$ is the slant range from the antenna centre $S$ to the ground point $P$, $H$ is the flying height of the antenna centre $S$ above the data reference plane, $y_s$ is the range image coordinate of $P$ on a slant-range display image, $y_g$ is the range image coordinate of $P$ on a ground-range display image, $R_0$ is the projection of the scan delay on the data reference plane, $M_y$ is the range pixel resolution of the slant-range display image, and $m_y$ is the range pixel resolution of the ground-range display image.
For a slant-range display image:

$$(X-X_S)^2+(Y-Y_S)^2+(Z-Z_S)^2=(y_sM_y+D_{S0})^2\tag{6}$$

where $(X, Y, Z)$ are the object-space coordinates of the ground point $P$, and $(X_S, Y_S, Z_S)$ are the object-space coordinates of the instantaneous antenna-centre position $S$ (polynomial functions of the flight time $T$), which can be expressed as:

$$X_S=X_{S0}+X_{V0}T+\tfrac12X_{a0}T^2+\cdots,\quad Y_S=Y_{S0}+Y_{V0}T+\tfrac12Y_{a0}T^2+\cdots,\quad Z_S=Z_{S0}+Z_{V0}T+\tfrac12Z_{a0}T^2+\cdots,\quad T=K_x\cdot x\tag{7}$$

where $X_{S0}, Y_{S0}, Z_{S0}$ are the instantaneous object-space coordinates of the radar antenna centre at the image-coordinate origin; $X_{V0}, Y_{V0}, Z_{V0}$ are the components of the platform velocity at the image-coordinate origin (the first-order rates of change of the exterior orientation elements); $X_{a0}, Y_{a0}, Z_{a0}$ are the components of the platform acceleration at the image-coordinate origin (the second-order rates of change); $T$ is the flight time of image coordinate $x$ relative to the origin instant; $x$ is the azimuth image-plane coordinate of the radar image; and $K_x$ is the scan time used per line in the azimuth direction.
If the antenna-centre velocity at any instant is $(X_V, Y_V, Z_V)$, then from (7):

$$X_V=\frac{\partial X_S}{\partial T}=X_{V0}+X_{a0}T+\cdots,\quad Y_V=\frac{\partial Y_S}{\partial T}=Y_{V0}+Y_{a0}T+\cdots,\quad Z_V=\frac{\partial Z_S}{\partial T}=Z_{V0}+Z_{a0}T+\cdots\tag{8}$$

Similarly, for a ground-range display image:

$$(X-X_S)^2+(Y-Y_S)^2+(Z-Z_S)^2=(y_gm_y+R_0)^2+H^2\tag{9}$$
2. The Doppler condition
As shown in Fig. 6, $\tau$ is the angle between the iso-Doppler cone and the vertical direction, D-XYZ is the geodetic coordinate system, and the other variables are defined as in Fig. 5.
The Doppler condition equation of a radar image is

$$f_{DC}=\frac{2(\vec V_S-\vec V_P)\cdot(\vec S-\vec P)}{\lambda\,|\vec S-\vec P|}=\frac{2|\vec V_S-\vec V_P|}{\lambda}\cdot\frac{(\vec V_S-\vec V_P)\cdot(\vec S-\vec P)}{|\vec V_S-\vec V_P|\,|\vec S-\vec P|}=\frac{2|\vec V_S-\vec V_P|}{\lambda}\sin\tau\tag{10}$$

where $f_{DC}$ is the Doppler centroid frequency, $\lambda$ is the radar wavelength, $\vec V_S$ and $\vec V_P$ are the velocity vectors of $S$ and $P$ respectively, and $\vec S$ and $\vec P$ are the position vectors of $S$ and $P$ respectively. Assuming the ground point is stationary, i.e. $\vec V_P=0$, equation (10) simplifies to

$$X_V(X-X_S)+Y_V(Y-Y_S)+Z_V(Z-Z_S)=-\frac{\lambda R_S}{2}f_{DC}\tag{11}$$

where $R_S$ is the slant range to the ground point $P$.
The F.Leberl model adopts the strictly side-looking (zero-Doppler) condition, i.e. $\tau=0$: the satellite velocity vector is then perpendicular to the position vector from $S$ to $P$ and $f_{DC}=0$, hence the name zero-Doppler condition. Equation (11) becomes

$$X_V(X-X_S)+Y_V(Y-Y_S)+Z_V(Z-Z_S)=0\tag{12}$$

Equations (6) (or (9)) and (12) together constitute the F.Leberl model. The imaging equations contain 12 orientation parameters, $X_{S0}, Y_{S0}, Z_{S0}, X_{V0}, Y_{V0}, Z_{V0}, X_{a0}, Y_{a0}, Z_{a0}, D_{S0}, K_x$ and $M_y$ (or $m_y$); usually $D_{S0}$, $K_x$ and $M_y$ (or $m_y$) are given directly by the radar system parameters.
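The two F.Leberl conditions can be evaluated numerically as residuals. The following sketch (not part of the patent; the geometry, pixel resolution and scan delay are invented) checks that a point directly broadside of the sensor satisfies both the range condition (6) and the zero-Doppler condition (12):

```python
import numpy as np

def leberl_residuals(ground, station, velocity, y_img, My, Ds0):
    """Residuals of the F.Leberl conditions for a slant-range image:
    f1 is the range condition (6), f2 the zero-Doppler condition (12)."""
    d = np.asarray(ground) - np.asarray(station)
    f1 = (y_img * My + Ds0) ** 2 - d @ d     # range condition
    f2 = np.asarray(velocity) @ d            # zero-Doppler condition
    return f1, f2

# Invented broadside geometry: sensor flies along X, point offset in Y.
station = np.array([0.0, 0.0, 7.0e5])
ground = np.array([0.0, 6.0e5, 0.0])
velocity = np.array([7.5e3, 0.0, 0.0])       # along-track velocity

R = np.linalg.norm(ground - station)         # true slant range
My, Ds0 = 10.0, 1.0e5                        # invented resolution and delay
y_img = (R - Ds0) / My                       # range coordinate consistent with R

f1, f2 = leberl_residuals(ground, station, velocity, y_img, My, Ds0)
```

With the image coordinate chosen consistently with the true slant range, f1 vanishes (up to floating-point round-off on the large squared ranges), and f2 vanishes because the flight velocity is perpendicular to the line of sight, which is precisely the zero-Doppler configuration.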
(4) rational function model
1. Forward form of the rational function model
The rational function model (RFM) expresses the image point coordinates $(r, c)$ as ratios of polynomials in the corresponding ground point space coordinates $(X, Y, Z)$:

$$r_n=\frac{p_1(X_n,Y_n,Z_n)}{p_2(X_n,Y_n,Z_n)},\qquad c_n=\frac{p_3(X_n,Y_n,Z_n)}{p_4(X_n,Y_n,Z_n)}\tag{13}$$

where $p_i(X_n,Y_n,Z_n)$ $(i=1,2,3,4)$ are ordinary polynomials of degree at most 3, of the form:

$$p_i(X,Y,Z)=a_0+a_1Z+a_2Y+a_3X+a_4ZY+a_5ZX+a_6YX+a_7Z^2+a_8Y^2+a_9X^2+a_{10}ZYX+a_{11}Z^2Y+a_{12}Z^2X+a_{13}Y^2Z+a_{14}Y^2X+a_{15}ZX^2+a_{16}YX^2+a_{17}Z^3+a_{18}Y^3+a_{19}X^3$$

The polynomial coefficients $a_0,\dots,a_{19}$ are called the rational function coefficients (RFCs).
In formula (13), $(r_n, c_n)$ and $(X_n, Y_n, Z_n)$ denote the image coordinates $(r, c)$ and ground coordinates $(X, Y, Z)$ after normalization by translation and scaling, with values in $(-1.0, +1.0)$. The RFM uses normalized coordinates to reduce the round-off errors that large differences in data magnitude would otherwise introduce during computation.
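The forward form (13) is straightforward to evaluate. The sketch below (illustrative only; the coefficient vectors are invented, degenerate RFCs chosen so the model reduces to an identity-like mapping) follows the 20-term ordering of the cubic polynomial given above:

```python
def poly20(coef, X, Y, Z):
    """The 20-term cubic polynomial used in the RFM numerators and
    denominators, in the term order of the text."""
    terms = [1, Z, Y, X, Z*Y, Z*X, Y*X, Z*Z, Y*Y, X*X,
             Z*Y*X, Z*Z*Y, Z*Z*X, Y*Y*Z, Y*Y*X, Z*X*X, Y*X*X,
             Z**3, Y**3, X**3]
    return sum(a * t for a, t in zip(coef, terms))

def rfm_forward(p1, p2, p3, p4, Xn, Yn, Zn):
    """Formula (13): normalized ground coordinates -> normalized (r, c)."""
    r = poly20(p1, Xn, Yn, Zn) / poly20(p2, Xn, Yn, Zn)
    c = poly20(p3, Xn, Yn, Zn) / poly20(p4, Xn, Yn, Zn)
    return r, c

# Degenerate RFCs that make the model reduce to r = Yn, c = Xn
# (denominators identically 1).
p_r = [0, 0, 1] + [0] * 17        # numerator picks the Y term
p_c = [0, 0, 0, 1] + [0] * 16     # numerator picks the X term
one = [1] + [0] * 19              # constant denominator

r, c = rfm_forward(p_r, one, p_c, one, 0.25, -0.5, 0.1)
```

With these degenerate coefficients the model returns r = Yn = -0.5 and c = Xn = 0.25, confirming the term ordering; a real sensor's RFCs would of course mix all 20 terms.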
2. Inverse form of the rational function model
The inverse form of the rational function model is:

$$X_n=\frac{p_5(r_n,c_n,Z_n)}{p_6(r_n,c_n,Z_n)},\qquad Y_n=\frac{p_7(r_n,c_n,Z_n)}{p_8(r_n,c_n,Z_n)}\tag{14}$$

where the polynomials $p_i$ $(i=5,6,7,8)$ have the form:

$$p_i(r,c,Z)=a'_0+a'_1Z+a'_2c+a'_3r+a'_4cZ+a'_5rZ+a'_6cr+a'_7Z^2+a'_8c^2+a'_9r^2+a'_{10}crZ+a'_{11}cZ^2+a'_{12}rZ^2+a'_{13}c^2Z+a'_{14}c^2r+a'_{15}r^2Z+a'_{16}cr^2+a'_{17}Z^3+a'_{18}c^3+a'_{19}r^3$$

Unlike the collinearity equations, a rational function model provides only a one-way transformation, from object space to image space or from image space to object space; the inverse transformation requires linearizing the forward model and is accomplished by an iterative process from suitable initial values.
2. Solving the imaging model parameters
(1) Frame central projection model
Linearizing formula (1) gives the error equations
$$v_x=\frac{\partial x}{\partial X_S}dX_S+\frac{\partial x}{\partial Y_S}dY_S+\frac{\partial x}{\partial Z_S}dZ_S+\frac{\partial x}{\partial\varphi}d\varphi+\frac{\partial x}{\partial\omega}d\omega+\frac{\partial x}{\partial\kappa}d\kappa-l_x,\qquad v_y=\frac{\partial y}{\partial X_S}dX_S+\cdots+\frac{\partial y}{\partial\kappa}d\kappa-l_y\tag{15}$$

with constant terms $l_x=x_{obs}-x_{calc}$ and $l_y=y_{obs}-y_{calc}$ (16), where $x_{obs}, y_{obs}$ denote the observed image point coordinates and $x_{calc}, y_{calc}$ the image point coordinates computed from the current parameter values; $x_{calc}, y_{calc}$ and $(\bar X,\bar Y,\bar Z)$ are computed with reference to (17) and (18):

$$\begin{bmatrix}\bar X\\\bar Y\\\bar Z\end{bmatrix}=\begin{bmatrix}a_1&a_2&a_3\\b_1&b_2&b_3\\c_1&c_2&c_3\end{bmatrix}^T\begin{bmatrix}X-X_S\\Y-Y_S\\Z-Z_S\end{bmatrix}\tag{17}$$

$$x_{calc}=-f\,\frac{\bar X}{\bar Z},\qquad y_{calc}=-f\,\frac{\bar Y}{\bar Z}\tag{18}$$

Each frame image scene has six exterior orientation elements; using three or more non-collinear ground control points, the six elements can be solved by space resection according to formula (15).
(2) Line central projection model
Linearizing formula (3) gives error equations of the form

$$\begin{cases}v_x=C_{11}de_1+\cdots+C_{16}de_6+y\,(C_{11}d\dot e_1+\cdots+C_{16}d\dot e_6)-C_{14}dX-C_{15}dY-C_{16}dZ-l_x\\ v_y=C_{21}de_1+\cdots+C_{26}de_6+y\,(C_{21}d\dot e_1+\cdots+C_{26}d\dot e_6)-C_{24}dX-C_{25}dY-C_{26}dZ-l_y\end{cases}\tag{19}$$

where $de_1,\dots,de_6$ and $d\dot e_1,\dots,d\dot e_6$ are the corrections to the six exterior orientation elements of the central scan line and to their first-order rates of change (formula (2)), $dX, dY, dZ$ are the ground coordinate corrections, $C_{ij}$ $(i=1,2;\ j=1,\dots,6)$ are the coefficients of the corrections, $l_x, l_y$ are the constant terms, and

$$\bar Z=a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)\tag{21}$$
Written in matrix form:

$$A\cdot X+B\cdot\dot X-L=V,\qquad\text{weight matrix }P\tag{22}$$

$$A=\begin{bmatrix}C_{11}&C_{12}&C_{13}&C_{14}&C_{15}&C_{16}&yC_{11}&yC_{12}&yC_{13}&yC_{14}&yC_{15}&yC_{16}\\C_{21}&C_{22}&C_{23}&C_{24}&C_{25}&C_{26}&yC_{21}&yC_{22}&yC_{23}&yC_{24}&yC_{25}&yC_{26}\end{bmatrix}$$

$$B=\begin{bmatrix}-C_{14}&-C_{15}&-C_{16}\\-C_{24}&-C_{25}&-C_{26}\end{bmatrix},\qquad \dot X=\begin{bmatrix}dX&dY&dZ\end{bmatrix}^T,\qquad L=\begin{bmatrix}l_x&l_y\end{bmatrix}^T,\qquad P=\begin{bmatrix}p_x&0\\0&p_y\end{bmatrix}$$

where $A$ is the coefficient matrix, $L$ is the observation vector, $X$ is the vector of the 12 unknown orientation parameters, $V$ is the observation error vector, the weight matrix $P$ is generally symmetric positive definite, and $\dot X$ is the ground coordinate correction vector.
Each image point on the image yields two error equations by formula (19); for control points $dX=dY=dZ=0$. The equations then contain 12 unknown parameters, which can be solved using six or more non-collinear control points.
(3) The F.Leberl imaging model
Let $F_1$ and $F_2$ denote formulas (6) and (12) respectively; then:

$$\begin{cases}F_1=(yM_y+D_{S0})^2-(X-X_S)^2-(Y-Y_S)^2-(Z-Z_S)^2=0\\F_2=X_V(X-X_S)+Y_V(Y-Y_S)+Z_V(Z-Z_S)=0\end{cases}\tag{23}$$

Formula (23) is nonlinear and cannot be used directly in the adjustment, so it must first be linearized. Linearization gives:

$$\begin{aligned}A_{11}dX_{S0}+A_{12}dY_{S0}+A_{13}dZ_{S0}+A_{14}dX_{V0}+A_{15}dY_{V0}+A_{16}dZ_{V0}+A_{17}dX_{a0}+A_{18}dY_{a0}+A_{19}dZ_{a0}+B_{11}dX+B_{12}dY+B_{13}dZ-F_1&=C_{11}v_x+C_{12}v_y\\A_{21}dX_{S0}+A_{22}dY_{S0}+A_{23}dZ_{S0}+A_{24}dX_{V0}+A_{25}dY_{V0}+A_{26}dZ_{V0}+A_{27}dX_{a0}+A_{28}dY_{a0}+A_{29}dZ_{a0}+B_{21}dX+B_{22}dY+B_{23}dZ-F_2&=C_{21}v_x+C_{22}v_y\end{aligned}\tag{24}$$
where the corrections of the orientation parameters are

$$\Delta=[dX_{S0},dY_{S0},dZ_{S0},dX_{V0},dY_{V0},dZ_{V0},dX_{a0},dY_{a0},dZ_{a0}]^T$$

the constant term is $L=-[F_1,F_2]^T$, and the coefficient matrices are

$$A=\begin{bmatrix}A_{11}&A_{12}&A_{13}&A_{14}&A_{15}&A_{16}&A_{17}&A_{18}&A_{19}\\A_{21}&A_{22}&A_{23}&A_{24}&A_{25}&A_{26}&A_{27}&A_{28}&A_{29}\end{bmatrix},\qquad B=\begin{bmatrix}B_{11}&B_{12}&B_{13}\\B_{21}&B_{22}&B_{23}\end{bmatrix},\qquad C=\begin{bmatrix}C_{11}&C_{12}\\C_{21}&C_{22}\end{bmatrix}$$
with

$$\begin{aligned}
A_{11}&=\partial F_1/\partial X_{S0}=2(X-X_S) & A_{12}&=\partial F_1/\partial Y_{S0}=2(Y-Y_S) & A_{13}&=\partial F_1/\partial Z_{S0}=2(Z-Z_S)\\
A_{14}&=\partial F_1/\partial X_{V0}=2(X-X_S)T & A_{15}&=\partial F_1/\partial Y_{V0}=2(Y-Y_S)T & A_{16}&=\partial F_1/\partial Z_{V0}=2(Z-Z_S)T\\
A_{17}&=\partial F_1/\partial X_{a0}=(X-X_S)T^2 & A_{18}&=\partial F_1/\partial Y_{a0}=(Y-Y_S)T^2 & A_{19}&=\partial F_1/\partial Z_{a0}=(Z-Z_S)T^2\\
A_{21}&=\partial F_2/\partial X_{S0}=-X_V & A_{22}&=\partial F_2/\partial Y_{S0}=-Y_V & A_{23}&=\partial F_2/\partial Z_{S0}=-Z_V\\
A_{24}&=\partial F_2/\partial X_{V0}=(X-X_S)-X_VT & A_{25}&=\partial F_2/\partial Y_{V0}=(Y-Y_S)-Y_VT & A_{26}&=\partial F_2/\partial Z_{V0}=(Z-Z_S)-Z_VT\\
A_{27}&=\partial F_2/\partial X_{a0}=(X-X_S)T-X_VT^2/2 & A_{28}&=\partial F_2/\partial Y_{a0}=(Y-Y_S)T-Y_VT^2/2 & A_{29}&=\partial F_2/\partial Z_{a0}=(Z-Z_S)T-Z_VT^2/2
\end{aligned}$$

$$B_{11}=\partial F_1/\partial X=-2(X-X_S),\quad B_{12}=\partial F_1/\partial Y=-2(Y-Y_S),\quad B_{13}=\partial F_1/\partial Z=-2(Z-Z_S)$$
$$B_{21}=\partial F_2/\partial X=X_V,\quad B_{22}=\partial F_2/\partial Y=Y_V,\quad B_{23}=\partial F_2/\partial Z=Z_V$$

$$C_{11}=\partial F_1/\partial x=2K_x\cdot F_2,\quad C_{12}=\partial F_1/\partial y=2M_y(yM_y+D_{S0}),\quad C_{21}=\partial F_2/\partial x=-K_x(X_V^2+Y_V^2+Z_V^2),\quad C_{22}=\partial F_2/\partial y=0$$
where $dX_{S0},\dots,dZ_{a0}$ are the corrections of the exterior orientation elements, $dX, dY, dZ$ are the ground coordinate corrections, $A_{ij}$ are the coefficients of the orientation corrections, $B_{ij}$ are the coefficients of the ground coordinate corrections, $F_1, F_2$ are the constant terms, $v_x, v_y$ are the image coordinate residuals, and $C_{ij}$ are the coefficients of the image coordinate residuals. Written in matrix form, (24) becomes

$$A\Delta+BX-L=CV,\qquad\text{weight matrix }P\tag{25}$$

According to formula (25), at least five ground control points satisfying a certain distribution condition are needed, and the orientation parameters can then be solved iteratively by the least-squares principle.
(4) rational function model
The linearization procedure of the RFM is as follows.
First, formula (13) can be rewritten as:

$$r=\frac{(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(a_0\ \ a_1\ \cdots\ a_{19})^T}{(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(1\ \ b_1\ \cdots\ b_{19})^T}\tag{26a}$$

$$c=\frac{(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(c_0\ \ c_1\ \cdots\ c_{19})^T}{(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(1\ \ d_1\ \cdots\ d_{19})^T}\tag{26b}$$

Let:

$$B=(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(1\ \ b_1\ \cdots\ b_{19})^T$$
$$D=(1\ \ Z_n\ \ Y_n\ \ X_n\ \cdots\ Y_n^3\ \ X_n^3)\cdot(1\ \ d_1\ \cdots\ d_{19})^T$$
$$J=(a_0\ \ a_1\ \cdots\ a_{19}\ \ b_1\ \ b_2\ \cdots\ b_{19})^T$$
$$K=(c_0\ \ c_1\ \cdots\ c_{19}\ \ d_1\ \ d_2\ \cdots\ d_{19})^T$$

Then the error equations can be expressed as:

$$v_r=\left(\frac1B\ \ \frac{Z_n}B\ \ \frac{Y_n}B\ \ \frac{X_n}B\ \cdots\ \frac{Y_n^3}B\ \ \frac{X_n^3}B\ \ -\frac{r_nZ_n}B\ \ -\frac{r_nY_n}B\ \cdots\ -\frac{r_nY_n^3}B\ \ -\frac{r_nX_n^3}B\right)\cdot J-\frac{r_n}B$$

$$v_c=\left(\frac1D\ \ \frac{Z_n}D\ \ \frac{Y_n}D\ \ \frac{X_n}D\ \cdots\ \frac{Y_n^3}D\ \ \frac{X_n^3}D\ \ -\frac{c_nZ_n}D\ \ -\frac{c_nY_n}D\ \cdots\ -\frac{c_nY_n^3}D\ \ -\frac{c_nX_n^3}D\right)\cdot K-\frac{c_n}D$$
Given $N$ control points ($N>39$), the following error equations can be listed (taking $r$ as the example):

$$\begin{bmatrix}v_{r1}\\v_{r2}\\\vdots\\v_{rN}\end{bmatrix}=\begin{bmatrix}\frac1{B_1}&\frac{Z_{n1}}{B_1}&\cdots&\frac{X_{n1}^3}{B_1}&-\frac{r_{n1}Z_{n1}}{B_1}&\cdots&-\frac{r_{n1}X_{n1}^3}{B_1}\\\frac1{B_2}&\frac{Z_{n2}}{B_2}&\cdots&\frac{X_{n2}^3}{B_2}&-\frac{r_{n2}Z_{n2}}{B_2}&\cdots&-\frac{r_{n2}X_{n2}^3}{B_2}\\\vdots&\vdots&&\vdots&\vdots&&\vdots\\\frac1{B_N}&\frac{Z_{nN}}{B_N}&\cdots&\frac{X_{nN}^3}{B_N}&-\frac{r_{nN}Z_{nN}}{B_N}&\cdots&-\frac{r_{nN}X_{nN}^3}{B_N}\end{bmatrix}\begin{bmatrix}a_0\\\vdots\\a_{19}\\b_1\\\vdots\\b_{19}\end{bmatrix}-\begin{bmatrix}\frac{r_{n1}}{B_1}\\\frac{r_{n2}}{B_2}\\\vdots\\\frac{r_{nN}}{B_N}\end{bmatrix}\tag{27}$$

or, more compactly, $V_r=M_1J-R$. The least-squares solution for $J$ is $J=(M_1^TM_1)^{-1}M_1^TR$. Similarly, the error equation for the image coordinate $c$ is $V_c=M_2K-C$, whence $K=(M_2^TM_2)^{-1}M_2^TC$.
RFCs can be solved in two ways: the direct solution and the iterative solution. The direct solution sets all $B_i, D_i$ $(i=1,2,\dots,N)$ to 1 and takes the computed coefficients $J$, $K$ as the final result. The iterative solution takes the result of the direct solution as the initial value and obtains an optimal result through iteration under certain conditions. Since the error equations used to compute the RFCs are linearized mathematical models, iteration is generally required to obtain the optimal solution.
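The direct solution with all denominator values fixed to 1 makes the row equation linear in the coefficients, so it reduces to ordinary least squares. The following sketch (illustrative only, not the patent's implementation) fits a tiny 4-term affine version of the model to synthetic control points; the full RFM has 39 unknowns per coordinate and correspondingly needs more than 39 control points:

```python
import numpy as np

# Direct solution in miniature: with the denominator fixed to 1,
# r = a0 + a1*Z + a2*Y + a3*X is linear in (a0..a3) and solvable by
# ordinary least squares over the control points.
rng = np.random.default_rng(0)
XYZ = rng.uniform(-1, 1, size=(20, 3))            # 20 synthetic control points
true_coef = np.array([0.1, 0.5, -0.3, 0.8])       # invented "ground truth" RFCs

# Design matrix: columns 1, Z, Y, X (same ordering as the cubic polynomial).
M = np.column_stack([np.ones(20), XYZ[:, 2], XYZ[:, 1], XYZ[:, 0]])
r_obs = M @ true_coef                             # noise-free observations

coef, *_ = np.linalg.lstsq(M, r_obs, rcond=None)  # least-squares solution
```

On noise-free data the least-squares solution recovers the generating coefficients exactly; with real, noisy control points this direct result would serve as the initial value for the iterative refinement described above.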
3. Establishing and solving the "combined-type" stereo location model
The "combined-type" image-pair stereo localization method mainly comprises the following process: first, according to the types of the images of the pair, select the imaging models suited to the pair; next, compute the imaging model parameters of the pair from the existing control conditions (both steps were detailed in sections 1 and 2 above); then, from the known-parameter imaging models of the images of the "combined-type" pair, establish the joint imaging model of the pair in one of its various forms (different pairs give rise to different joint models); finally, solve the joint model to obtain the ground coordinates. Fig. 2 gives a flow chart of the method.
The invention is now explained in detail, taking "combined-type" stereo localization with an optical sensor and a SAR sensor as the example; Fig. 1 is a schematic diagram of this configuration.
Sections 1 and 2 above gave the details; in essence, the "combined-type" stereo location process determines the ground coordinates $X, Y, Z$ of a target point from its image coordinates $(x_1, y_1; x_2, y_2)$ on two photographs. Substituting the image coordinates of the target point and the orientation parameters of the photographs as given values into the respective location models yields four simultaneous equations in the three unknowns $X, Y, Z$.
Because of the redundant observation, the least-squares method is needed to compute the best approximation of the unknowns. The adjustment equation is, in matrix form:
$$CV+DX+L=0,\qquad\text{weight matrix }P\tag{28}$$

where the residual coefficient matrix is

$$C=\begin{bmatrix}C'_{11}&C'_{12}&0&0\\C'_{21}&C'_{22}&0&0\\0&0&C''_{11}&C''_{12}\\0&0&C''_{21}&C''_{22}\end{bmatrix}=\begin{bmatrix}C'&0\\0&C''\end{bmatrix}$$

the residual vector is $V=(v_{x1},v_{y1},v_{x2},v_{y2})^T$, the coefficient matrix is

$$D=\begin{bmatrix}D'_{11}&D'_{12}&D'_{13}\\D'_{21}&D'_{22}&D'_{23}\\D''_{11}&D''_{12}&D''_{13}\\D''_{21}&D''_{22}&D''_{23}\end{bmatrix}=\begin{bmatrix}D'\\D''\end{bmatrix}$$

the ground coordinate correction is $X=(\Delta X,\Delta Y,\Delta Z)^T$, the constant vector is $L=(F_{x1},F_{y1},F_{x2},F_{y2})^T$, and the weight matrix is $P=\mathrm{diag}(p_{x1},p_{y1},p_{x2},p_{y2})$.
This equation can be reduced to
$$V=\tilde{D}X-\tilde{L}$$
where
$$\tilde{D}=-C^{-1}D, \qquad \tilde{L}=C^{-1}L$$
The increments of X, Y, Z are then computed from
$$X=(\tilde{D}^T P \tilde{D})^{-1}(\tilde{D}^T P \tilde{L}) \qquad (29)$$
After initial values are given, the increments ΔX, ΔY, ΔZ of the ground coordinates are computed iteratively until ΔX, ΔY and ΔZ are sufficiently small, finally yielding X, Y, Z.
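The iteration above can be sketched numerically. This is a minimal illustration (not part of the patent), assuming NumPy; the function name `solve_increment` and all numeric values are invented, and in practice the elements of C, D and L would be filled from the model-specific formulas given below and the step repeated until the increments are small.

```python
import numpy as np

def solve_increment(C, D, L, P):
    """One least-squares step of C V + D X + L = 0 (eq. 28):
    reduce to V = D~ X - L~ with D~ = -C^{-1} D and L~ = C^{-1} L,
    then solve X = (D~^T P D~)^{-1} (D~^T P L~) (eq. 29)."""
    D_t = -np.linalg.solve(C, D)   # D~
    L_t = np.linalg.solve(C, L)    # L~
    N = D_t.T @ P @ D_t            # normal-equation matrix
    return np.linalg.solve(N, D_t.T @ P @ L_t)

# Toy numbers: 4 observation equations in the 3 unknowns dX, dY, dZ
C = np.eye(4)                      # identity, as for the central projection models
D = np.array([[1.0, 0.2, 0.1],
              [0.0, 1.0, 0.3],
              [0.9, 0.1, 0.0],
              [0.1, 0.8, 0.5]])
L = np.array([0.05, -0.02, 0.04, 0.01])
P = np.eye(4)                      # equal weights
dXYZ = solve_increment(C, D, L, P)
print(dXYZ.shape)                  # (3,)
```

Each iteration would add `dXYZ` to the approximate ground coordinates and recompute C, D and L before the next step.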
The elements of C, D and L in equations (28)–(29) take different values according to the imaging model of each photograph; below we give the formulas for the matrix elements of the plane central projection model, the line central projection model, the F.Leberl model and the RFM in turn.
① Plane central projection model
$$C'_{11}=1,\quad C'_{12}=0,\quad C'_{21}=0,\quad C'_{22}=1$$
$$D'_{11}=\frac{a_1f+a_3x}{\bar{Z}},\quad D'_{12}=\frac{b_1f+b_3x}{\bar{Z}},\quad D'_{13}=\frac{c_1f+c_3x}{\bar{Z}}$$
$$D'_{21}=\frac{a_2f+a_3y}{\bar{Z}},\quad D'_{22}=\frac{b_2f+b_3y}{\bar{Z}},\quad D'_{23}=\frac{c_2f+c_3y}{\bar{Z}}$$
[equation image for the constant terms $F_{x_1}$, $F_{y_1}$ not reproduced]
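The $D'_{1j}$ elements above can be sanity-checked against numerical derivatives of the standard collinearity equations. This check is not from the patent: the φ–ω–κ rotation convention, focal length and coordinates below are purely illustrative assumptions.

```python
import numpy as np

# Rotation matrix R = [[a1,a2,a3],[b1,b2,b3],[c1,c2,c3]] from small angles
def rot(phi, omega, kappa):
    Rp = np.array([[np.cos(phi), 0, -np.sin(phi)], [0, 1, 0], [np.sin(phi), 0, np.cos(phi)]])
    Ro = np.array([[1, 0, 0], [0, np.cos(omega), -np.sin(omega)], [0, np.sin(omega), np.cos(omega)]])
    Rk = np.array([[np.cos(kappa), -np.sin(kappa), 0], [np.sin(kappa), np.cos(kappa), 0], [0, 0, 1]])
    return Rp @ Ro @ Rk

f = 0.15                                  # focal length, illustrative
R = rot(0.01, -0.02, 0.03)
S = np.array([10.0, 20.0, 1500.0])        # projection centre (X_S, Y_S, Z_S)
G = np.array([300.0, 400.0, 80.0])        # ground point (X, Y, Z)

def project(G):
    Xb, Yb, Zb = R.T @ (G - S)            # image-space auxiliary coordinates
    return -f * Xb / Zb, -f * Yb / Zb, Zb # collinearity equations and Z-bar

x, y, Zbar = project(G)
a1, a3 = R[0, 0], R[0, 2]
analytic = (a1 * f + a3 * x) / Zbar       # D'_11 as given above

# Numerical partial of F_x = x_obs - x(X, Y, Z) with respect to X
eps = 1e-4
numeric = -(project(G + [eps, 0, 0])[0] - project(G - [eps, 0, 0])[0]) / (2 * eps)
print(abs(analytic - numeric) < 1e-10)    # True
```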
② Line central projection model
$$C'_{11}=1,\quad C'_{12}=0,\quad C'_{21}=0,\quad C'_{22}=1$$
$$D'_{11}=\frac{1}{\bar{Z}}(a_3x_1+fa_1),\quad D'_{12}=\frac{1}{\bar{Z}}(b_3x_1+fb_1),\quad D'_{13}=\frac{1}{\bar{Z}}(c_3x_1+fc_1)$$
$$D'_{21}=\frac{fa_2}{\bar{Z}},\quad D'_{22}=\frac{fb_2}{\bar{Z}},\quad D'_{23}=\frac{fc_2}{\bar{Z}}$$
[equation image for the constant terms $F_{x_1}$, $F_{y_1}$ not reproduced]
③ F.Leberl model
$$C'_{11}=2K_xF_{y_1},\quad C'_{12}=2M_y(yM_y+D_{s0}),\quad C'_{21}=-K_x(X_V^2+Y_V^2+Z_V^2),\quad C'_{22}=0$$
$$D'_{11}=-2(X-X_S),\quad D'_{12}=-2(Y-Y_S),\quad D'_{13}=-2(Z-Z_S)$$
$$D'_{21}=X_V,\quad D'_{22}=Y_V,\quad D'_{23}=Z_V$$
$$F_{x_1}=(y_1M_y+D_{s0})^2-(X-X_S)^2-(Y-Y_S)^2-(Z-Z_S)^2$$
$$F_{y_1}=X_V(X-X_S)+Y_V(Y-Y_S)+Z_V(Z-Z_S)$$
④ RFM
$$C'_{11}=1,\quad C'_{12}=0,\quad C'_{21}=0,\quad C'_{22}=1$$
$$D'_{11}=\frac{c_s}{X_s}\cdot\frac{\frac{\partial p_3}{\partial X_n}p_4-p_3\frac{\partial p_4}{\partial X_n}}{p_4\cdot p_4},\quad D'_{12}=\frac{c_s}{Y_s}\cdot\frac{\frac{\partial p_3}{\partial Y_n}p_4-p_3\frac{\partial p_4}{\partial Y_n}}{p_4\cdot p_4},\quad D'_{13}=\frac{c_s}{Z_s}\cdot\frac{\frac{\partial p_3}{\partial Z_n}p_4-p_3\frac{\partial p_4}{\partial Z_n}}{p_4\cdot p_4}$$
$$D'_{21}=\frac{r_s}{X_s}\cdot\frac{\frac{\partial p_1}{\partial X_n}p_2-p_1\frac{\partial p_2}{\partial X_n}}{p_2\cdot p_2},\quad D'_{22}=\frac{r_s}{Y_s}\cdot\frac{\frac{\partial p_1}{\partial Y_n}p_2-p_1\frac{\partial p_2}{\partial Y_n}}{p_2\cdot p_2},\quad D'_{23}=\frac{r_s}{Z_s}\cdot\frac{\frac{\partial p_1}{\partial Z_n}p_2-p_1\frac{\partial p_2}{\partial Z_n}}{p_2\cdot p_2}$$
$$F_{x_1}=c-\hat{c},\qquad F_{y_1}=r-\hat{r}$$
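The RFM's $D'$ elements are scaled quotient-rule derivatives of the rational polynomials $r=p_1/p_2$ and $c=p_3/p_4$. The sketch below (not from the patent) verifies the quotient rule numerically with invented first-order polynomial coefficients; real RPCs use cubic polynomials, and the denormalization scales $r_s/X_s$ etc. are omitted here.

```python
import numpy as np

# Illustrative first-order polynomials p1..p4 in (Xn, Yn, Zn):
# p_i = c0 + c1*Xn + c2*Yn + c3*Zn
coef = np.array([[1.0,  0.5, -0.3,  0.2],   # p1
                 [2.0,  0.1,  0.2, -0.1],   # p2 (constant kept away from zero)
                 [0.8, -0.4,  0.6,  0.3],   # p3
                 [2.0, -0.2,  0.1,  0.2]])  # p4

def p(i, Xn, Yn, Zn):
    c0, c1, c2, c3 = coef[i]
    return c0 + c1 * Xn + c2 * Yn + c3 * Zn

Xn, Yn, Zn = 0.1, -0.2, 0.3
# Quotient rule, as in D'_21 (without the r_s/X_s scale):
# (dp1/dXn * p2 - p1 * dp2/dXn) / (p2 * p2)
analytic = (coef[0, 1] * p(1, Xn, Yn, Zn)
            - p(0, Xn, Yn, Zn) * coef[1, 1]) / p(1, Xn, Yn, Zn) ** 2

# Central-difference check of d(p1/p2)/dXn
eps = 1e-6
r = lambda X: p(0, X, Yn, Zn) / p(1, X, Yn, Zn)
numeric = (r(Xn + eps) - r(Xn - eps)) / (2 * eps)
print(abs(analytic - numeric) < 1e-9)     # True
```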
4. The effect of the invention is described below in conjunction with specific experiments and experimental data.
Explanation of the tables used:
Table 1: parameter statistics of the experimental images in experiment one;
Table 2: positioning accuracy statistics of the "combined-type" stereo pairs in experiment one;
Table 3: relevant parameters of the "combined-type" stereo pairs in experiment one;
Table 4: parameter statistics of the experimental images in experiment two;
Table 5: positioning accuracy statistics of the "combined-type" stereo pairs in experiment two;
Table 6: relevant parameters of the "combined-type" stereo pairs in experiment two.
Experiment one
Remote sensing images of the Beijing area acquired by four satellites at different times were selected; their parameters are given in Table 1. Fig. 7 shows the coverage of the four images and the distribution of the ground control points. Triangles denote control points, 50 in total: blue triangles are the 27 control points on the optical images, and red triangles are the 23 control points on the SAR images. Green circles denote the checkpoints, 35 in total.
Combining the four images pairwise yields 6 stereo pairs in total; their positioning results are given in Table 2. The relevant parameters of each stereo pair, namely photograph overlap (estimated by area), intersection angle, base length and base-height ratio, were computed from the orientation parameters of the images; the results are given in Table 3. In addition, Fig. 8 gives the error statistics of all checkpoints in the 6 stereo pairs, and Fig. 9 shows the spatial relations and earth-observation geometry of the four satellites.
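The pairwise combination count and the base-height ratio mentioned above are simple to compute. The snippet below is purely illustrative (the labels and projection-centre coordinates are invented, not the experiment's data):

```python
from itertools import combinations
import numpy as np

# Hypothetical projection centres (X_S, Y_S, Z_S) for four images
centres = {
    "optical-1": np.array([0.0, 0.0, 500e3]),
    "optical-2": np.array([300e3, 0.0, 500e3]),
    "SAR-1": np.array([0.0, 250e3, 600e3]),
    "SAR-2": np.array([300e3, 250e3, 600e3]),
}

pairs = list(combinations(centres, 2))
print(len(pairs))                         # C(4, 2) = 6 stereo pairs

# Base length and base-height ratio for one pair (mean flying height as H)
a, b = centres["optical-1"], centres["optical-2"]
base = np.linalg.norm(a - b)
height = (a[2] + b[2]) / 2.0
print(base / height)                      # 0.6
```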
The experimental results show that the "combined-type" stereo localization method is correct and feasible, but the factors influencing its positioning accuracy differ from those of conventional stereo localization.
Table 1. Parameter statistics of the experimental images [table image not reproduced]
Table 2. Positioning accuracy statistics of the "combined-type" stereo pairs [table image not reproduced]
Table 3. Relevant parameters of the "combined-type" stereo pairs [table image not reproduced]
Experiment two
One IKONOS satellite image, two SPOT-4 satellite images and one airborne SAR image of the Tai'an area, Shandong Province, were selected; their parameters are given in Table 4. Fig. 10 shows the coverage of the four images, with the distribution of the 28 ground checkpoints marked by red triangles.
Combining the four images pairwise yields 6 stereo pairs in total; their positioning results are given in Table 5. The relevant parameters of each stereo pair were computed from the orientation parameters of the images; the results are given in Table 6. Fig. 11 gives the error statistics of all checkpoints in the 6 stereo pairs, and Fig. 12 shows the spatial relations and earth-observation geometry of the four satellites. The results of this experiment are similar to those of experiment one.
Table 4. Parameter statistics of the experimental images [table image not reproduced]
Table 5. Positioning accuracy statistics of the "combined-type" stereo pairs [table image not reproduced]
Table 6. Relevant parameters of the "combined-type" stereo pairs [table image not reproduced]

Claims (3)

1. A combined-type stereo image pair localization method, characterized in that the steps are as follows:
Step 1): according to the imaging modes of the various sensors, establish the imaging models corresponding to the various sensors respectively;
Step 2): select two acquired images of identical or different types as a stereo pair; select from step 1) the imaging model corresponding to each image of the pair; according to the known control conditions, compute the model parameters of the imaging model of each image of the pair, obtaining imaging models with known model parameters for the pair;
Step 3): select a target point, substitute its two image coordinates on the stereo pair respectively into the corresponding imaging models with known model parameters of step 2), and jointly solve for the ground coordinates of the target point;
Step 4): repeat step 3) to solve the ground coordinates of the whole image region, realizing localization.
2. The combined-type stereo image pair localization method according to claim 1, characterized in that the imaging models of step 1) comprise: the plane central projection model corresponding to optical frame images; the line central projection model corresponding to optical linear-array CCD images; the F.Leberl model corresponding to SAR images; and the rational function model applicable to all images.
3. The combined-type stereo image pair localization method according to claim 1, characterized in that in step 3) the ground coordinates of the target point are computed by simultaneous adjustment.
CN2011104487075A 2011-12-28 2011-12-28 Combined type image pair three-dimensional location method Pending CN102538764A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011104487075A CN102538764A (en) 2011-12-28 2011-12-28 Combined type image pair three-dimensional location method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011104487075A CN102538764A (en) 2011-12-28 2011-12-28 Combined type image pair three-dimensional location method

Publications (1)

Publication Number Publication Date
CN102538764A true CN102538764A (en) 2012-07-04

Family

ID=46346257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011104487075A Pending CN102538764A (en) 2011-12-28 2011-12-28 Combined type image pair three-dimensional location method

Country Status (1)

Country Link
CN (1) CN102538764A (en)


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
Xing Shuai et al.: "Research on combined 'stereo' positioning technique for optical and SAR satellite remote sensing imagery", Acta Geodaetica et Cartographica Sinica (《测绘学报》), Vol. 37, No. 2, 31 May 2008 *
Gong Danchao: "Processing models and algorithms for high-resolution satellite remote sensing stereo imagery", China Doctoral Dissertations Full-text Database *
Zhi Changgui et al.: "Discussion of the principle of digital differential rectification of frame aerial photographs with the OrthoBASE module", Journal of Henan Agricultural University *
Xing Shuai et al.: "Joint stereo positioning of SPOT5 and ERS-2 satellite imagery", Science of Surveying and Mapping *
Xing Shuai et al.: "Joint stereo positioning of different types of remote sensing images", Proceedings of the 14th National Conference on Image and Graphics *
Xing Shuai et al.: "Research on combined 'stereo' positioning technique for optical and SAR satellite remote sensing imagery", Acta Geodaetica et Cartographica Sinica (《测绘学报》) *
Xiang Zhongzhen et al.: "Sensor types and imaging equations used in photogrammetry and remote sensing", Railway Investigation and Surveying *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103021021A (en) * 2012-11-15 2013-04-03 哈尔滨工业大学 Generalized stereopair three-dimensional reconstruction method adopting variance component estimation
CN103021021B (en) * 2012-11-15 2016-03-23 哈尔滨工业大学 Adopt the generalized stereopair three-dimensional rebuilding method of variance components estimate
CN103389065A (en) * 2013-08-01 2013-11-13 罗建刚 Measurement method and arithmetic facility of azimuth angle from photographing centre to objective physical point
CN103389065B (en) * 2013-08-01 2015-11-25 罗建刚 Photo centre is to azimuthal measuring method of target material object point
CN106951567A (en) * 2017-04-07 2017-07-14 安徽建筑大学 Water system space analysis method based on remote sensing technology for Huizhou traditional settlement
CN106951567B (en) * 2017-04-07 2020-07-17 安徽建筑大学 Water system space analysis method based on remote sensing technology for Huizhou traditional settlement
CN109612439A (en) * 2018-12-13 2019-04-12 同济大学 Stereopsis intersection angle and baseline length estimation method based on rational function model
CN110826423A (en) * 2019-10-18 2020-02-21 中北大学 Method, device and system for detecting interested target in group target
CN116753918A (en) * 2023-06-19 2023-09-15 中国人民解放军61540部队 Ground target position estimation method and device based on empty antenna array sensor
CN116753918B (en) * 2023-06-19 2024-03-19 中国人民解放军61540部队 Ground target position estimation method and device based on empty antenna array sensor

Similar Documents

Publication Publication Date Title
Grayson et al. GPS precise point positioning for UAV photogrammetry
CN103557841B (en) A kind of method improving polyphaser resultant image photogrammetric accuracy
CN101750619B (en) Method for directly positioning ground target by self-checking POS
CN103914808B (en) Method for splicing ZY3 satellite three-line-scanner image and multispectral image
CN102538764A (en) Combined type image pair three-dimensional location method
CN102735216B (en) CCD stereoscopic camera three-line imagery data adjustment processing method
Carvajal-Ramírez et al. Effects of image orientation and ground control points distribution on unmanned aerial vehicle photogrammetry projects on a road cut slope
CN103674063A (en) On-orbit geometric calibration method of optical remote sensing camera
CN104363438B (en) Full-view stereo making video method
CN102607533B (en) Adjustment locating method of linear array CCD (Charge Coupled Device) optical and SAR (Specific Absorption Rate) image integrated local area network
CN102636159A (en) In-orbit geometrical self-calibration method for multi-mirror aerospace linear array camera system
Haala et al. Hybrid georeferencing of images and LiDAR data for UAV-based point cloud collection at millimetre accuracy
CN105823469A (en) GNSS high precision assisted unmanned plane aerotriangulation method
CN102519436A (en) Chang'e-1 (CE-1) stereo camera and laser altimeter data combined adjustment method
CN110986888A (en) Aerial photography integrated method
CN112461204A (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
CN102944308B (en) Attitude error correcting method of time-space joint modulation interference imaging spectrometer
Maurice et al. A photogrammetric approach for map updating using UAV in Rwanda
Zongjian et al. Accuracy analysis of low altitude photogrammetry with wide-angle camera
CN101776451A (en) CCD mapping camera capable of self-stabilizing and self-correcting motion distortion
Amami et al. Investigations into utilizing low-cost amateur drones for creating ortho-mosaic and digital elevation model
Liu et al. Comparison of DEM accuracies generated from different stereo pairs over a plateau mountainous area
Javanmardi et al. Precise mobile laser scanning for urban mapping utilizing 3D aerial surveillance data
Deltsidis et al. Orthorectification of World View 2 stereo pair using a new rigorous orientation model
Zareei et al. Virtual ground control for survey-grade terrain modelling from satellite imagery

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20120704