CN102538764A - Combined type image pair three-dimensional location method
- Publication number: CN102538764A
- Application number: CN201110448707A
- Authority: CN (China)
- Prior art date: 2011-12-28
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Classification: Image Processing (AREA)
Abstract
The invention relates to a combined-type image-pair three-dimensional location method comprising the following steps: step 1, establishing, for each type of sensor, a construction image model corresponding to its imaging mode; step 2, selecting two acquired images of the same or different types as a stereo image pair and, from known control conditions, solving the model parameters of the construction image model of each image, thereby obtaining construction image models with known parameters for the pair; step 3, selecting a target point, substituting its two image coordinates on the pair into the corresponding known-parameter construction image models of step 2, and jointly solving the geodetic coordinates of the target point; and step 4, repeating step 3 to solve the geodetic coordinates of the whole image area and thus accomplish location. The combined-type image-pair three-dimensional location method ensures that position information of key targets in a region of interest can be obtained accurately under special or extreme circumstances.
Description
Technical field
The invention belongs to the field of analytical aerial triangulation within photogrammetry and remote sensing, and in particular relates to a "combined-type" image-pair stereo location method based on the construction image models of multiple sensors.
Background art
Stereo location, i.e. stereophotogrammetry, is a technique that determines the shape, size, spatial position and nature of a target from observations and measurements made on a stereo image pair. A stereo image pair (stereogram) is a pair of photographs of the same area with overlapping imagery taken from different exposure stations. In conventional photogrammetry and remote sensing, a stereogram is by default assumed to be acquired by the same sensor. In fact, any two photographs, whether or not they come from the same sensor, can form a stereogram as long as they are "taken from different exposure stations" and "share a certain overlap"; the "combined-type" image pair described in this application exploits this to achieve stereo location.
Depending on the sensor type, the images that can be obtained include SAR images and optical images. SAR (Synthetic Aperture Radar) is a radar system that uses a moving radar carried on a satellite or aircraft to achieve the same resolution as a large antenna. It measures the ground by transmitting radar waves and receiving the reflected echoes, and has certain advantages over optical measurement, such as being unaffected by weather and time of day.
Scholars abroad have proposed combining SAR images and optical images for stereo mapping since the 1970s, but no mature results have been achieved so far: positioning accuracy remains low, or the methods are suitable only for images of the same type.
Related terminology: aerial triangulation is a measuring method in stereophotogrammetry that densifies control points indoors from a small number of field control points, obtaining the elevations and planimetric positions of pass points. Analytical aerial triangulation, made possible by the development of computers, uses the image point coordinates measured on remote sensing photographs and a small number of ground control points, applies rigorous mathematical formulas, and solves the planimetric coordinates and elevations of the points to be located on a digital computer according to the principle of least squares.
Examples of spaceborne SAR systems of various countries are, for the United States: Seasat-1, SIR-A, SIR-B, SIR-C, LACROSSE SAR, LightSAR and Medsat SAR; for Europe: ERS-1, ERS-2, X-SAR and ASAR; and for Canada: Radarsat-1 and Radarsat-2.
A DEM (Digital Elevation Model) is a data set of the planimetric coordinates (X, Y) and elevations (Z) of regular grid points within a certain area. It mainly describes the spatial distribution of the regional terrain, and is formed by data acquisition (including sampling and measurement) from contour lines or similar stereo models followed by interpolation.
Measurement adjustment: a computational method that uses the principle of least squares to reasonably adjust observation errors and to evaluate the precision of the survey results. Its purpose is to eliminate contradictions among the observations so as to obtain the most reliable result together with an assessment of its accuracy.
Summary of the invention
" combined type " of the present invention as the purpose of stereo localization method provide a kind of can utilize multiple sensors difference constantly, the same type of the areal that obtains of diverse location or dissimilar remote sensing images form stereogram, and calculate the method that obtains the terrain object volume coordinate.The present invention can blocked owing to cloud layer and reason such as atmospheric pollution causes the decline of obtaining the optical image quality, and under the SAR stereoplotting situation that is difficult to again carry out, realizes that image that different sensors is obtained constitutes " combined type " as to positioning.
To achieve the object of the invention, the scheme of the present invention is a combined-type image-pair stereo location method with the following steps:
Step 1), according to the imaging modes of the various sensors, establish the construction image model corresponding to each sensor;
Step 2), select two acquired images of identical or different types as a stereo image pair, select from step 1) the construction image model corresponding to each image of the pair, and solve the model parameters of each construction image model from the known control conditions, obtaining construction image models with known parameters for the pair;
Step 3), select a target point, substitute its two image coordinates on the image pair into the corresponding known-parameter construction image models of step 2), and jointly solve the geodetic coordinates of the target point;
Step 4), repeat step 3) to solve the geodetic coordinates of the whole image area and accomplish location.
The construction image models of step 1) include: the frame central projection model for optical frame images; the line central projection model for optical linear-array CCD images; the F.Leberl model for SAR images; and the rational function model, which is applicable to all image types. In step 3), the geodetic coordinates of the target point are computed by simultaneous adjustment.
The present invention is a "combined-type" image-pair stereo location method based on the construction image models of multiple sensors. In step 3) of the scheme, since the images can be paired in many ways, there are many pairwise combinations; the two known-parameter construction image models of the pair are combined into one joint construction image model of the combined-type pair. Because the model parameters have already been obtained from the control conditions, substituting the two sets of image coordinates of the same target point on the two images allows the geodetic coordinates of the corresponding ground point to be solved by photogrammetric adjustment. By progressively increasing the number of target points, stereo location of the target area is finally achieved. The whole process is carried out on a computer platform.
The "combined-type" image-pair stereo location method of the present invention can form stereograms from remote sensing images obtained by different types of sensors and locate from them, and therefore has wider applicability than conventional stereo location methods. Taking the rigorous or approximately rigorous construction image models of the various sensors as the basis of the "combined-type" stereo location model makes more efficient use of the various remote sensing image data, and guarantees that the spatial information of key targets in a region of interest can still be obtained accurately when the data for a conventional stereo pair are incomplete. More importantly, it guarantees that position information of key targets in a region of interest can be obtained accurately under special or extreme circumstances. In particular, when channels for acquiring remote sensing image data are blocked, "combined-type" stereograms can be formed from combinations of new and old images, of domestic and foreign images, or of images of different types. Under certain conditions "combined-type" stereo mapping can even be carried out, enabling rapid updating of the terrain data of the region of interest.
Description of drawings
" combined type " that Fig. 1 optics and SAR sensor constitute is as stereo location synoptic diagram;
Fig. 2 " combined type " is as stereo localization method process flow diagram;
The object-image relation of Fig. 3 picture formula photo;
The object-image relation of Fig. 4 line array CCD;
Fig. 5 distance condition synoptic diagram;
Fig. 6 Doppler condition synoptic diagram;
Four width of cloth satellite image corresponding regions and reference mark distribution schematic diagram in Fig. 7 specific embodiment one;
In Fig. 8 specific embodiment one six " combined types " as error statistics to the checkpoint, ◇ directions X error, Y deflection error, △ Z deflection error;
The spatial relation of four satellites in Fig. 9 specific embodiment one;
Four width of cloth image corresponding regions and checkpoint distribution schematic diagram in Figure 10 specific embodiment two;
In Figure 11 specific embodiment two six " combined types " as error statistics to the checkpoint, ◇ directions X error, Y deflection error, △ Z deflection error;
The spatial relation synoptic diagram of four width of cloth images in Figure 12 specific embodiment two.
Embodiment
First, the construction image models used in the embodiments below are introduced: the frame central projection model for optical frame images; the line central projection model for optical linear-array CCD images; the F.Leberl model for SAR images; and the rational function model applicable to all image types.
1. Commonly used sensor construction image models
(1) Frame central projection model
A frame image usually follows the frame central projection model: each scene has a unique projection centre (the camera lens centre S), and the rays reflected from each point of the photographed ground area pass through the projection centre and form the image on the image plane, as shown in Fig. 3. O-XYZ is the ground coordinate system, S-xyz is the photogrammetric coordinate system with its origin at the exposure station, o-xy is the photo coordinate system, and So is the direction of the principal optical axis.
The ground point P(X, Y, Z), its image point p(x, y) and the projection centre S are collinear, so the rigorous imaging model of a frame image is

$$
x=-f\,\frac{a_1(X-X_S)+b_1(Y-Y_S)+c_1(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)},\qquad
y=-f\,\frac{a_2(X-X_S)+b_2(Y-Y_S)+c_2(Z-Z_S)}{a_3(X-X_S)+b_3(Y-Y_S)+c_3(Z-Z_S)}\tag{1}
$$

where f is the focal length of the camera, (x, y) are the photo coordinates of the image point p in o-xy, (X, Y, Z) are the coordinates of the ground point P in the ground coordinate system O-XYZ, (X_S, Y_S, Z_S) are the coordinates of the exposure station S in O-XYZ, and a_i, b_i, c_i (i = 1, 2, 3) are the elements of the rotation matrix determined by the three angular exterior orientation elements φ, ω, κ.
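For illustration only, a minimal numerical sketch of equation (1) follows; it is not part of the patent, and the function names and the φ-ω-κ rotation sequence are assumptions. It builds the rotation matrix from the angular elements and projects a ground point into photo coordinates.

```python
import numpy as np

def rotation_matrix(phi, omega, kappa):
    """Rotation matrix R from the three angular exterior orientation elements.
    The phi-omega-kappa sequence used here is one common photogrammetric
    convention (an assumption, not fixed by the patent text)."""
    Rp = np.array([[np.cos(phi), 0.0, -np.sin(phi)],
                   [0.0,         1.0,  0.0],
                   [np.sin(phi), 0.0,  np.cos(phi)]])
    Ro = np.array([[1.0, 0.0,            0.0],
                   [0.0, np.cos(omega), -np.sin(omega)],
                   [0.0, np.sin(omega),  np.cos(omega)]])
    Rk = np.array([[np.cos(kappa), -np.sin(kappa), 0.0],
                   [np.sin(kappa),  np.cos(kappa), 0.0],
                   [0.0,            0.0,           1.0]])
    return Rp @ Ro @ Rk

def collinearity_project(ground_xyz, station_xyz, angles, f):
    """Photo coordinates (x, y) of a ground point according to equation (1)."""
    col1, col2, col3 = rotation_matrix(*angles).T   # columns of R: (a1,b1,c1), (a2,b2,c2), (a3,b3,c3)
    d = np.asarray(ground_xyz, float) - np.asarray(station_xyz, float)
    denom = col3 @ d                                # a3*dX + b3*dY + c3*dZ
    return -f * (col1 @ d) / denom, -f * (col2 @ d) / denom

# A vertical photo 1000 m above the point maps it to the photo centre (0, 0).
print(collinearity_project((500.0, 500.0, 0.0), (500.0, 500.0, 1000.0), (0.0, 0.0, 0.0), 0.15))
```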
(2) Line central projection model
A linear-array CCD image usually follows the line central projection model. Because the image is formed by the linear-array sensor push-brooming along the flight direction, each scan line has a rigorous central projection relationship with the object and its own exterior orientation elements. During the flight, however, the attitude of the platform can be assumed to change quite smoothly, so the origin of the image plane coordinates of each scene can be taken at the midpoint of the central scan line and the exterior orientation elements of each scan line can be assumed to vary linearly with the coordinate y. The exterior orientation elements are then described as

$$
X_{Si}=X_{S0}+y\,\dot X_S,\quad Y_{Si}=Y_{S0}+y\,\dot Y_S,\quad Z_{Si}=Z_{S0}+y\,\dot Z_S,\quad
\varphi_i=\varphi_0+y\,\dot\varphi,\quad \omega_i=\omega_0+y\,\dot\omega,\quad \kappa_i=\kappa_0+y\,\dot\kappa\tag{2}
$$

where the subscript i denotes the exterior orientation elements of the i-th scan line, y is the image coordinate of that scan line along the flight direction, the subscript 0 denotes the exterior orientation elements of the central scan line, and the dotted quantities are the first-order rates of change of the exterior orientation elements.
As shown in Fig. 4, the central projection relationship between an image point on the i-th scan line and the corresponding ground point is

$$
x_i=-f\,\frac{a_1(X-X_{Si})+b_1(Y-Y_{Si})+c_1(Z-Z_{Si})}{a_3(X-X_{Si})+b_3(Y-Y_{Si})+c_3(Z-Z_{Si})},\qquad
0=-f\,\frac{a_2(X-X_{Si})+b_2(Y-Y_{Si})+c_2(Z-Z_{Si})}{a_3(X-X_{Si})+b_3(Y-Y_{Si})+c_3(Z-Z_{Si})}\tag{3}
$$

or, written in matrix form,

$$
\begin{bmatrix}X-X_{Si}\\ Y-Y_{Si}\\ Z-Z_{Si}\end{bmatrix}
=\lambda\,R_i\begin{bmatrix}x_i\\ 0\\ -f\end{bmatrix}\tag{4}
$$

where (x_i, 0) are the image plane coordinates of the image point on the i-th line, (X, Y, Z) are the ground coordinates of the corresponding ground point, (X_Si, Y_Si, Z_Si) are the exposure station coordinates of the i-th line, λ is a scale factor, and a_i, b_i, c_i (i = 1, 2, 3) are the nine elements of the rotation matrix R_i determined by the angular exterior orientation elements φ_i, ω_i, κ_i of the i-th line.
(3) F.Leberl model
From the SAR imaging principle, the distance from an image point to the radar antenna can be determined from the length of the target echo delay, which fixes the image plane position of the target image point in the range direction; the image plane position in the azimuth direction can be determined from the Doppler characteristics of the target echo through azimuth compression. The F.Leberl construction image model is the mathematical model that expresses these two aspects of instantaneous radar image formation.
① Range condition
As shown in Fig. 5, D_S0 is the near-range scan delay, V_S is the velocity along the flight direction, R_S is the slant range from the antenna centre S to the ground point P, H is the flying height of the antenna centre S above the data reference plane, y_s is the range image coordinate of the ground point P on a slant-range image, y_g is the range image coordinate of P on a ground-range image, R_0 is the projection of the scan delay onto the data reference plane, M_y is the range pixel resolution of the slant-range image, and m_y is the range pixel resolution of the ground-range image.
For a slant-range image,

$$
(X-X_S)^2+(Y-Y_S)^2+(Z-Z_S)^2=(y_sM_y+D_{S0})^2\tag{6}
$$
where (X, Y, Z) are the object space coordinates of the ground point P and (X_S, Y_S, Z_S) are the object space coordinates of the instantaneous antenna centre position S, a polynomial function of the flight time T that can be expressed as

$$
X_S=X_{S0}+X_{V0}T+\tfrac{1}{2}X_{A0}T^2,\quad
Y_S=Y_{S0}+Y_{V0}T+\tfrac{1}{2}Y_{A0}T^2,\quad
Z_S=Z_{S0}+Z_{V0}T+\tfrac{1}{2}Z_{A0}T^2,\qquad T=K_x x\tag{7}
$$

where X_S0, Y_S0, Z_S0 are the instantaneous object space coordinates of the radar antenna centre at the image origin, X_V0, Y_V0, Z_V0 are the components of the platform velocity at the image origin (the first-order rates of change of the exterior orientation elements), X_A0, Y_A0, Z_A0 are the components of the platform acceleration at the image origin (the second-order rates of change), T is the flight time of the image coordinate x relative to the origin epoch, x is the azimuth image coordinate of the radar image, and K_x is the scan time per line in the azimuth direction.
If the velocity of the antenna centre at any instant is (X_V, Y_V, Z_V), then from (7)

$$
X_V=X_{V0}+X_{A0}T,\qquad Y_V=Y_{V0}+Y_{A0}T,\qquad Z_V=Z_{V0}+Z_{A0}T\tag{8}
$$
Similarly, for a ground-range image,

$$
(X-X_S)^2+(Y-Y_S)^2+(Z-Z_S)^2=(y_gm_y+R_0)^2+H^2\tag{9}
$$
② Doppler condition
As shown in Fig. 6, τ is the angle between the iso-Doppler cone and the vertical direction, D-XYZ is the ground coordinate system, and the other variables are defined as in Fig. 5.
The Doppler condition of a radar image is

$$
f_{DC}=\frac{2}{\lambda R_S}\bigl(\vec V_S-\vec V_P\bigr)\cdot\bigl(\vec R_P-\vec R_S\bigr)\tag{10}
$$

where f_DC is the Doppler centroid frequency, λ is the radar wavelength, V_S and V_P are the velocity vectors of S and P, and R_S and R_P are the position vectors of S and P. Assuming the ground point is stationary, i.e. V_P = 0, equation (10) reduces to

$$
f_{DC}=\frac{2}{\lambda R_S}\,\vec V_S\cdot\bigl(\vec R_P-\vec R_S\bigr)\tag{11}
$$

where R_S in the denominator is the slant range to the ground point P.
F.Leberl adopts the strictly side-looking geometry, or zero-Doppler condition, i.e. τ = 0; the satellite flight velocity is then perpendicular to the vector from S to P and f_DC = 0, hence the name zero-Doppler condition. Equation (11) then becomes

$$
X_V(X-X_S)+Y_V(Y-Y_S)+Z_V(Z-Z_S)=0\tag{12}
$$
Equation (6) (or (9) for a ground-range image) together with (12) constitutes the F.Leberl model. The imaging equations contain twelve orientation parameters, X_S0, Y_S0, Z_S0, X_V0, Y_V0, Z_V0, X_A0, Y_A0, Z_A0, D_S0, K_x and M_y (or m_y); usually D_S0, K_x and M_y (or m_y) can be taken directly from the system parameters of the radar.
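The two F.Leberl conditions can be evaluated together as misclosure functions, as in the sketch below; this is an illustration rather than the patent's code, and the dictionary keys naming the orientation parameters are assumptions. Both residuals are zero when the ground point, the image coordinates and the orientation parameters are consistent.

```python
import numpy as np

def leberl_residuals(ground_xyz, img_xy, params):
    """Misclosures of the range condition (6) and the zero-Doppler condition (12)
    for a slant-range SAR image."""
    x, y = img_xy                                    # azimuth (x) and range (y) image coordinates
    P = np.asarray(ground_xyz, float)
    T = params["Kx"] * x                             # flight time of this line, eq. (7)
    S0, V0, A0 = (np.asarray(params[k], float) for k in ("S0", "V0", "A0"))
    S = S0 + V0 * T + 0.5 * A0 * T**2                # antenna centre position, eq. (7)
    V = V0 + A0 * T                                  # antenna centre velocity, eq. (8)
    slant_range = y * params["My"] + params["Ds0"]   # range pixel size and near-range delay
    f_range = (P - S) @ (P - S) - slant_range**2     # eq. (6)
    f_doppler = V @ (P - S)                          # eq. (12)
    return f_range, f_doppler
```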
(4) Rational function model
① Forward form of the rational function model
The rational function model (Rational Function Model, RFM) expresses the image point coordinates (r, c) as ratios of polynomials in the corresponding ground point space coordinates (X, Y, Z), that is,

$$
r_n=\frac{p_1(X_n,Y_n,Z_n)}{p_2(X_n,Y_n,Z_n)},\qquad
c_n=\frac{p_3(X_n,Y_n,Z_n)}{p_4(X_n,Y_n,Z_n)}\tag{13}
$$

where p_i(X_n, Y_n, Z_n) (i = 1, 2, 3, 4) are ordinary polynomials of degree at most three, of the form

$$
\begin{aligned}
p_i(X,Y,Z)=\;&a_0+a_1Z+a_2Y+a_3X+a_4ZY+a_5ZX+a_6YX+a_7Z^2+a_8Y^2+a_9X^2\\
&+a_{10}ZYX+a_{11}Z^2Y+a_{12}Z^2X+a_{13}Y^2Z+a_{14}Y^2X+a_{15}ZX^2+a_{16}YX^2\\
&+a_{17}Z^3+a_{18}Y^3+a_{19}X^3
\end{aligned}
$$
The polynomial coefficients a_0, …, a_19 are called rational function coefficients (Rational Function Coefficients, RFCs).
In equation (13), (r_n, c_n) and (X_n, Y_n, Z_n) are the normalized coordinates of the image point coordinates (r, c) and the ground point coordinates (X, Y, Z) after translation and scaling, with values in the range (−1.0, +1.0). The purpose of using normalized coordinates in the RFM is to reduce the rounding errors introduced during computation by large differences in the magnitudes of the data.
② Inverse form of the rational function model
The inverse form of the rational function model is

$$
X_n=\frac{p_5(r_n,c_n,Z_n)}{p_6(r_n,c_n,Z_n)},\qquad
Y_n=\frac{p_7(r_n,c_n,Z_n)}{p_8(r_n,c_n,Z_n)}\tag{14}
$$
where the polynomials p_i (i = 5, 6, 7, 8) have the form

$$
\begin{aligned}
p_i(r,c,Z)=\;&a'_0+a'_1Z+a'_2c+a'_3r+a'_4cZ+a'_5rZ+a'_6cr+a'_7Z^2+a'_8c^2+a'_9r^2\\
&+a'_{10}crZ+a'_{11}cZ^2+a'_{12}rZ^2+a'_{13}c^2Z+a'_{14}c^2r+a'_{15}r^2Z+a'_{16}cr^2\\
&+a'_{17}Z^3+a'_{18}c^3+a'_{19}r^3
\end{aligned}
$$
Unlike the collinearity equations, the rational function model provides only one-way transformations, from object space to image space or from image space (with elevation) to object space; the inverse of a given transformation requires linearizing the forward model and is accomplished by an iterative procedure starting from suitable initial values.
2. Solving the parameters of the construction image models
(1) Frame central projection model
Linearizing equation (1) yields the error equations

$$
\begin{aligned}
v_x&=\frac{\partial x}{\partial X_S}dX_S+\frac{\partial x}{\partial Y_S}dY_S+\frac{\partial x}{\partial Z_S}dZ_S+\frac{\partial x}{\partial \varphi}d\varphi+\frac{\partial x}{\partial \omega}d\omega+\frac{\partial x}{\partial \kappa}d\kappa-l_x\\
v_y&=\frac{\partial y}{\partial X_S}dX_S+\frac{\partial y}{\partial Y_S}dY_S+\frac{\partial y}{\partial Z_S}dZ_S+\frac{\partial y}{\partial \varphi}d\varphi+\frac{\partial y}{\partial \omega}d\omega+\frac{\partial y}{\partial \kappa}d\kappa-l_y
\end{aligned}\tag{15}
$$

with constant terms as in (16), l_x = x_obs − x_calc and l_y = y_obs − y_calc, where x_obs, y_obs denote the observed image point coordinates and x_calc, y_calc the image point coordinates computed from equation (1), with reference to (17) and (18), using the current approximate values of the parameters.
Each frame image has six exterior orientation elements; with three or more non-collinear ground control points, the six exterior orientation elements of the image can be solved by space resection according to (15).
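As an illustration of solving (15) by least squares, the sketch below performs a space resection with a finite-difference Jacobian in place of the analytic partial derivatives of (17)-(18); it reuses collinearity_project from the earlier sketch and is an assumption-laden outline, not the patent's implementation.

```python
import numpy as np

def space_resection(gcps, obs_xy, f, eo_init, iters=20):
    """Gauss-Newton estimation of the six exterior orientation elements
    (Xs, Ys, Zs, phi, omega, kappa) from >= 3 non-collinear ground control points."""
    eo = np.asarray(eo_init, float)
    for _ in range(iters):
        A, l = [], []
        for (X, Y, Z), (x_obs, y_obs) in zip(gcps, obs_xy):
            def project(p):
                return np.array(collinearity_project((X, Y, Z), p[:3], p[3:], f))
            xy0 = project(eo)
            J = np.empty((2, 6))
            for k in range(6):                       # finite-difference coefficients of (15)
                d = np.zeros(6); d[k] = 1e-5
                J[:, k] = (project(eo + d) - xy0) / 1e-5
            A.append(J)
            l.append([x_obs - xy0[0], y_obs - xy0[1]])   # constant terms l_x, l_y
        corr = np.linalg.lstsq(np.vstack(A), np.concatenate(l), rcond=None)[0]
        eo += corr
        if np.linalg.norm(corr) < 1e-8:
            break
    return eo
```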
(2) Line central projection model
Linearizing equation (3) gives error equations (19), in which the unknowns are the corrections to the exterior orientation elements and the ground point coordinate corrections dX, dY, dZ, the coefficients of the corrections are C_ij, and l_x, l_y are the constant terms. Written in matrix form,

$$
V=A\hat X+BX_g-L,\qquad L=\begin{bmatrix}l_x & l_y\end{bmatrix}^{T}
$$

where A is the coefficient matrix, L is the observation vector, X̂ is the vector of unknown orientation parameters, X_g is the vector of ground coordinate corrections, V is the vector of observation errors, and the weight matrix P is in general symmetric positive definite.
Each image point on the image gives two error equations of the form (19); for a control point dX = dY = dZ = 0, and the equations then contain 12 unknown parameters, which can be solved using six or more non-collinear ground control points.
(3) F.Leberl construction image model
Let F_1 and F_2 denote equations (6) and (12) respectively; then

$$
F_1=0,\qquad F_2=0\tag{23}
$$

Equation (23) is nonlinear and cannot be used directly for adjustment, so it must first be linearized. Linearization gives the error equations (24), in which dX_S0, …, dZ_A0 are the corrections to the exterior orientation elements, dX, dY, dZ are the corrections to the ground point coordinates, A_ij are the coefficients of the orientation parameter corrections, B_ij are the coefficients of the ground coordinate corrections, F_1 and F_2 are the constant terms, v_x and v_y are the image coordinate residuals, and C_ij are the coefficients of the residuals. With the correction vector of the orientation parameters

$$
\Delta=\begin{bmatrix}dX_{S0}& dY_{S0}& dZ_{S0}& dX_{V0}& dY_{V0}& dZ_{V0}& dX_{A0}& dY_{A0}& dZ_{A0}\end{bmatrix}^{T}
$$

and the constant term L = −[F_1, F_2]^T, equation (24) is written in matrix form as

$$
A\Delta+BX-L=CV,\qquad \text{with weight matrix } P\tag{25}
$$

According to (25), at least five ground control points satisfying certain distribution conditions are required, and the orientation parameters can then be solved iteratively by the principle of least squares.
(4) Rational function model
The linearization procedure of the RFM is as follows.
First, equation (13) is rewritten in the form (26) by multiplying out the denominators. Let

$$
B=\begin{pmatrix}1& Z_n& Y_n& X_n&\cdots& Y_n^{3}& X_n^{3}\end{pmatrix}\begin{pmatrix}1& b_1&\cdots& b_{19}\end{pmatrix}^{T},\qquad
D=\begin{pmatrix}1& Z_n& Y_n& X_n&\cdots& Y_n^{3}& X_n^{3}\end{pmatrix}\begin{pmatrix}1& d_1&\cdots& d_{19}\end{pmatrix}^{T}
$$
$$
J=\begin{pmatrix}a_0& a_1&\cdots& a_{19}& b_1& b_2&\cdots& b_{19}\end{pmatrix}^{T},\qquad
K=\begin{pmatrix}c_0& c_1&\cdots& c_{19}& d_1& d_2&\cdots& d_{19}\end{pmatrix}^{T}
$$
The error equations can then be expressed as follows. Given N control points (N > 39), the error equations (27) can be listed (taking the row coordinate r as an example) and written as

$$
V_r=M_1J-R
$$

so that the least squares solution of the coefficient vector J is

$$
J=(M_1^{T}M_1)^{-1}M_1^{T}R
$$

Similarly, the error equation for the image coordinate c is V_c = M_2K − C, which gives K = (M_2^T M_2)^{-1} M_2^T C.
The RFCs can be solved in two ways, by direct solution or by iterative solution. The direct solution sets all B_i, D_i (i = 1, 2, …, N) to 1 and takes the resulting coefficients J and K as the final result. The iterative solution takes the result of the direct solution as the initial value and obtains the optimal result through iteration under suitable conditions. Because the error equations used to compute the RFCs are a linearized mathematical model, iteration is generally required to obtain the optimal solution.
3, " combined type " as the foundation of stereo location model with resolve
" combined type " as the stereo localization method, it mainly comprises following process: at first, the type right according to picture selects to be suitable for the right structure of picture as model; According to existing controlled condition, calculate as right structure as model parameter; Aforementioned 1; 2 liang of steps have specifically been narrated; Be exactly below according to " combined type " as the structure of each image of centering as model (known models parameter); Set up multi-form " combined type " as to " " find the solution at last, obtains terrestrial coordinate by (different pictures are different as model to the associating structure that constitutes) as model for the associating structure.Fig. 2 has provided a kind of process flow diagram of the inventive method.
Carrying out " combined type " with optical sensor and SAR sensor below orientates example as as stereo and specifies the present invention.Be illustrated in figure 1 as an optical sensor and SAR sensor and carry out " combined type " synoptic diagram as the stereo location.
Since there are redundant observations, the least squares method is used to compute the best approximate values of the unknowns. The adjustment equations are, in matrix form,

$$
CV+DX+L=0,\qquad \text{with weight matrix } P\tag{28}
$$

where C is the coefficient matrix of the residuals, V is the vector of residuals, D is the coefficient matrix of the ground coordinate corrections, X = (ΔX, ΔY, ΔZ)^T is the vector of ground coordinate corrections, L is the constant vector and P is the weight matrix. The equation can be reduced to

$$
V=D'X-L',\qquad D'=-C^{-1}D,\qquad L'=C^{-1}L
$$

and the increments of X, Y and Z are then computed as

$$
X=(D'^{T}PD')^{-1}(D'^{T}PL')\tag{29}
$$
Given initial values, the increments ΔX, ΔY, ΔZ of the ground coordinates are computed iteratively until they are sufficiently small, finally yielding X, Y and Z.
The values of the elements of C, D and L in (29) differ according to the construction image model of each photograph; the expressions of the matrix elements for the frame central projection model, the line central projection model, the F.Leberl model and the RFM are given in turn below.
① Frame central projection model
② Line central projection model
③ F.Leberl model
④ RFM
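A minimal Gauss-Newton sketch of this joint solution is given below, assuming each known-parameter construction image model of the pair can be evaluated as a callable g(X, Y, Z) -> (r, c) (for a frame image, for instance, the collinearity_project sketch above). Numerical derivatives and unit weights stand in for the analytic C, D matrices and the weight matrix P of (28); this is an illustration, not the patent's implementation.

```python
import numpy as np

def joint_intersection(obs_xy, models, xyz0, iters=10, step=1e-3):
    """Jointly solve the ground coordinates of a target point observed on the
    two (or more) images of a "combined-type" pair.

    obs_xy : observed image coordinates (r, c), one pair per image
    models : callables g(XYZ) -> predicted (r, c), the known-parameter models
    xyz0   : initial approximation of (X, Y, Z)
    """
    X = np.asarray(xyz0, float)
    for _ in range(iters):
        A, l = [], []
        for (r_obs, c_obs), g in zip(obs_xy, models):
            r0, c0 = g(X)
            J = np.empty((2, 3))
            for k in range(3):                        # numerical Jacobian w.r.t. X, Y, Z
                dX = np.zeros(3); dX[k] = step
                r1, c1 = g(X + dX)
                J[:, k] = [(r1 - r0) / step, (c1 - c0) / step]
            A.append(J)
            l.append([r_obs - r0, c_obs - c0])
        dXYZ = np.linalg.lstsq(np.vstack(A), np.concatenate(l), rcond=None)[0]
        X += dXYZ                                     # increments of X, Y, Z as in (29)
        if np.linalg.norm(dXYZ) < 1e-6:               # iterate until the increments are small
            break
    return X
```

Only the construction image models behind the callables change from pair to pair; the joint solution itself stays the same.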
4. The effect of the invention is illustrated below with concrete experiments and experimental data.
Explanation of the tables used:
Table 1: parameter statistics of the experimental images in experiment one;
Table 2: location accuracy statistics of the "combined-type" image pairs in experiment one;
Table 3: relevant parameters of the "combined-type" image pairs in experiment one;
Table 4: parameter statistics of the experimental images in experiment two;
Table 5: location accuracy statistics of the "combined-type" image pairs in experiment two;
Table 6: relevant parameters of the "combined-type" image pairs in experiment two.
Experiment one
Remote sensing images of the Beijing area acquired by four satellites at different times were selected; their parameters are given in Table 1. Fig. 7 shows the coverage of the four images and the distribution of the ground control points, 50 in total, represented by triangles: blue triangles are the 27 control points on the optical images, red triangles are the 23 control points on the SAR images, and green circles are the 35 checkpoints.
Pairwise combination of the four images yields six stereograms in total; their location results are listed in Table 2. The relevant parameters of each stereogram, including image overlap (estimated by area), intersection angle, base length and base-height ratio, were computed from the orientation parameters of the images and are given in Table 3. In addition, Fig. 8 gives the error statistics of all checkpoints for the six stereograms, and Fig. 9 shows the spatial relationship and earth observation geometry of the four satellites.
The experimental results show that the "combined-type" image-pair stereo location method is correct and feasible, but that the factors affecting its location accuracy differ from those of conventional stereo location.
Table 1 Parameter statistics of the experimental images
Table 2 "Combined-type" image-pair location accuracy statistics
Table 3 Relevant parameters of the "combined-type" image pairs
Experiment two
One IKONOS satellite image, two SPOT-4 satellite images and one airborne SAR image of the Tai'an area, Shandong Province, were selected; their parameters are given in Table 4. Figure 10 shows the coverage of the four images and marks the distribution of the 28 ground checkpoints with red triangles.
Pairwise combination of the four images yields six stereograms in total; their location results are listed in Table 5. The relevant parameters of each stereogram, computed from the orientation parameters of the images, are given in Table 6. Figure 11 gives the error statistics of all checkpoints for the six stereograms, and Figure 12 shows the spatial relationship and earth observation geometry of the four images. The results of this embodiment are similar to those of embodiment one.
Table 4 Parameter statistics of the experimental images
Table 5 "Combined-type" image-pair location accuracy statistics
Table 6 Relevant parameters of the "combined-type" image pairs
Claims (3)
1. A combined-type image-pair stereo location method, characterized in that the steps are as follows:
Step 1), according to the imaging modes of the various sensors, establishing the construction image model corresponding to each sensor;
Step 2), selecting two acquired images of identical or different types as a stereo image pair, selecting from step 1) the construction image models corresponding to the images of said pair, and solving, from the known control conditions, the model parameters of the construction image model of each image of said pair, thereby obtaining construction image models with known parameters for the pair;
Step 3), selecting a target point, substituting its two image coordinates on the image pair respectively into the corresponding known-parameter construction image models of step 2), and jointly solving the geodetic coordinates of the target point;
Step 4), repeating step 3) to solve the geodetic coordinates of the whole image area and accomplish location.
2. The combined-type image-pair stereo location method according to claim 1, characterized in that the construction image models of step 1) comprise: the frame central projection model corresponding to optical frame images; the line central projection model corresponding to optical linear-array CCD images; the F.Leberl model corresponding to SAR images; and the rational function model applicable to all image types.
3. The combined-type image-pair stereo location method according to claim 1, characterized in that, in step 3), the geodetic coordinates of the target point are computed by simultaneous adjustment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011104487075A | 2011-12-28 | 2011-12-28 | Combined type image pair three-dimensional location method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102538764A (en) | 2012-07-04 |
Family
ID=46346257
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| C12 | Rejection of a patent application after its publication | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20120704 |