CN103644897A - Three-dimensional surveying method based on super-resolution image reconstruction

Info

Publication number
CN103644897A
CN103644897A (application CN201310692480.8A); granted publication CN103644897B
Authority
CN
China
Prior art keywords
stereopsis
dimensional
outer orientation
coordinate
super
Prior art date
Legal status
Granted
Application number
CN201310692480.8A
Other languages
Chinese (zh)
Other versions
CN103644897B (en)
Inventor
邸凯昌
彭嫚
李力
刘召芹
孙义威
Current Assignee
Institute of Remote Sensing and Digital Earth of CAS
Original Assignee
Institute of Remote Sensing and Digital Earth of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Remote Sensing and Digital Earth of CAS filed Critical Institute of Remote Sensing and Digital Earth of CAS
Priority to CN201310692480.8A priority Critical patent/CN103644897B/en
Publication of CN103644897A publication Critical patent/CN103644897A/en
Application granted granted Critical
Publication of CN103644897B publication Critical patent/CN103644897B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a three-dimensional surveying method based on super-resolution image reconstruction. The method comprises: generating a super-resolution stereo image from stereo images taken at multiple camera stations; calibrating the stereo camera with a three-dimensional calibration field; establishing an east-north-up right-handed coordinate system with the left image of the first station as reference; performing multi-baseline bundle adjustment on the basis of the calibrated relative pose of the stereo camera to determine the orientation elements of every stereo image; regarding the super-resolution image as having been taken by a virtual stereo camera and calibrating that virtual camera using, as ground truth, the three-dimensional coordinates obtained by multi-baseline forward intersection; estimating re-calibration affine-transformation coefficients by least squares from the interior and exterior parameters of the calibrated virtual camera; and performing forward intersection with the image-point coordinates of the target to be measured, the interior and exterior parameters of the virtual stereo camera and the re-calibration coefficients to obtain the three-dimensional coordinates of the target. The method can be widely applied to three-dimensional measurement from super-resolution images in terrestrial (close-range) photogrammetry.

Description

Three-dimensional measurement method based on super-resolution image reconstruction
Technical field
The present invention relates to a three-dimensional measurement method based on image reconstruction, and in particular to a three-dimensional measurement method based on super-resolution image reconstruction.
Background art
In photogrammetric applications, high-resolution images allow targets to be located accurately. To overcome the resolution limit of the CCD (Charge-Coupled Device) sensor, super-resolution (SR) reconstruction can be used to obtain high-resolution images. Super-resolution image reconstruction recovers a high-resolution (HR) image from multiple low-resolution (LR) frames of the same scene that are displaced with respect to each other and therefore contain redundant information; its basic principle is to exploit the non-redundant information carried by the slight differences between the low-resolution frames to improve image resolution. In close-range photogrammetry, multi-baseline photogrammetry can improve the three-dimensional measurement accuracy of the images, but it cannot by itself provide high-resolution images.
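For illustration only, the following minimal Python/NumPy sketch shows the shift-and-add idea behind multi-frame super-resolution: low-resolution samples are placed on a finer grid according to their registered sub-pixel shifts and averaged. The function and variable names are hypothetical, the shifts are assumed to come from a prior registration step, and the restoration (deblurring) stage mentioned above is omitted; this is not the specific reconstruction algorithm of the invention.

```python
import numpy as np

def shift_and_add_sr(lr_frames, shifts, scale):
    """Toy shift-and-add super-resolution (hypothetical helper).

    lr_frames : list of equally sized 2-D arrays (low-resolution frames)
    shifts    : per-frame (dy, dx) sub-pixel shifts relative to the first
                frame, in low-resolution pixels (from image registration)
    scale     : integer upsampling factor of the high-resolution grid
    """
    h, w = lr_frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for frame, (dy, dx) in zip(lr_frames, shifts):
        # place every low-resolution sample onto the high-resolution grid
        rows = (np.arange(h)[:, None] * scale + int(round(dy * scale))) % (h * scale)
        cols = (np.arange(w)[None, :] * scale + int(round(dx * scale))) % (w * scale)
        acc[rows, cols] += frame
        cnt[rows, cols] += 1
    cnt[cnt == 0] = 1                  # leave unfilled HR pixels at zero
    return acc / cnt                   # a deblurring/restoration step would follow
```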
Summary of the invention
In view of the above problems, the object of the present invention is to provide a three-dimensional measurement method based on super-resolution image reconstruction that provides high-resolution images together with high three-dimensional measurement accuracy.
To achieve the above object, the present invention adopts the following technical scheme. A three-dimensional measurement method based on super-resolution image reconstruction comprises the following steps:

1) Photograph the target to be measured with a stereo camera at four camera stations to obtain four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) of the target; after image registration and image restoration of the four pairs, obtain the super-resolution stereo pair (SR_L, SR_R) of the target to be measured.

2) Photograph a three-dimensional calibration field with the stereo camera and, from the calibration-field stereo pair (Calib_L, Calib_R), compute the interior orientation elements of the stereo camera and the relative position T(Vx, Vy, Vz) and attitude R between its left and right cameras.

3) Take the centre of the first-station left image L1 as the origin O, the east direction of L1 as the X axis, its north direction as the Y axis and the direction perpendicular to the XOY plane and pointing upwards as the Z axis, establishing the O-XYZ coordinate system; in this system the exterior-orientation position and attitude of L1 are (0, 0, 0) and the identity matrix. From the exterior-orientation position and attitude of L1 and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras, the exterior-orientation position and attitude of the first-station right image R1 are T(Vx, Vy, Vz) and R. The poses of L1 and R1 together give the exterior-orientation position and attitude of the first-station stereo pair (L1, R1).

4) From the exterior-orientation position and attitude of (L1, R1) and the homologous feature points on the four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4), compute the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4).

5) Regard the super-resolution stereo pair (SR_L, SR_R) of the target as having been taken by a virtual stereo camera and select evenly distributed control points on (SR_L, SR_R). Substitute the coordinates of the image points on the four stereo pairs that correspond to the control points on (SR_L, SR_R), together with the refined exterior-orientation positions and attitudes of (L1, R1), (L2, R2), (L3, R3) and (L4, R4), into the collinearity equations and perform multi-baseline intersection to obtain the three-dimensional coordinates of the control points on (SR_L, SR_R). Then calibrate the virtual stereo camera with these three-dimensional coordinates and the corresponding image-point coordinates on (SR_L, SR_R), obtaining its interior orientation elements and its exterior-orientation position and attitude.

6) Select three or more control points on (SR_L, SR_R); using their three-dimensional coordinates and the interior orientation elements, exterior-orientation position and attitude of the virtual stereo camera, set up a re-calibration affine transformation model for the left image SR_L and the right image SR_R respectively, and compute the re-calibration coefficients of the affine models by least squares.

7) Substitute the interior orientation elements and exterior-orientation position and attitude of the virtual stereo camera obtained in step 5), the re-calibration coefficients a0, a1, a2, b0, b1, b2 and c0, c1, c2, d0, d1, d2 of the affine models of SR_L and SR_R obtained in step 6), and the coordinates of the point to be measured on (SR_L, SR_R) into the collinearity equations, and perform forward intersection to obtain the three-dimensional coordinates of the target to be measured.
In step 2), obtaining the interior orientation elements of the stereo camera and the relative position T(Vx, Vy, Vz) and attitude R between its left and right cameras comprises the following steps. ① Photograph the three-dimensional calibration field with the stereo camera to obtain the calibration-field stereo pair (Calib_L, Calib_R), and use feature-point matching to obtain the image-point coordinates (x_tL, y_tL) and (x_tR, y_tR) of the control points in (Calib_L, Calib_R), with t > 6. ② Substitute the known three-dimensional coordinates (X_t, Y_t, Z_t) of the control points and their image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the collinearity equations, substitute the image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the pinhole-imaging distortion model, and perform a bundle adjustment on the combined results to obtain the interior orientation elements of the stereo camera, the exterior-orientation position T_Calib_L(X_LS, Y_LS, Z_LS) and attitude R_Calib_L of the left image Calib_L, and the exterior-orientation position T_Calib_R(X_RS, Y_RS, Z_RS) and attitude R_Calib_R of the right image Calib_R. ③ Substitute T_Calib_L(X_LS, Y_LS, Z_LS), R_Calib_L, T_Calib_R(X_RS, Y_RS, Z_RS) and R_Calib_R into

$$R = R_{Calib\_L}^{-1} \cdot R_{Calib\_R}, \qquad T = R_{Calib\_L}^{-1} \cdot (T_{Calib\_R} - T_{Calib\_L}),$$

to compute the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras of the stereo camera.
In step 4), obtaining the refined exterior-orientation positions and attitudes of the stereo pairs (L2, R2), (L3, R3) and (L4, R4) specifically comprises the following steps. ① Extract homologous feature points on the four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) of the target and obtain their image-point coordinates on each pair. ② Substitute the interior orientation elements of the stereo camera, the image-point coordinates of the feature points on the first-station pair (L1, R1) and the exterior-orientation position and attitude of (L1, R1) into the collinearity equations and perform forward intersection to obtain the three-dimensional coordinates of the feature points on (L1, R1). ③ From these three-dimensional coordinates and the image-point coordinates of the feature points on the left images L2, L3 and L4, perform space resection to obtain, in turn, initial values of the exterior-orientation positions T_i^left and attitudes R_i^left of L2, L3 and L4, i = 2, 3, 4. ④ From these initial values and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras, obtain initial values of the exterior-orientation positions T_i^right and attitudes R_i^right of the right images R2, R3 and R4:

$$T_i^{right} = R_i^{left} \cdot T + T_i^{left}, \qquad R_i^{right} = R_i^{left} \cdot R;$$

the left-image values T_i^left, R_i^left and the right-image values T_i^right, R_i^right together give the initial exterior-orientation positions T_i and attitudes R_i of the stereo pairs (L2, R2), (L3, R3) and (L4, R4). ⑤ Refine these initial values by multi-baseline bundle adjustment to obtain the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4).
In step 6), establishing the affine transformation models for the left image SR_L and the right image SR_R of the super-resolution stereo pair (SR_L, SR_R) comprises the following steps. ① On (SR_L, SR_R), select points corresponding to the homologous feature points extracted on (L1, R1), (L2, R2), (L3, R3) and (L4, R4), take these homologous points as control points, and compute their three-dimensional coordinates by multi-baseline forward intersection. ② Substitute the three-dimensional coordinates of the control points and the exterior-orientation position and attitude of the virtual stereo camera into the collinearity equations to compute the control-point coordinates (x'_l, y'_l) and (x'_r, y'_r) on (SR_L, SR_R) in the photo coordinate system oxy. ③ Substitute the interior orientation elements of the virtual stereo camera and the control-point coordinates (x'_l, y'_l), (x'_r, y'_r) into the pinhole-imaging camera distortion model and use Newton iteration to compute the control-point coordinates (m_l, n_l) and (m_r, n_r) on (SR_L, SR_R) in the fiducial coordinate system o'mn. ④ From the measured fiducial coordinates (m'_l, n'_l) and (m'_r, n'_r) of the control points on the (SR_L, SR_R) images and the coordinates (m_l, n_l), (m_r, n_r) computed in step ③, set up the affine transformation models for SR_L and SR_R respectively:

$$m'_l - m_l = a_0 + a_1 m_l + a_2 n_l, \qquad n'_l - n_l = b_0 + b_1 m_l + b_2 n_l,$$

$$m'_r - m_r = c_0 + c_1 m_r + c_2 n_r, \qquad n'_r - n_r = d_0 + d_1 m_r + d_2 n_r,$$

where a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 are the re-calibration coefficients of SR_L and SR_R respectively.
In step 7), obtaining the three-dimensional coordinates of the target to be measured specifically comprises the following steps. ① From the re-calibration coefficients a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 of the affine models of SR_L and SR_R and the measured image-point coordinates (x^Tar_lrow, y^Tar_lcol) and (x^Tar_rrow, y^Tar_rcol) of the target on SR_L and SR_R in the fiducial coordinate system o'mn, compute the corrected coordinates (x'^Tar_lrow, y'^Tar_lcol) and (x'^Tar_rrow, y'^Tar_rcol):

$$x'^{Tar}_{lrow} = x^{Tar}_{lrow} + a_0 + a_1 x^{Tar}_{lrow} + a_2 y^{Tar}_{lcol}, \qquad y'^{Tar}_{lcol} = y^{Tar}_{lcol} + b_0 + b_1 x^{Tar}_{lrow} + b_2 y^{Tar}_{lcol},$$

$$x'^{Tar}_{rrow} = x^{Tar}_{rrow} + c_0 + c_1 x^{Tar}_{rrow} + c_2 y^{Tar}_{rcol}, \qquad y'^{Tar}_{rcol} = y^{Tar}_{rcol} + d_0 + d_1 x^{Tar}_{rrow} + d_2 y^{Tar}_{rcol}.$$

② Using the pinhole-imaging distortion model and the calibrated interior orientation elements of the virtual stereo camera, convert the corrected fiducial coordinates of the point to be measured on SR_L and SR_R into image-point coordinates in the photo coordinate system oxy of SR_L and SR_R respectively. ③ From the calibrated exterior-orientation position and attitude of the virtual stereo camera and these image-point coordinates on SR_L and SR_R, perform forward intersection to obtain the three-dimensional coordinates of the target to be measured in the O-XYZ coordinate system.
Owing to the above technical scheme, the present invention has the following advantages. 1. The stereo camera is calibrated with a three-dimensional calibration field; on the basis of the calibrated relative pose of the stereo camera, multi-baseline bundle adjustment is performed to determine the orientation elements of every stereo image of the target; a super-resolution stereo image is generated from the images taken at multiple stations and regarded as having been taken by a pair of virtual cameras; the virtual stereo camera is calibrated using the three-dimensional coordinates obtained by multi-baseline forward intersection as ground truth; and the image-point coordinates of the target, the interior and exterior parameters of the virtual stereo camera and the re-calibration coefficients are used in a forward intersection to obtain the three-dimensional coordinates of the target. The invention therefore guarantees high three-dimensional measurement accuracy while providing high-resolution images. 2. Because the feature points obtained from the multi-baseline stereo images are used as reference control points for obtaining the interior and exterior orientation elements of the super-resolution stereo image, no additional control-point coordinates need to be surveyed, which facilitates practical application. Based on these advantages, the invention can be widely applied to three-dimensional measurement from super-resolution images in terrestrial photogrammetry.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the three-dimensional measurement method based on super-resolution image reconstruction of the present invention.
Fig. 2 is a conceptual diagram of the super-resolution image reconstruction of the present invention, in which the O-XYZ coordinate system is the east-north-up right-handed coordinate system; SR_L and SR_R are the left and right images of the super-resolution stereo pair (SR_L, SR_R) of the target to be measured; L1, L2, L3 and L4 are the left images of the stereo pairs taken at the first, second, third and fourth stations, and R1, R2, R3 and R4 are the corresponding right images.
Embodiment
The present invention is described in detail below with reference to the drawings and embodiments.
As shown in Fig. 1, the three-dimensional measurement method based on super-resolution image reconstruction of the present invention comprises the following steps:
1) As shown in Fig. 2, photograph the target to be measured with a stereo camera at four camera stations to obtain four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) of the target; after image registration and image restoration of these four pairs, obtain the super-resolution stereo pair (SR_L, SR_R) of the target to be measured.
2) Photograph a three-dimensional calibration field with the stereo camera and, from the calibration-field stereo pair (Calib_L, Calib_R), compute the interior orientation elements of the stereo camera and the relative position T(Vx, Vy, Vz) and attitude R between its left and right cameras, which comprises the following steps:
① Photograph the three-dimensional calibration field with the stereo camera to obtain the calibration-field stereo pair (Calib_L, Calib_R), and use feature-point matching to obtain the image-point coordinates (x_tL, y_tL) and (x_tR, y_tR) of the control points in (Calib_L, Calib_R), where t > 6.
② Substitute the known three-dimensional coordinates (X_t, Y_t, Z_t) of the control points and their image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the collinearity equations, substitute the image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the pinhole-imaging distortion model, and perform a bundle adjustment on the combined results to obtain the interior orientation elements of the stereo camera, the exterior-orientation position T_Calib_L(X_LS, Y_LS, Z_LS) and attitude R_Calib_L of the left image Calib_L, and the exterior-orientation position T_Calib_R(X_RS, Y_RS, Z_RS) and attitude R_Calib_R of the right image Calib_R.
③ Substitute T_Calib_L(X_LS, Y_LS, Z_LS), R_Calib_L, T_Calib_R(X_RS, Y_RS, Z_RS) and R_Calib_R into

$$R = R_{Calib\_L}^{-1} \cdot R_{Calib\_R}, \qquad T = R_{Calib\_L}^{-1} \cdot (T_{Calib\_R} - T_{Calib\_L}) \tag{1}$$

to compute the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras of the stereo camera.
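As an illustration of formula (1), the sketch below (Python/NumPy, with hypothetical function and variable names) computes the relative pose of the right camera with respect to the left one from the two absolute poses recovered in the calibration field; it uses the fact that the inverse of a rotation matrix is its transpose.

```python
import numpy as np

def relative_pose(R_calib_L, T_calib_L, R_calib_R, T_calib_R):
    """Relative attitude R and position T of the right camera with respect to
    the left camera, per formula (1):
        R = R_Calib_L^-1 . R_Calib_R
        T = R_Calib_L^-1 . (T_Calib_R - T_Calib_L)
    Rotations are 3x3 matrices, positions are length-3 vectors, all expressed
    in the calibration-field coordinate system.
    """
    R = R_calib_L.T @ R_calib_R                 # R^-1 = R^T for a rotation
    T = R_calib_L.T @ (T_calib_R - T_calib_L)
    return R, T
```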
3) Take the centre of the first-station left image L1 (of the four stereo pairs of the target) as the origin O, the east direction of L1 as the X axis, its north direction as the Y axis, and the direction perpendicular to the XOY plane and pointing upwards as the Z axis, thereby establishing the east-north-up right-handed coordinate system (the O-XYZ coordinate system). In the O-XYZ coordinate system the exterior-orientation position and attitude of L1 are (0, 0, 0) and the identity matrix. From the exterior-orientation position and attitude of L1 and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras of the stereo camera, the exterior-orientation position and attitude of the first-station right image R1 are T(Vx, Vy, Vz) and R.
The exterior-orientation positions and attitudes of L1 and R1 together give the exterior-orientation position and attitude of the first-station stereo pair (L1, R1).
4) From the exterior-orientation position and attitude of the first-station pair (L1, R1) and the homologous feature points on the four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4), compute the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4), which specifically comprises the following steps:
① Extract homologous feature points on the four stereo pairs of the target and obtain their image-point coordinates on (L1, R1), (L2, R2), (L3, R3) and (L4, R4).
② Substitute the interior orientation elements of the stereo camera obtained in step 2), the image-point coordinates of the feature points on the first-station pair (L1, R1) and the exterior-orientation position and attitude of (L1, R1) into the collinearity equations and perform forward intersection to obtain the three-dimensional coordinates of the feature points on (L1, R1).
③ From these three-dimensional coordinates and the image-point coordinates of the feature points on the left images L2, L3 and L4, perform space resection to obtain, in turn, initial values of the exterior-orientation positions T_i^left (i = 2, 3, 4) and attitudes R_i^left of L2, L3 and L4.
④ From the initial values of the exterior-orientation positions T_i^left and attitudes R_i^left of L2, L3 and L4 and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras, obtain initial values of the exterior-orientation positions T_i^right and attitudes R_i^right of the right images R2, R3 and R4 (a sketch follows at the end of this step):

$$T_i^{right} = R_i^{left} \cdot T + T_i^{left}, \qquad R_i^{right} = R_i^{left} \cdot R \tag{2}$$

The left-image values T_i^left, R_i^left and the right-image values T_i^right, R_i^right together give the initial exterior-orientation positions T_i and attitudes R_i of the stereo pairs (L2, R2), (L3, R3) and (L4, R4).
⑤ To improve the accuracy of the exterior-orientation positions and attitudes of the target stereo pairs, refine the initial values T_i and R_i of (L2, R2), (L3, R3) and (L4, R4) by multi-baseline bundle adjustment, obtaining the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4).
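As referenced in sub-step ④, a minimal sketch (Python/NumPy, hypothetical names) of formula (2): given the initial pose of a left image from space resection and the calibrated relative pose (R, T) of the stereo camera, it returns the initial pose of the corresponding right image.

```python
import numpy as np

def right_pose_from_left(R_left, T_left, R_rel, T_rel):
    """Initial exterior orientation of a right image from the left image of
    the same station and the stereo camera's relative pose, per formula (2):
        T_i^right = R_i^left . T + T_i^left
        R_i^right = R_i^left . R
    """
    T_right = R_left @ T_rel + T_left
    R_right = R_left @ R_rel
    return R_right, T_right

# For stations i = 2, 3, 4 (poses of L2, L3, L4 obtained by resection):
# init_right = [right_pose_from_left(Rl, Tl, R_rel, T_rel)
#               for Rl, Tl in zip(left_rotations, left_positions)]
```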
5) Regard the super-resolution stereo pair (SR_L, SR_R) of the target as having been taken by a virtual stereo camera, and select evenly distributed control points on (SR_L, SR_R).
Substitute the coordinates of the image points on the four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) that correspond to the control points on (SR_L, SR_R), together with the exterior-orientation positions and attitudes of (L1, R1), (L2, R2), (L3, R3) and (L4, R4), into the collinearity equations and perform multi-baseline intersection to obtain the three-dimensional coordinates of the control points on (SR_L, SR_R).
Using the three-dimensional coordinates of these control points and the corresponding image-point coordinates on (SR_L, SR_R), calibrate the virtual stereo camera, obtaining its interior orientation elements and its exterior-orientation position and attitude.
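The patent calibrates the virtual stereo camera through the collinearity equations together with a pinhole distortion model; as a simplified stand-in, the sketch below (Python with NumPy/SciPy, hypothetical names, distortion ignored) shows a plain Direct Linear Transform calibration of one virtual camera from the intersected control points, recovering interior elements, attitude and projection centre from at least six well-distributed, non-coplanar 3D-2D correspondences.

```python
import numpy as np
from scipy.linalg import rq

def dlt_calibrate(obj_pts, img_pts):
    """Linear (DLT) calibration of one virtual camera from control points.

    obj_pts : (N, 3) object-space coordinates from multi-baseline intersection
    img_pts : (N, 2) corresponding image coordinates on the SR image, N >= 6
    Returns the interior-orientation matrix K, attitude R and projection
    centre C; lens distortion is not modelled here.
    """
    A = []
    for (X, Y, Z), (u, v) in zip(obj_pts, img_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)                    # projection matrix, up to scale
    K, R = rq(P[:, :3])                         # interior elements / attitude
    S = np.diag(np.sign(np.diag(K)))            # force positive focal lengths
    K, R = K @ S, S @ R
    C = -np.linalg.inv(P[:, :3]) @ P[:, 3]      # exterior-orientation position
    return K / K[2, 2], R, C
```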
6) Select three or more control points on the super-resolution stereo pair (SR_L, SR_R); using the three-dimensional coordinates of the control points and the interior orientation elements, exterior-orientation position and attitude of the virtual stereo camera, set up a re-calibration affine transformation model for the left image SR_L and the right image SR_R respectively, and compute the re-calibration coefficients of the affine models by least squares, which specifically comprises the following steps:
① On (SR_L, SR_R), select points corresponding to the homologous feature points extracted on (L1, R1), (L2, R2), (L3, R3) and (L4, R4) in step 4), take these homologous points as control points, and compute their three-dimensional coordinates by multi-baseline forward intersection.
② Substitute the three-dimensional coordinates of the control points and the exterior-orientation position and attitude of the virtual stereo camera into the collinearity equations to compute the control-point coordinates (x'_l, y'_l) and (x'_r, y'_r) on (SR_L, SR_R) in the photo coordinate system oxy.
③ Substitute the interior orientation elements of the virtual stereo camera and the control-point coordinates (x'_l, y'_l), (x'_r, y'_r) in the photo coordinate system oxy into the pinhole-imaging camera distortion model, and use Newton iteration to compute the control-point coordinates (m_l, n_l) and (m_r, n_r) on (SR_L, SR_R) in the fiducial coordinate system o'mn.
④ From the measured fiducial coordinates (m'_l, n'_l) and (m'_r, n'_r) of the control points on the (SR_L, SR_R) images and the coordinates (m_l, n_l), (m_r, n_r) computed in step ③, set up the affine transformation models for SR_L and SR_R respectively:

$$m'_l - m_l = a_0 + a_1 m_l + a_2 n_l, \qquad n'_l - n_l = b_0 + b_1 m_l + b_2 n_l \tag{3}$$

$$m'_r - m_r = c_0 + c_1 m_r + c_2 n_r, \qquad n'_r - n_r = d_0 + d_1 m_r + d_2 n_r \tag{4}$$

where in formula (3) a_0, a_1, a_2, b_0, b_1, b_2 are the re-calibration coefficients of the left image SR_L, and in formula (4) c_0, c_1, c_2, d_0, d_1, d_2 are the re-calibration coefficients of the right image SR_R.
⑤ Compute the re-calibration coefficients a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 of the affine models of SR_L and SR_R by least squares.
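A small sketch (Python/NumPy, hypothetical names) of sub-step ⑤: the re-calibration coefficients of one image are obtained by ordinary least squares from the differences between measured and back-projected fiducial coordinates of the control points, exactly as in formulas (3) and (4).

```python
import numpy as np

def recalibration_coefficients(m_meas, n_meas, m_calc, n_calc):
    """Least-squares re-calibration coefficients of one SR image.

    m_meas, n_meas : measured fiducial coordinates of the control points
    m_calc, n_calc : fiducial coordinates back-projected through the virtual
                     camera (arrays of equal length, at least 3 points)
    Solves m' - m = a0 + a1*m + a2*n and n' - n = b0 + b1*m + b2*n
    (formula (3); the right image uses the same scheme for c0..c2, d0..d2).
    """
    A = np.column_stack([np.ones_like(m_calc), m_calc, n_calc])
    a, *_ = np.linalg.lstsq(A, m_meas - m_calc, rcond=None)
    b, *_ = np.linalg.lstsq(A, n_meas - n_calc, rcond=None)
    return a, b          # (a0, a1, a2), (b0, b1, b2)
```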
7) Substitute the interior orientation elements and exterior-orientation position and attitude of the virtual stereo camera obtained in step 5), the re-calibration coefficients a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 of the affine models of the left image SR_L and the right image SR_R obtained in step 6), and the coordinates of the point to be measured on (SR_L, SR_R) into the collinearity equations and perform forward intersection to obtain the three-dimensional coordinates of the target, which specifically comprises the following steps:
① From the re-calibration coefficients a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 obtained in step 6) and the measured image-point coordinates (x^Tar_lrow, y^Tar_lcol) of the target on SR_L and (x^Tar_rrow, y^Tar_rcol) on SR_R in the fiducial coordinate system o'mn, compute the corrected coordinates (x'^Tar_lrow, y'^Tar_lcol) on SR_L and (x'^Tar_rrow, y'^Tar_rcol) on SR_R:

$$x'^{Tar}_{lrow} = x^{Tar}_{lrow} + a_0 + a_1 x^{Tar}_{lrow} + a_2 y^{Tar}_{lcol}, \qquad y'^{Tar}_{lcol} = y^{Tar}_{lcol} + b_0 + b_1 x^{Tar}_{lrow} + b_2 y^{Tar}_{lcol} \tag{5}$$

$$x'^{Tar}_{rrow} = x^{Tar}_{rrow} + c_0 + c_1 x^{Tar}_{rrow} + c_2 y^{Tar}_{rcol}, \qquad y'^{Tar}_{rcol} = y^{Tar}_{rcol} + d_0 + d_1 x^{Tar}_{rrow} + d_2 y^{Tar}_{rcol} \tag{6}$$

② Using the pinhole-imaging distortion model and the calibrated interior orientation elements of the virtual stereo camera, convert the corrected fiducial coordinates (x'^Tar_lrow, y'^Tar_lcol) of the point to be measured on SR_L into image-point coordinates in the photo coordinate system oxy of SR_L, and the corrected fiducial coordinates (x'^Tar_rrow, y'^Tar_rcol) on SR_R into image-point coordinates in the photo coordinate system oxy of SR_R.
③ From the exterior-orientation position and attitude of the virtual stereo camera calibrated in step 5) and the image-point coordinates of the target on SR_L and SR_R, perform forward intersection to obtain the three-dimensional coordinates of the target to be measured in the O-XYZ coordinate system. A sketch of the correction and intersection computation is given below.
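As announced above, a sketch (Python/NumPy, hypothetical names) of step 7): the measured coordinates of the target are first corrected with the re-calibration coefficients (formulas (5)/(6)); the corrected, distortion-free image points are then intersected linearly from the two virtual cameras. The linear intersection shown here assumes each camera is summarised by a 3x4 projection matrix built from its interior and exterior elements; a rigorous least-squares intersection based on the collinearity equations, as in the patent, would refine this result.

```python
import numpy as np

def apply_recalibration(x, y, coeffs):
    """Correct a measured fiducial coordinate with the re-calibration
    coefficients, per formulas (5)/(6):
        x' = x + a0 + a1*x + a2*y,   y' = y + b0 + b1*x + b2*y
    """
    (a0, a1, a2), (b0, b1, b2) = coeffs
    return x + a0 + a1 * x + a2 * y, y + b0 + b1 * x + b2 * y

def forward_intersection(proj_mats, img_pts):
    """Linear forward intersection of one object point from two or more images.

    proj_mats : list of 3x4 projection matrices (left and right virtual camera)
    img_pts   : list of corresponding (u, v) image-point coordinates
    Each observation contributes two equations of a homogeneous system solved
    by SVD; the result is returned in the O-XYZ coordinate system.
    """
    A = []
    for P, (u, v) in zip(proj_mats, img_pts):
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    X = Vt[-1]
    return X[:3] / X[3]
```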
The above embodiments are only intended to illustrate the present invention; the structure of the components, the connections and the method steps may all be varied, and any equivalent replacement or improvement made on the basis of the technical solution of the present invention shall not be excluded from the protection scope of the present invention.

Claims (5)

1. A three-dimensional measurement method based on super-resolution image reconstruction, comprising the following steps:
1) photographing the target to be measured with a stereo camera at four camera stations to obtain four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) of the target; after image registration and image restoration of the four pairs, obtaining the super-resolution stereo pair (SR_L, SR_R) of the target;
2) photographing a three-dimensional calibration field with the stereo camera and, from the calibration-field stereo pair (Calib_L, Calib_R), computing the interior orientation elements of the stereo camera and the relative position T(Vx, Vy, Vz) and attitude R between its left and right cameras;
3) taking the centre of the first-station left image L1 as the origin O, the east direction of L1 as the X axis, its north direction as the Y axis and the direction perpendicular to the XOY plane and pointing upwards as the Z axis to establish the O-XYZ coordinate system, in which the exterior-orientation position and attitude of L1 are (0, 0, 0) and the identity matrix; obtaining, from the exterior-orientation position and attitude of L1 and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras, the exterior-orientation position and attitude of the first-station right image R1 as T(Vx, Vy, Vz) and R; and obtaining, from the poses of L1 and R1, the exterior-orientation position and attitude of the first-station stereo pair (L1, R1);
4) computing, from the exterior-orientation position and attitude of (L1, R1) and the homologous feature points on the four stereo pairs, the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4);
5) regarding the super-resolution stereo pair (SR_L, SR_R) of the target as having been taken by a virtual stereo camera, and selecting evenly distributed control points on (SR_L, SR_R);
substituting the coordinates of the image points on the four stereo pairs that correspond to the control points on (SR_L, SR_R), together with the refined exterior-orientation positions and attitudes of (L1, R1), (L2, R2), (L3, R3) and (L4, R4), into the collinearity equations and performing multi-baseline intersection to obtain the three-dimensional coordinates of the control points on (SR_L, SR_R);
calibrating the virtual stereo camera with the three-dimensional coordinates of these control points and the corresponding image-point coordinates on (SR_L, SR_R), obtaining the interior orientation elements and the exterior-orientation position and attitude of the virtual stereo camera;
6) selecting three or more control points on (SR_L, SR_R) and, using their three-dimensional coordinates and the interior orientation elements, exterior-orientation position and attitude of the virtual stereo camera, setting up a re-calibration affine transformation model for the left image SR_L and the right image SR_R respectively, and computing the re-calibration coefficients of the affine models by least squares;
7) substituting the interior orientation elements and exterior-orientation position and attitude of the virtual stereo camera obtained in step 5), the re-calibration coefficients a0, a1, a2, b0, b1, b2 and c0, c1, c2, d0, d1, d2 of the affine models of SR_L and SR_R obtained in step 6), and the coordinates of the point to be measured on (SR_L, SR_R) into the collinearity equations, and performing forward intersection to obtain the three-dimensional coordinates of the target to be measured.
2. The three-dimensional measurement method based on super-resolution image reconstruction according to claim 1, wherein in step 2) obtaining the interior orientation elements of the stereo camera and the relative position T(Vx, Vy, Vz) and attitude R between its left and right cameras comprises the following steps:
① photographing the three-dimensional calibration field with the stereo camera to obtain the calibration-field stereo pair (Calib_L, Calib_R), and using feature-point matching to obtain the image-point coordinates (x_tL, y_tL) and (x_tR, y_tR) of the control points in (Calib_L, Calib_R), with t > 6;
② substituting the known three-dimensional coordinates (X_t, Y_t, Z_t) of the control points and their image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the collinearity equations, substituting the image-point coordinates (x_tL, y_tL), (x_tR, y_tR) into the pinhole-imaging distortion model, and performing a bundle adjustment on the combined results to obtain the interior orientation elements of the stereo camera, the exterior-orientation position T_Calib_L(X_LS, Y_LS, Z_LS) and attitude R_Calib_L of the left image Calib_L, and the exterior-orientation position T_Calib_R(X_RS, Y_RS, Z_RS) and attitude R_Calib_R of the right image Calib_R;
③ substituting T_Calib_L(X_LS, Y_LS, Z_LS), R_Calib_L, T_Calib_R(X_RS, Y_RS, Z_RS) and R_Calib_R into

$$R = R_{Calib\_L}^{-1} \cdot R_{Calib\_R}, \qquad T = R_{Calib\_L}^{-1} \cdot (T_{Calib\_R} - T_{Calib\_L}),$$

to compute the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras of the stereo camera.
3. The three-dimensional measurement method based on super-resolution image reconstruction according to claim 1 or 2, wherein in step 4) obtaining the refined exterior-orientation positions and attitudes of the stereo pairs (L2, R2), (L3, R3) and (L4, R4) specifically comprises the following steps:
① extracting homologous feature points on the four stereo pairs (L1, R1), (L2, R2), (L3, R3) and (L4, R4) of the target and obtaining the image-point coordinates of the feature points on each pair;
② substituting the interior orientation elements of the stereo camera, the image-point coordinates of the feature points on the first-station pair (L1, R1) and the exterior-orientation position and attitude of (L1, R1) into the collinearity equations and performing forward intersection to obtain the three-dimensional coordinates of the feature points on (L1, R1);
③ performing space resection with these three-dimensional coordinates and the image-point coordinates of the feature points on the left images L2, L3 and L4 to obtain, in turn, initial values of the exterior-orientation positions T_i^left and attitudes R_i^left of L2, L3 and L4, i = 2, 3, 4;
④ obtaining, from these initial values and the relative position T(Vx, Vy, Vz) and attitude R between the left and right cameras, initial values of the exterior-orientation positions T_i^right and attitudes R_i^right of the right images R2, R3 and R4:

$$T_i^{right} = R_i^{left} \cdot T + T_i^{left}, \qquad R_i^{right} = R_i^{left} \cdot R;$$

the left-image values T_i^left, R_i^left and the right-image values T_i^right, R_i^right together giving the initial exterior-orientation positions T_i and attitudes R_i of the stereo pairs (L2, R2), (L3, R3) and (L4, R4);
⑤ refining these initial values T_i and R_i by multi-baseline bundle adjustment to obtain the refined exterior-orientation positions and attitudes of (L2, R2), (L3, R3) and (L4, R4).
4. The three-dimensional measurement method based on super-resolution image reconstruction according to claim 1 or 2, wherein in step 6) the process of establishing the affine transformation models for the left image SR_L and the right image SR_R of the super-resolution stereo pair (SR_L, SR_R) comprises the following steps:
① selecting on (SR_L, SR_R) points corresponding to the homologous feature points extracted on (L1, R1), (L2, R2), (L3, R3) and (L4, R4), taking these homologous points as control points, and computing their three-dimensional coordinates by multi-baseline forward intersection;
② substituting the three-dimensional coordinates of the control points and the exterior-orientation position and attitude of the virtual stereo camera into the collinearity equations to compute the control-point coordinates (x'_l, y'_l) and (x'_r, y'_r) on (SR_L, SR_R) in the photo coordinate system oxy;
③ substituting the interior orientation elements of the virtual stereo camera and the control-point coordinates (x'_l, y'_l), (x'_r, y'_r) in the photo coordinate system oxy into the pinhole-imaging camera distortion model, and using Newton iteration to compute the control-point coordinates (m_l, n_l) and (m_r, n_r) on (SR_L, SR_R) in the fiducial coordinate system o'mn;
④ setting up, from the measured fiducial coordinates (m'_l, n'_l) and (m'_r, n'_r) of the control points on the (SR_L, SR_R) images and the coordinates (m_l, n_l), (m_r, n_r) computed in step ③, the affine transformation models for SR_L and SR_R respectively:

$$m'_l - m_l = a_0 + a_1 m_l + a_2 n_l, \qquad n'_l - n_l = b_0 + b_1 m_l + b_2 n_l,$$

$$m'_r - m_r = c_0 + c_1 m_r + c_2 n_r, \qquad n'_r - n_r = d_0 + d_1 m_r + d_2 n_r,$$

where a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 are the re-calibration coefficients of SR_L and SR_R respectively.
5. The three-dimensional measurement method based on super-resolution image reconstruction according to claim 1 or 2, wherein in step 7) obtaining the three-dimensional coordinates of the target to be measured specifically comprises the following steps:
① computing, from the re-calibration coefficients a_0, a_1, a_2, b_0, b_1, b_2 and c_0, c_1, c_2, d_0, d_1, d_2 of the affine models of SR_L and SR_R and the measured image-point coordinates (x^Tar_lrow, y^Tar_lcol) and (x^Tar_rrow, y^Tar_rcol) of the target on SR_L and SR_R in the fiducial coordinate system o'mn, the corrected coordinates (x'^Tar_lrow, y'^Tar_lcol) and (x'^Tar_rrow, y'^Tar_rcol):

$$x'^{Tar}_{lrow} = x^{Tar}_{lrow} + a_0 + a_1 x^{Tar}_{lrow} + a_2 y^{Tar}_{lcol}, \qquad y'^{Tar}_{lcol} = y^{Tar}_{lcol} + b_0 + b_1 x^{Tar}_{lrow} + b_2 y^{Tar}_{lcol},$$

$$x'^{Tar}_{rrow} = x^{Tar}_{rrow} + c_0 + c_1 x^{Tar}_{rrow} + c_2 y^{Tar}_{rcol}, \qquad y'^{Tar}_{rcol} = y^{Tar}_{rcol} + d_0 + d_1 x^{Tar}_{rrow} + d_2 y^{Tar}_{rcol};$$

② converting, with the pinhole-imaging distortion model and the calibrated interior orientation elements of the virtual stereo camera, the corrected fiducial coordinates (x'^Tar_lrow, y'^Tar_lcol) of the point to be measured on SR_L and (x'^Tar_rrow, y'^Tar_rcol) on SR_R into image-point coordinates in the photo coordinate systems oxy of SR_L and SR_R respectively;
③ performing forward intersection with the calibrated exterior-orientation position and attitude of the virtual stereo camera and the image-point coordinates of the target on SR_L and SR_R to obtain the three-dimensional coordinates of the target to be measured in the O-XYZ coordinate system.
CN201310692480.8A 2013-12-17 2013-12-17 Three-dimensional measurement method based on super-resolution image reconstruction Active CN103644897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310692480.8A CN103644897B (en) 2013-12-17 2013-12-17 Three-dimensional measurement method based on super-resolution image reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310692480.8A CN103644897B (en) 2013-12-17 2013-12-17 Three-dimensional measurement method based on super-resolution image reconstruction

Publications (2)

Publication Number Publication Date
CN103644897A true CN103644897A (en) 2014-03-19
CN103644897B CN103644897B (en) 2016-01-13

Family

ID=50250149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310692480.8A Active CN103644897B (en) Three-dimensional measurement method based on super-resolution image reconstruction

Country Status (1)

Country Link
CN (1) CN103644897B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104729532A (en) * 2015-03-02 2015-06-24 山东科技大学 Strict calibration method of panorama camera
CN109945844A (en) * 2014-05-05 2019-06-28 赫克斯冈技术中心 Measure subsystem and measuring system
CN114333243A (en) * 2021-12-21 2022-04-12 长江三峡勘测研究院有限公司(武汉) Landslide monitoring and early warning method, device, medium, electronic equipment and terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101226057A (en) * 2008-02-01 2008-07-23 武汉朗视软件有限公司 Digital close range photogrammetry method
US20080199083A1 (en) * 2007-02-15 2008-08-21 Industrial Technology Research Institute Image filling methods

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080199083A1 (en) * 2007-02-15 2008-08-21 Industrial Technology Research Institute Image filling methods
CN101226057A (en) * 2008-02-01 2008-07-23 武汉朗视软件有限公司 Digital close range photogrammetry method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李玲玲 et al.: "Automatic image registration based on Harris-Affine and SIFT feature matching", Journal of Huazhong University of Science and Technology (Natural Science Edition) *
杨宇翔: "Research on image super-resolution reconstruction algorithms", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109945844A (en) * 2014-05-05 2019-06-28 赫克斯冈技术中心 Measure subsystem and measuring system
CN109945844B (en) * 2014-05-05 2021-03-12 赫克斯冈技术中心 Measurement subsystem and measurement system
US11054258B2 (en) 2014-05-05 2021-07-06 Hexagon Technology Center Gmbh Surveying system
CN104729532A (en) * 2015-03-02 2015-06-24 山东科技大学 Strict calibration method of panorama camera
CN104729532B (en) * 2015-03-02 2018-05-01 山东科技大学 A kind of tight scaling method of panorama camera
CN114333243A (en) * 2021-12-21 2022-04-12 长江三峡勘测研究院有限公司(武汉) Landslide monitoring and early warning method, device, medium, electronic equipment and terminal

Also Published As

Publication number Publication date
CN103644897B (en) 2016-01-13

Similar Documents

Publication Publication Date Title
CN110057295B (en) Monocular vision plane distance measuring method without image control
CN106767533B (en) Efficient phase-three-dimensional mapping method and system based on fringe projection technology of profiling
CN104240262B (en) Calibration device and calibration method for outer parameters of camera for photogrammetry
CN102252653B (en) Position and attitude measurement method based on time of flight (TOF) scanning-free three-dimensional imaging
CN102168972B (en) RPC-based method for improving and calibrating block adjustment of three-linear array three-dimensional satellite
CN104677277B (en) A kind of method and system for measuring object geometric attribute or distance
CN105300362B (en) A kind of photogrammetric survey method applied to RTK receiver
CN104457710B (en) Aviation digital photogrammetry method based on non-metric digital camera
CN105424058B (en) Digital camera projection centre position method for precisely marking based on photogrammetric technology
CN106885585B (en) Integrated calibration method of satellite-borne photogrammetry system based on light beam adjustment
CN103630120B (en) Martian surface linear array image core line method for resampling based on tight geometric model
CN104268876A (en) Camera calibration method based on partitioning
CN103822644B (en) A kind of camera calibration method of three-dimensional laser imaging system
CN111091076B (en) Tunnel limit data measuring method based on stereoscopic vision
CN112597428B (en) Flutter detection correction method based on beam adjustment and image resampling of RFM model
CN110986888A (en) Aerial photography integrated method
CN105466400A (en) Method for probing multi-source satellite image corresponding image point imaging intersection angles
CN102735218A (en) Making method for grotto digital line graph based on digital close-range photogrammetry
CN103644897B (en) A kind of three-dimensional measuring method based on super-resolution image reconstruction
CN104613942A (en) Analytic method of image pair relative elements of exterior orientation
Yuan et al. Theoretical accuracy of direct georeferencing with position and orientation system in aerial photogrammetry
Huang et al. Image network generation of uncalibrated UAV images with low-cost GPS data
CN104792314A (en) Method for calculating elements of exterior orientation in ground stereoscopic photograph of ranging photographic instrument
CN104613941A (en) Analysis method of terrestrial photograph Kappa and Omega angle with vertical base line
Pan et al. A general method of generating satellite epipolar images based on RPC model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant