CN103776427B - Parameter setting and adjustment method for a three-line-array stereo mapping camera - Google Patents

Parameter setting and adjustment method for a three-line-array stereo mapping camera

Info

Publication number
CN103776427B
CN103776427B (application CN201410026118.1A)
Authority
CN
China
Prior art keywords
camera
cos
angle
theta
sin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410026118.1A
Other languages
Chinese (zh)
Other versions
CN103776427A (en)
Inventor
何红艳
齐文雯
高卫军
王小燕
李方琦
高凌雁
赵占平
李岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing spaceflight Creative Technology Co., Ltd.
Original Assignee
Beijing Institute of Space Research Mechanical and Electricity
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Space Research Mechanical and Electricity
Priority to CN201410026118.1A priority Critical patent/CN103776427B/en
Publication of CN103776427A publication Critical patent/CN103776427A/en
Application granted granted Critical
Publication of CN103776427B publication Critical patent/CN103776427B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a parameter setting and adjustment method applied to a stereo mapping camera. The three-dimensional imaging model of the three-line-array mapping camera is studied; from the mounting geometry of the individual cameras and the satellite orbit parameters, the differences in entrance-pupil radiance among the three cameras when they image the same target under different imaging conditions are derived, and, combined with the differences in actual responsivity and on-orbit integration time, the differences in output response of the three cameras are obtained. The imaging parameters of the other two cameras can therefore be adjusted in real time by setting the parameters of one camera (the nadir-viewing camera). The invention provides dynamically matched settings of the camera imaging parameters at different latitudes and improves the radiometric quality of stereo imagery.

Description

Parameter setting and adjustment method applied to a stereo mapping camera
Technical field
The present invention relates to a parameter setting and adjustment method applied to a stereo mapping camera, and belongs to the technical field of space optical remote sensor applications.
Background technology
Space surveying and mapping technology is an important component of space-based Earth-observation imaging. It is a very important class of Earth-observation satellite imaging, and has gradually developed from an important part of the military space security system into an important part of modern combat capability. In the civilian surveying and mapping field, space surveying and mapping is also an efficient, modern mapping technique.
The mapping camera is an important payload on a cartographic satellite. Mapping cameras based on push-broom CCD imaging have received worldwide attention and, according to the CCD array configuration and the photogrammetric principle, fall into three classes: single-linear-array CCD cameras, spaceborne twin-linear-array mapping cameras and three-linear-array CCD stereo mapping cameras.
A three-line-array mapping camera comprises a nadir-viewing camera that images vertically downward, a forward-viewing camera that images with a forward tilt and a backward-viewing camera that images with a backward tilt. Each camera carries a complete optical imaging system and works independently. As the satellite platform moves along its orbit, the forward-viewing camera, the nadir-viewing camera and the backward-viewing camera successively image the target area at time intervals, yielding overlapping images of the region. With the development of surveying and mapping technology, increasingly strict requirements are placed on the radiometric and geometric properties of the overlapping images. To ensure that the three cameras image the same target with consistent radiometric characteristics, the imaging parameters of the cameras must be adjusted in a timely manner so that high-precision image matching can be achieved.
The current calculation of the solar elevation angle considers only the orbit parameters of the satellite, i.e. the solar elevation angle obtained is that of the satellite observation as a whole. A cartographic satellite, however, generally carries several cameras whose mounting orientations on the satellite differ, so the imaging moments and observation angles of the different cameras imaging the same target all differ. These differences in observation conditions change the atmospheric propagation path and therefore affect the entrance-pupil radiance that the same target presents to the different cameras.
Summary of the invention
The technical problem solved by the present invention is: to overcome the deficiencies of the prior art by proposing a parameter setting and adjustment method applied to a stereo mapping camera. From the mounting geometry of the individual cameras and the satellite orbit parameters, the differences in entrance-pupil radiance among the three cameras when they image the same target under different imaging conditions are derived and, combined with the differences in actual responsivity and on-orbit integration time, the differences in output response of the three cameras are obtained, so that the imaging parameters of the other two cameras can be adjusted in real time by setting the parameters of one camera, thereby achieving measurement consistency.
Technical scheme of the present invention is:
The parameter setting and adjustment method applied to a stereo mapping camera comprises the following steps:
1) Determine the solar elevation angle, the solar azimuth angle, the camera observation azimuth angle and the camera observation elevation angle. The solar elevation angle is the angle between the line connecting the Sun and the ground target and the horizontal plane through the ground target; the solar azimuth angle is the angle between the projection of the Sun–target line onto the Earth's surface and the due-north line through the ground target; the camera observation azimuth angle is the angle between the projection of the camera–target line on the ground and due north; the camera observation elevation angle is the angle between the camera's Earth-pointing optical axis and the local vertical through the satellite.
2) From the declination ω and the solar hour angle t at the current imaging moment of the stereo mapping camera, calculate the solar elevation angle h_s every 10° of latitude over the range [-90°, 90°].
3) From the solar elevation angle h_s calculated in step 2) together with the declination ω and the solar hour angle t, calculate the solar azimuth angle ψ_s at the different latitudes.
4) From the optical-axis vectors of the three cameras on the cartographic satellite (the forward-viewing, backward-viewing and nadir-viewing cameras) and the inclination of the satellite orbit, calculate the observation elevation angle h_v and observation azimuth angle ψ_v of each camera every 10° of latitude over the range [-90°, 90°].
5) Statistically analyse satellite imagery and retrieve, by image inversion, the actual reflectance of the ground targets at the four typical solar terms: the spring equinox, the summer solstice, the autumn equinox and the winter solstice.
6) From the solar elevation angle h_s, the solar azimuth angle ψ_s, the camera observation elevation angle h_v, the camera observation azimuth angle ψ_v and the actual ground-target reflectance ρ obtained in step 5), calculate the entrance-pupil radiance L of each camera for the different target reflectances, and thereby obtain the radiance differences among the three cameras at the different imaging moments.
7) Determine the responsivity differences of the three cameras at identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extract the calibration data of the three cameras under identical imaging parameters and derive the responsivity differences of the cameras at identical radiance.
8) Taking the nadir-viewing camera as the reference, calculate the relative response outputs of the forward- and backward-viewing cameras with respect to the nadir-viewing camera at any imaging moment. The relative response output is determined as follows:
a) obtain, from step 6), the radiance difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
b) obtain, from step 7), the responsivity difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
c) obtain, from the integration times computed in real time from GPS data, the integration-time difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
d) multiply together the radiance difference, the responsivity difference and the integration-time difference obtained in steps a), b) and c) to obtain the final relative response output.
9) From the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determine the imaging parameters of the nadir-viewing camera at the different imaging moments, and then adjust the imaging parameters of the forward- and backward-viewing cameras according to the relative relationship of step 8); the principle is to adjust the TDI (time delay integration) stage number first and then the camera gain.
The specific formula for the solar elevation angle h_s of step 2) is:

$$\sin h_s = \cos\Phi\cos\omega\cos t + \sin\Phi\sin\omega$$

$$h_s = \arcsin\left\{\cos\Phi\cos\left[23.5\sin\left(\frac{t}{365}\right)\right]\cos t + \sin\Phi\sin\left[23.5\sin\left(\frac{t}{365}\right)\right]\right\}$$

where t ∈ [0°, 131400°]; t starts at 0:00 of day 1 and ends at 24:00 of day 365, advancing 360° every 24 hours, and Φ denotes the latitude.
The specific formula for the solar azimuth angle ψ_s of step 3) is:

$$\psi_s = \begin{cases} 360^\circ - \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k < t < 360k+180 \\ \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k+180 \le t < 360k+360 \end{cases}$$

where k = 0, 1, 2, …, 364.
The specific formulas for the camera observation elevation angle h_v and the camera observation azimuth angle ψ_v of step 4) are as follows. Let the angle between the forward-viewing camera and the nadir-viewing camera be φ₁, the angle between the backward-viewing camera and the nadir-viewing camera be φ₂, the roll angle of the satellite be θ (θ > 0 for a roll toward the east, θ < 0 for a roll toward the west), and the orbit inclination be δ. The observation elevation angles h_v of the three cameras are then:

Forward-viewing camera: $h_v = \arccos(\cos\phi_1\cos\theta)$

Nadir-viewing camera: $h_v = \theta$

Backward-viewing camera: $h_v = \arccos(\cos\phi_2\cos\theta)$

The observation azimuth angles ψ_v of the cameras are:

Forward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right) + \delta, & \theta > 0 \\ \delta - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right), & \theta < 0 \end{cases}$$

Nadir-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \delta, & \theta > 0 \\ \delta, & \theta < 0 \end{cases}$$

Backward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta > 0 \\ \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta < 0 \end{cases}$$

Azimuth angles are measured from due north (0°).
The absolute calibration coefficients of step 9) are determined and the imaging parameters adjusted as follows:
a) From the integrating-sphere radiometric calibration data, first calculate the absolute calibration coefficients k0 and b0 at TDI stage 1, reference gain and the default integration time;
b) from the ratio q of the real-time computed integration time to the default integration time, obtain the absolute calibration coefficient at the current on-orbit integration time, k1 = q·k0;
c) determine the absolute calibration coefficient k2 corresponding to 80% saturated output of the nadir-viewing camera at its entrance-pupil radiance;
d) select a suitable TDI stage number N according to the ratio k2/k1, where N must be less than or equal to k2/k1, and then adjust the gain to meet the absolute calibration coefficient requirement (a code sketch of this selection is given below).
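For illustration, the selection logic of steps a)–d) can be sketched as a small C routine. This sketch is not part of the patent: the function name, the assumption that the TDI CCD offers power-of-two stage counts, and the numbers in main are all hypothetical.

```c
/* Minimal sketch of the calibration-coefficient logic a)-d) above (not part of
 * the patent; interface and sample numbers are hypothetical). k0 is the absolute
 * calibration coefficient at TDI stage 1, reference gain and default integration
 * time; q is the ratio of the real-time integration time to the default one; k2
 * is the coefficient corresponding to 80% saturated output of the nadir camera. */
#include <math.h>
#include <stdio.h>

static void choose_tdi_and_gain(double k0, double q, double k2,
                                int *tdi_stages, double *gain)
{
    double k1 = q * k0;           /* coefficient at the current integration time */
    double target = k2 / k1;      /* required combined TDI x gain factor         */

    /* pick the largest power-of-two TDI stage count not exceeding k2/k1
     * (power-of-two stage sets are an assumption about the TDI CCD) */
    int n = 1;
    while (2 * n <= (int)floor(target))
        n *= 2;

    *tdi_stages = n;
    *gain = target / n;           /* remaining factor is taken up by the gain */
}

int main(void)
{
    int n; double g;
    choose_tdi_and_gain(0.5 /*k0*/, 1.2 /*q*/, 12.0 /*k2*/, &n, &g);
    printf("TDI stages N = %d, gain = %.3f\n", n, g);  /* k1=0.6, k2/k1=20 -> N=16, gain=1.25 */
    return 0;
}
```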
Compared with the prior art, the present invention has the following advantages:
(1) The invention gives an accurate three-dimensional imaging model of the three-line-array mapping camera; the combined influence of the solar elevation and azimuth angles and of the camera elevation and azimuth angles is considered when computing the entrance-pupil radiance.
(2) The invention gives the response differences of the three cameras under different imaging conditions and the corresponding parameter adjustment scheme, ensuring consistent radiometric characteristics of the images output by the different cameras.
(3) The invention proposes, for the first time, an on-orbit imaging-parameter setting method for a three-line-array stereo mapping camera; it can be applied directly to the on-orbit operation of stereo mapping cameras to guarantee image quality, and can also be used for cameras performing subsequent agile imaging.
(4) The on-orbit imaging-parameter optimization method for the TDI CCD camera adopted by the invention is easy to implement in software, for example in Matlab or C.
Brief description of the drawings:
Fig. 1 is a flow chart of the on-orbit imaging-parameter adjustment of the stereo mapping camera of the present invention;
Fig. 2 is a schematic diagram of the definitions of the elevation and azimuth angles of the present invention;
Fig. 3 is a schematic diagram of the observation angles of the stereo mapping camera of the present invention.
Embodiment
The composition and working principle of the present invention are further described below with reference to the accompanying drawings.
As shown in Fig. 1, the steps of the parameter setting and adjustment method applied to a stereo mapping camera according to the present invention are as follows:
1) Determine the solar elevation angle, the solar azimuth angle, the camera observation azimuth angle and the camera observation elevation angle. The solar elevation angle is the angle between the line connecting the Sun and the ground target and the horizontal plane through the ground target; the solar azimuth angle is the angle between the projection of the Sun–target line onto the Earth's surface and the due-north line through the ground target; the camera observation azimuth angle is the angle between the projection of the camera–target line on the ground and due north; the camera observation elevation angle is the angle between the camera's Earth-pointing optical axis and the local vertical through the satellite.
Fig. 2 and Fig. 3 show the direction of the sunlight when the forward-viewing camera images a ground target P. h_s denotes the solar elevation angle, i.e. the angle between the line connecting the Sun (treated as a point) and an arbitrary point P on the Earth's surface and the horizontal plane through P. ψ_s denotes the solar azimuth angle, i.e. the angle between the projection of the Sun–P line onto the Earth's surface at P and the due-north line through P (all azimuth angles are measured from due north as 0°, increasing from 0° to 360° counter-clockwise toward the west). The solar elevation and azimuth angles characterize the position of the Sun and determine the direction of the incident sunlight at any point on the Earth's surface. ψ_v1 is the observation azimuth angle of the forward-viewing camera, i.e. the angle between the projection of the line from the forward-viewing camera to the target point on the ground and due north; h_v1 is the observation elevation angle of the forward-viewing camera, i.e. the angle of its Earth-pointing optical axis.
2) From the declination ω and the solar hour angle t at the current imaging moment of the stereo mapping camera, calculate the solar elevation angle h_s every 10° of latitude over the range [-90°, 90°].
The specific formula for the solar elevation angle h_s is:

$$\sin h_s = \cos\Phi\cos\omega\cos t + \sin\Phi\sin\omega$$

$$h_s = \arcsin\left\{\cos\Phi\cos\left[23.5\sin\left(\frac{t}{365}\right)\right]\cos t + \sin\Phi\sin\left[23.5\sin\left(\frac{t}{365}\right)\right]\right\}$$

where t ∈ [0°, 131400°]; t starts at 0:00 of day 1 and ends at 24:00 of day 365, advancing 360° every 24 hours, and Φ denotes the latitude.
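As an illustration of this formula, the following C sketch (not part of the patent) tabulates h_s every 10° of latitude for one hour angle; the declination is evaluated as 23.5·sin(t/365) with all angles in degrees, as in the text, and the hour-angle value in main is hypothetical.

```c
/* Minimal sketch: solar elevation angle of step 2 for latitude Phi and hour
 * angle t, both in degrees. t runs from 0 deg (day 1, 0:00) to 131400 deg
 * (day 365, 24:00), advancing 360 deg per 24 h. */
#include <math.h>
#include <stdio.h>

#define PI      3.14159265358979323846
#define DEG2RAD (PI / 180.0)
#define RAD2DEG (180.0 / PI)

/* solar elevation h_s in degrees for latitude phi_deg and hour angle t_deg */
static double sun_elevation_deg(double phi_deg, double t_deg)
{
    double decl = 23.5 * sin((t_deg / 365.0) * DEG2RAD) * DEG2RAD; /* declination, rad */
    double phi  = phi_deg * DEG2RAD;
    double t    = t_deg * DEG2RAD;
    double s    = cos(phi) * cos(decl) * cos(t) + sin(phi) * sin(decl);
    return asin(s) * RAD2DEG;
}

int main(void)
{
    /* tabulate h_s every 10 deg of latitude in [-90, 90] for one imaging moment */
    double t_deg = 45.0 * 360.0 + 120.0;            /* hypothetical hour angle */
    for (int lat = -90; lat <= 90; lat += 10)
        printf("lat %4d deg  h_s = %7.3f deg\n", lat, sun_elevation_deg(lat, t_deg));
    return 0;
}
```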
3) From the solar elevation angle h_s calculated in step 2) together with the declination ω and the solar hour angle t, calculate the solar azimuth angle ψ_s at the different latitudes.
The specific formula for the solar azimuth angle ψ_s is:

$$\psi_s = \begin{cases} 360^\circ - \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k < t < 360k+180 \\ \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k+180 \le t < 360k+360 \end{cases}$$

where k = 0, 1, 2, …, 364.
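A C sketch of this step follows; it is not part of the patent. The printed formula uses ω in both products, whereas the conventional solar-azimuth relation reads cos ψ_s = (sin h_s · sin Φ − sin δ)/(cos h_s · cos Φ) with Φ the latitude and δ the declination; the sketch follows that conventional reading, and that interpretation, like the sample values in main, is an assumption.

```c
/* Minimal sketch of step 3 (assumptions noted in the text above). Angles in
 * degrees; azimuth measured from due north; the branch on t selects the first
 * or second half of each 360-deg "day". */
#include <math.h>
#include <stdio.h>

#define PI      3.14159265358979323846
#define DEG2RAD (PI / 180.0)
#define RAD2DEG (180.0 / PI)

static double solar_azimuth_deg(double hs_deg, double phi_deg, double t_deg)
{
    double decl = 23.5 * sin((t_deg / 365.0) * DEG2RAD) * DEG2RAD; /* declination, rad */
    double hs   = hs_deg  * DEG2RAD;
    double phi  = phi_deg * DEG2RAD;
    double x = (sin(hs) * sin(phi) - sin(decl)) / (cos(hs) * cos(phi));
    if (x >  1.0) x =  1.0;   /* clamp against rounding */
    if (x < -1.0) x = -1.0;
    double a = acos(x) * RAD2DEG;
    /* first half of each day: 360k < t < 360k+180  ->  360 - a */
    return (fmod(t_deg, 360.0) < 180.0) ? 360.0 - a : a;
}

int main(void)
{
    double t = 45.0 * 360.0 + 120.0;   /* hypothetical hour angle, deg            */
    double hs = 35.0, phi = 40.0;      /* hypothetical solar elevation / latitude */
    printf("psi_s = %.3f deg\n", solar_azimuth_deg(hs, phi, t));
    return 0;
}
```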
4) From the optical-axis vectors of the three cameras on the cartographic satellite (the forward-viewing, backward-viewing and nadir-viewing cameras) and the inclination of the satellite orbit, calculate the observation elevation angle h_v and observation azimuth angle ψ_v of each camera every 10° of latitude over the range [-90°, 90°].
The specific formulas for the camera observation elevation angle h_v and the camera observation azimuth angle ψ_v are as follows. Let the angle between the forward-viewing camera and the nadir-viewing camera be φ₁, the angle between the backward-viewing camera and the nadir-viewing camera be φ₂, the roll angle of the satellite be θ (θ > 0 for a roll toward the east, θ < 0 for a roll toward the west), and the orbit inclination be δ. The observation elevation angles h_v of the three cameras are then:

Forward-viewing camera: $h_v = \arccos(\cos\phi_1\cos\theta)$

Nadir-viewing camera: $h_v = \theta$

Backward-viewing camera: $h_v = \arccos(\cos\phi_2\cos\theta)$

The observation azimuth angles ψ_v of the cameras are:

Forward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right) + \delta, & \theta > 0 \\ \delta - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right), & \theta < 0 \end{cases}$$

Nadir-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \delta, & \theta > 0 \\ \delta, & \theta < 0 \end{cases}$$

Backward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta > 0 \\ \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta < 0 \end{cases}$$

Azimuth angles are measured from due north (0°).
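The following C sketch evaluates these observation angles for the three cameras. It is not taken from the patent; the tilt angles, roll angle and inclination in main are purely hypothetical values chosen for illustration.

```c
/* Minimal sketch of step 4: observation elevation and azimuth angles of the
 * forward, nadir and backward cameras from the camera-to-nadir angles phi1/phi2,
 * the roll angle theta (>0 east, <0 west) and the orbit-inclination term delta.
 * All angles in degrees, azimuths measured from due north. */
#include <math.h>
#include <stdio.h>

#define PI      3.14159265358979323846
#define DEG2RAD (PI / 180.0)
#define RAD2DEG (180.0 / PI)

static double obs_elevation_deg(double phi_cam_deg, double theta_deg)
{
    return acos(cos(phi_cam_deg * DEG2RAD) * cos(theta_deg * DEG2RAD)) * RAD2DEG;
}

/* azimuth term arccos( sin(theta)cos(phi) / sqrt(1 - (cos(theta)cos(phi))^2) ) */
static double azimuth_term_deg(double phi_cam_deg, double theta_deg)
{
    double c = cos(theta_deg * DEG2RAD) * cos(phi_cam_deg * DEG2RAD);
    double s = sin(theta_deg * DEG2RAD) * cos(phi_cam_deg * DEG2RAD);
    return acos(s / sqrt(1.0 - c * c)) * RAD2DEG;
}

int main(void)
{
    double phi1 = 25.0, phi2 = 25.0;   /* hypothetical forward/backward tilt angles */
    double theta = 5.0, delta = 98.0;  /* hypothetical roll angle and inclination   */

    double hv_fwd = obs_elevation_deg(phi1, theta);
    double hv_nad = theta;
    double hv_bwd = obs_elevation_deg(phi2, theta);

    double a1 = azimuth_term_deg(phi1, theta);
    double a2 = azimuth_term_deg(phi2, theta);
    double psi_fwd = (theta > 0) ? 180.0 - a1 + delta : delta - a1;
    double psi_nad = (theta > 0) ? 180.0 + delta      : delta;
    double psi_bwd = (theta > 0) ? 180.0 + a2 + delta : a2 + delta;

    printf("forward : h_v = %7.3f  psi_v = %8.3f\n", hv_fwd, psi_fwd);
    printf("nadir   : h_v = %7.3f  psi_v = %8.3f\n", hv_nad, psi_nad);
    printf("backward: h_v = %7.3f  psi_v = %8.3f\n", hv_bwd, psi_bwd);
    return 0;
}
```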
5) Statistically analyse satellite imagery and retrieve, by image inversion, the actual reflectance of the ground targets at the four typical solar terms: the spring equinox, the summer solstice, the autumn equinox and the winter solstice.
6) From the solar elevation angle h_s, the solar azimuth angle ψ_s, the camera observation elevation angle h_v, the camera observation azimuth angle ψ_v and the actual ground-target reflectance ρ obtained in step 5), calculate the entrance-pupil radiance L of each camera for the different target reflectances, and thereby obtain the radiance differences among the three cameras at the different imaging moments.
7) Determine the responsivity differences of the three cameras at identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extract the calibration data of the three cameras under identical imaging parameters and derive the responsivity differences of the cameras at identical radiance.
8) Taking the nadir-viewing camera as the reference, calculate the relative response outputs of the forward- and backward-viewing cameras with respect to the nadir-viewing camera at any imaging moment. The relative response output is determined as follows:
a) obtain, from step 6), the radiance difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
b) obtain, from step 7), the responsivity difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
c) obtain, from the integration times computed in real time from GPS data, the integration-time difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
d) multiply together the radiance difference, the responsivity difference and the integration-time difference obtained in steps a), b) and c) to obtain the final relative response output (a code sketch follows).
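The combination in step 8 is a simple product, sketched below in C. This is not part of the patent; it reads the "differences" of a), b) and c) as ratios relative to the nadir camera, and the numbers in main are hypothetical.

```c
/* Minimal sketch of step 8: relative output of the forward (or backward) camera
 * with respect to the nadir camera, formed as the product of its entrance-pupil
 * radiance ratio (step 6), responsivity ratio (step 7) and integration-time
 * ratio (from the GPS-computed line times). */
#include <stdio.h>

static double relative_output(double radiance_ratio,     /* L_cam / L_nadir       */
                              double responsivity_ratio, /* R_cam / R_nadir       */
                              double tint_ratio)         /* Tint_cam / Tint_nadir */
{
    return radiance_ratio * responsivity_ratio * tint_ratio;
}

int main(void)
{
    /* hypothetical example: the forward camera sees 8% less radiance, has 3%
     * higher responsivity and runs a 2% longer integration time than the nadir
     * camera */
    double rel = relative_output(0.92, 1.03, 1.02);
    printf("forward/nadir relative response output = %.4f\n", rel);  /* ~0.9666 */
    return 0;
}
```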
9) From the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determine the imaging parameters of the nadir-viewing camera at the different imaging moments, and then adjust the imaging parameters of the forward- and backward-viewing cameras according to the relative relationship of step 8); the principle is to adjust the TDI (time delay integration) stage number first and then the camera gain.
The absolute calibration coefficients are determined and the imaging parameters adjusted as follows:
a) From the integrating-sphere radiometric calibration data, first calculate the absolute calibration coefficients k0 and b0 at TDI stage 1, reference gain (i.e. the analog signal is quantized 1:1 at gain 1) and the default integration time (for a mapping camera b0 is generally small and is neglected in the calculation below);
b) from the ratio q of the real-time computed integration time to the default integration time, obtain the absolute calibration coefficient at the current on-orbit integration time, k1 = q·k0;
c) determine the absolute calibration coefficient k2 corresponding to 80% saturated output of the nadir-viewing camera at its entrance-pupil radiance;
d) select a suitable TDI stage number N according to the ratio k2/k1, where N must be less than or equal to k2/k1 (and should be as close to k2/k1 as possible), and then adjust the gain to meet the absolute calibration coefficient requirement (if the requirement cannot be met by gain adjustment alone, the gain and the resulting absolute calibration coefficient are set according to the actual demand). A worked numerical example follows this list.
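To make the procedure concrete, the short C snippet below walks a set of purely hypothetical numbers through steps a)–d); it assumes a 16-stage TDI setting is available, and none of the values are taken from the patent.

```c
/* Worked numerical illustration of a)-d) with hypothetical values: k0 = 0.5 at
 * TDI stage 1 / reference gain / default integration time, integration-time
 * ratio q = 1.2, and k2 = 12.0 for 80% saturated output of the nadir camera. */
#include <stdio.h>

int main(void)
{
    double k0 = 0.5, q = 1.2, k2 = 12.0; /* hypothetical calibration inputs        */
    double k1 = q * k0;                  /* b) k1 = q * k0 = 0.6                   */
    double ratio = k2 / k1;              /* d) k2 / k1 = 20                        */
    int N = 16;                          /* assumed largest available stage <= 20  */
    double gain = ratio / N;             /* gain that meets the coefficient: 1.25  */
    printf("k1 = %.2f  k2/k1 = %.1f  N = %d  gain = %.2f\n", k1, ratio, N, gain);
    return 0;
}
```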
Contents not described in detail in this specification belong to techniques well known to those skilled in the art.

Claims (5)

1. A parameter setting and adjustment method applied to a stereo mapping camera, characterized in that it comprises the following steps:
1) determining the solar elevation angle, the solar azimuth angle, the camera observation azimuth angle and the camera observation elevation angle, wherein the solar elevation angle is the angle between the line connecting the Sun and the ground target and the horizontal plane through the ground target; the solar azimuth angle is the angle between the projection of the Sun–target line onto the Earth's surface and the due-north line through the ground target; the camera observation azimuth angle is the angle between the projection of the camera–target line on the ground and due north; and the camera observation elevation angle is the angle between the camera's Earth-pointing optical axis and the local vertical through the satellite;
2) from the declination ω and the solar hour angle t at the current imaging moment of the stereo mapping camera, calculating the solar elevation angle h_s every 10° of latitude over the range [-90°, 90°];
3) from the solar elevation angle h_s calculated in step 2) together with the declination ω and the solar hour angle t, calculating the solar azimuth angle ψ_s at the different latitudes;
4) from the optical-axis vectors of the three cameras on the cartographic satellite and the inclination of the satellite orbit, calculating the observation elevation angle h_v and observation azimuth angle ψ_v of each camera every 10° of latitude over the range [-90°, 90°];
5) statistically analysing satellite imagery and retrieving, by image inversion, the actual reflectance of the ground targets at the four typical solar terms: the spring equinox, the summer solstice, the autumn equinox and the winter solstice;
6) from the solar elevation angle h_s, the solar azimuth angle ψ_s, the camera observation elevation angle h_v, the camera observation azimuth angle ψ_v and the actual ground-target reflectance ρ obtained in step 5), calculating the entrance-pupil radiance L of each camera for the different target reflectances, and thereby obtaining the radiance differences among the three cameras at the different imaging moments;
7) determining the responsivity differences of the three cameras at identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extracting the calibration data of the three cameras under identical imaging parameters and deriving the responsivity differences of the cameras at identical radiance;
8) taking the nadir-viewing camera as the reference, calculating the relative response outputs of the forward- and backward-viewing cameras with respect to the nadir-viewing camera at any imaging moment, the relative response output being determined as follows:
a) obtaining, from step 6), the radiance difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
b) obtaining, from step 7), the responsivity difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
c) obtaining, from the integration times computed in real time from GPS data, the integration-time difference of the forward- and backward-viewing cameras relative to the nadir-viewing camera;
d) multiplying together the radiance difference, the responsivity difference and the integration-time difference obtained in steps a), b) and c) to obtain the final relative response output;
9) from the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determining the imaging parameters of the nadir-viewing camera at the different imaging moments, and then adjusting the imaging parameters of the forward- and backward-viewing cameras according to the relative relationship of step 8), the principle being to adjust the TDI stage number first and then the camera gain.
2. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formula for the solar elevation angle h_s of step 2) is:

$$\sin h_s = \cos\Phi\cos\omega\cos t + \sin\Phi\sin\omega$$

$$h_s = \arcsin\left\{\cos\Phi\cos\left[23.5\sin\left(\frac{t}{365}\right)\right]\cos t + \sin\Phi\sin\left[23.5\sin\left(\frac{t}{365}\right)\right]\right\}$$

where t ∈ [0°, 131400°]; t starts at 0:00 of day 1 and ends at 24:00 of day 365, advancing 360° every 24 hours, and Φ denotes the latitude.
3. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formula for the solar azimuth angle ψ_s of step 3) is:

$$\psi_s = \begin{cases} 360^\circ - \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k < t < 360k+180 \\ \arccos\left[\dfrac{\sin h_s\sin\omega - \sin\left(23.5\sin\frac{t}{365}\right)}{\cos h_s\cos\omega}\right], & 360k+180 \le t < 360k+360 \end{cases}$$

where k = 0, 1, 2, …, 364.
4. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formulas for the camera observation elevation angle h_v and the camera observation azimuth angle ψ_v of step 4) are as follows: the angle between the forward-viewing camera and the nadir-viewing camera is φ₁, the angle between the backward-viewing camera and the nadir-viewing camera is φ₂, the roll angle of the satellite is θ, and the orbit inclination is δ; the observation elevation angles h_v of the three cameras are then:

Forward-viewing camera: $h_v = \arccos(\cos\phi_1\cos\theta)$

Nadir-viewing camera: $h_v = \theta$

Backward-viewing camera: $h_v = \arccos(\cos\phi_2\cos\theta)$

The observation azimuth angles ψ_v of the cameras are:

Forward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right) + \delta, & \theta > 0 \\ \delta - \arccos\left(\dfrac{\sin\theta\cos\phi_1}{\sqrt{1-(\cos\theta\cos\phi_1)^2}}\right), & \theta < 0 \end{cases}$$

Nadir-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \delta, & \theta > 0 \\ \delta, & \theta < 0 \end{cases}$$

Backward-viewing camera:
$$\psi_v = \begin{cases} 180^\circ + \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta > 0 \\ \arccos\left(\dfrac{\sin\theta\cos\phi_2}{\sqrt{1-(\cos\theta\cos\phi_2)^2}}\right) + \delta, & \theta < 0 \end{cases}$$

Azimuth angles are measured from due north (0°); θ > 0 denotes a roll toward the east and θ < 0 a roll toward the west.
5. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the absolute calibration coefficients of step 9) are determined and the imaging parameters adjusted as follows:
a) from the integrating-sphere radiometric calibration data, first calculating the absolute calibration coefficients k0 and b0 at reference gain, default integration time and TDI stage 1, the reference gain being the gain at which the analog signal is quantized 1:1;
b) from the ratio q of the real-time computed integration time to the default integration time, obtaining the absolute calibration coefficient at the current on-orbit integration time, k1 = q·k0;
c) determining the absolute calibration coefficient k2 corresponding to 80% saturated output of the nadir-viewing camera at its entrance-pupil radiance;
d) selecting a suitable TDI stage number N according to the ratio k2/k1, where N must be less than or equal to k2/k1, and then adjusting the gain to meet the absolute calibration coefficient requirement.
CN201410026118.1A 2014-01-21 2014-01-21 Parameter setting and adjustment method applied to a stereo mapping camera Active CN103776427B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410026118.1A CN103776427B (en) 2014-01-21 2014-01-21 Parameter setting and adjustment method applied to a stereo mapping camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410026118.1A CN103776427B (en) 2014-01-21 2014-01-21 Parameter setting and adjustment method applied to a stereo mapping camera

Publications (2)

Publication Number Publication Date
CN103776427A CN103776427A (en) 2014-05-07
CN103776427B true CN103776427B (en) 2015-11-04

Family

ID=50568953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410026118.1A Active CN103776427B (en) 2014-01-21 2014-01-21 Parameter setting and adjustment method applied to a stereo mapping camera

Country Status (1)

Country Link
CN (1) CN103776427B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105677943A (en) * 2015-12-28 2016-06-15 哈尔滨工业大学 Quantification method of radiation performance indexes of optical mapping camera
US9996905B2 (en) 2016-03-16 2018-06-12 Planet Labs, Inc. Systems and methods for enhancing object visibility for overhead imaging
CN106525002B (en) * 2016-09-28 2019-03-12 北京空间机电研究所 A kind of TDICCD picture moves detection and compensation method
CN107330872A (en) * 2017-06-29 2017-11-07 无锡维森智能传感技术有限公司 Luminance proportion method and apparatus for vehicle-mounted viewing system
CN108896279B (en) * 2018-06-07 2019-08-09 北京空间机电研究所 A kind of autonomous matching test system of super quick dynamic middle imaging space camera integration time
JP7419364B2 (en) * 2019-05-29 2024-01-22 古野電気株式会社 Information processing system, method, and program
CN111639543A (en) * 2020-04-26 2020-09-08 山东科技大学 Hyperspectral remote sensing image wetland classification method based on Markov random field
CN112857306B (en) * 2020-12-31 2022-12-13 航天东方红卫星有限公司 Method for determining continuous solar altitude angle of video satellite at any view direction point

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101982976A (en) * 2010-09-29 2011-03-02 中国科学院国家天文台 System for displaying real-time data of linear-array stereo camera
CN102063558A (en) * 2010-09-10 2011-05-18 航天东方红卫星有限公司 Determination method of imaging condition of agile satellite
CN102081704A (en) * 2011-01-25 2011-06-01 中国科学院国家天文台 Generation method for ontrack-operation injection data of scientific detecting instrument lunar probe
CN102141613A (en) * 2010-12-01 2011-08-03 北京空间机电研究所 Method for determining signal-to-noise ratio of optical remote sensor by combining satellite orbit characteristics


Also Published As

Publication number Publication date
CN103776427A (en) 2014-05-07


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20180709

Address after: 100076 Beijing Fengtai District East Highland Wanyuan Dongli 99

Patentee after: Beijing spaceflight Creative Technology Co., Ltd.

Address before: 100076 Beijing South Fengtai District Road 1 Dahongmen 9201 mailbox 5 boxes

Patentee before: Beijing Research Institute of Space Mechanical & Electrical Technology

TR01 Transfer of patent right