CN103776427A - Parameter setting and adjusting method applied to stereo mapping camera - Google Patents
- Publication number: CN103776427A (application CN201410026118.1A)
- Authority: CN (China)
- Legal status: Granted
Classification
- G01C11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
Abstract
The invention discloses a parameter setting and adjustment method applied to a stereo mapping camera. The method builds on the stereo-imaging model of a three-linear-array mapping camera: from the mounting geometry of each camera and the satellite orbit parameters, it derives the differences in entrance-pupil radiance when the three cameras image the same target under different imaging conditions; combining the cameras' actual responsivities and their differing on-orbit integration times, it derives the differences in their output responses. Setting the imaging parameters of one camera (the nadir-viewing camera) then allows the other two cameras to be adjusted in real time. The method provides dynamic, matched setting of camera imaging parameters at different latitudes and improves the radiometric quality of the stereo image.
Description
Technical field
The present invention relates to a parameter setting and adjustment method applied to a stereo mapping camera, and belongs to the technical field of space optical remote sensor applications.
Background art
Space surveying and mapping is an important component of space-based Earth-observation imaging and a very important class of Earth-imaging satellite mission; it has gradually grown from a component of military space security systems into a component of modern warfighting capability. In the civilian surveying and mapping field, space surveying is likewise an efficient, modern mapping technique.
The mapping camera is the key payload on a mapping satellite. Push-broom CCD mapping cameras have attracted worldwide attention and, according to CCD array configuration and photogrammetric principle, fall into three classes: single-linear-array CCD cameras, spaceborne twin-linear-array mapping cameras, and three-linear-array CCD stereo mapping cameras.
A three-linear-array mapping camera comprises a nadir camera that images vertically downward, a forward-view camera tilted forward, and a rear-view camera tilted backward. Each camera carries a complete optical imaging system and operates independently. As the satellite platform moves along its orbit, the forward-view, nadir, and rear-view cameras image the target area in succession, separated by short intervals, yielding overlapping images of the region. As surveying and mapping technology advances, ever higher requirements are placed on the radiometric and geometric properties of these overlapping images. To guarantee consistent radiometric response of the three cameras imaging the same target, the imaging parameters of the cameras must be adjusted in good time to achieve high-precision image matching.
At present, the calculation of solar elevation considers only the orbit parameters of the satellite, and the solar elevation so obtained from the satellite observation is taken as that of the camera. A mapping satellite, however, generally carries several cameras whose mountings on the satellite differ, so when the different cameras image the same target their imaging instants and observation angles all differ. These differences in observation conditions change the atmospheric propagation path, and therefore the entrance-pupil radiance that the same target presents to the different cameras.
Summary of the invention
The technical problem solved by the present invention is to overcome the deficiencies of the prior art by proposing a parameter setting and adjustment method applied to a stereo mapping camera. From the mounting geometry of each camera and the satellite orbit parameters, the method derives the differences in entrance-pupil radiance when the three cameras image the same target under different imaging conditions; combining the cameras' actual responsivities and the differences in on-orbit integration time, it derives the differences in their output responses. Setting the imaging parameters of one camera thereby allows the other two cameras to be adjusted in real time, achieving consistent mapping output.
The technical solution of the present invention is as follows.
The parameter setting and adjustment method applied to a stereo mapping camera comprises the following steps:
1) Determine the solar elevation angle, solar azimuth, camera observation azimuth, and camera observation elevation. The solar elevation angle is the angle between the sun-to-target line and the horizontal plane through the ground target; the solar azimuth is the angle, at the target, between due north and the projection of the sun-to-target line onto the Earth's surface; the camera observation azimuth is the angle between due north and the ground projection of the camera-to-target line; the camera observation elevation is the angle between the camera's optical axis and the satellite's local vertical.
2) From the solar declination ω and solar hour angle t at the current imaging instant of the stereo mapping camera, calculate the solar elevation h_s at every 10° of latitude in the range [−90°, 90°];
3) From the solar elevation h_s calculated in step 2), the declination ω, and the solar hour angle t, calculate the solar azimuth ψ_s at the different latitudes;
4) From the optical-axis vectors of the three cameras on the mapping satellite (forward-view, rear-view, and nadir cameras) and the inclination of the satellite orbit, calculate the observation elevation h_v and observation azimuth ψ_v of each camera at every 10° of latitude in the range [−90°, 90°];
5) Compile statistics over satellite imagery and, by image inversion, obtain the real reflectance of the ground target at the four representative solar terms: the spring equinox, summer solstice, autumn equinox, and winter solstice;
6) From the solar elevation h_s, solar azimuth ψ_s, camera observation elevation h_v, camera observation azimuth ψ_v, and the real target reflectance ρ obtained in step 5), calculate the entrance-pupil radiance L of each camera for the different target reflectances, and hence the radiance differences among the three cameras at the different imaging instants;
7) Determine the responsivity differences of the three cameras under identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extract the calibration data of the three cameras under identical imaging parameters and derive their responsivity differences under identical radiance;
8) Taking the nadir camera as the reference, calculate the response outputs of the forward- and rear-view cameras relative to the nadir camera at any imaging instant. The relative response output is determined as follows:
a) obtain the radiance difference of the forward- and rear-view cameras relative to the nadir camera from step 6);
b) obtain their responsivity difference relative to the nadir camera from step 7);
c) from the integration times computed in real time from GPS data, obtain their integration-time difference relative to the nadir camera;
d) multiply the radiance difference, responsivity difference, and integration-time difference from steps a), b), and c) cumulatively to obtain the final relative response output.
9) From the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determine the imaging parameters of the nadir camera at the different imaging instants; then, using the relative response from step 8), adjust the imaging parameters of the forward- and rear-view cameras. The principle is to adjust the TDI (Time Delay Integration) stage count first and the camera gain second.
The specific formula for the solar elevation h_s of step 2) is:

sin h_s = cos Φ cos ω cos t + sin Φ sin ω

where t ∈ [0°, 131400°], counted from 0:00 on day 1 to 24:00 on day 365 with t advancing 360° every 24 hours, and Φ is the latitude.
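As a minimal sketch, the elevation formula above can be evaluated directly. The function name and the sample declination (23.45°, near a solstice) are illustrative choices, not values from the patent:

```python
import math

def solar_elevation_deg(latitude_deg, declination_deg, hour_angle_deg):
    """Solar elevation h_s from sin(h_s) = cos(Phi)cos(omega)cos(t) + sin(Phi)sin(omega)."""
    phi = math.radians(latitude_deg)
    omega = math.radians(declination_deg)
    t = math.radians(hour_angle_deg)
    sin_hs = (math.cos(phi) * math.cos(omega) * math.cos(t)
              + math.sin(phi) * math.sin(omega))
    sin_hs = max(-1.0, min(1.0, sin_hs))  # clamp against floating-point overshoot
    return math.degrees(math.asin(sin_hs))

# Step 2): tabulate h_s every 10 deg of latitude over [-90 deg, 90 deg]
table = {lat: solar_elevation_deg(lat, 23.45, 0.0) for lat in range(-90, 91, 10)}
```

At local solar noon (t = 0°) on the equator this reduces to h_s = 90° − ω, i.e. 66.55° for the sample declination.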
The specific formula for the solar azimuth ψ_s of step 3) is: [formula omitted in the source], where k = 0, 1, 2, …, 364.
The specific formulas for the camera observation elevation h_v and observation azimuth ψ_v of step 4) are as follows. Let the intersection angle between the forward-view camera and the nadir camera be φ_1, the intersection angle between the rear-view camera and the nadir camera be φ_2, the roll angle of the satellite be θ (θ > 0 denotes a roll to the east, θ < 0 a roll to the west), and the orbit inclination be δ. The observation elevations h_v of the three cameras are:

Forward-view camera: arccos(cos φ_1 cos θ)
Nadir camera: θ
Rear-view camera: arccos(cos φ_2 cos θ)

The observation azimuths ψ_v of the three cameras are given by the corresponding formulas [omitted in the source], with due north taken as 0°.
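A sketch of the elevation part of these formulas; the 25° intersection angle and 5° roll below are illustrative values, not parameters from the patent:

```python
import math

def observed_elevation_deg(cross_angle_deg, roll_deg):
    """Observation elevation h_v = arccos(cos(phi_i) * cos(theta)) for the
    forward- and rear-view cameras; the nadir camera's h_v equals the roll theta."""
    phi = math.radians(cross_angle_deg)
    theta = math.radians(roll_deg)
    return math.degrees(math.acos(math.cos(phi) * math.cos(theta)))

h_fore = observed_elevation_deg(25.0, 0.0)   # zero roll: reduces to phi_1 = 25 deg
h_aft = observed_elevation_deg(25.0, 5.0)    # a 5 deg roll tilts the line of sight further
h_nadir = 5.0                                # nadir camera: h_v = theta
```

With zero roll the formula collapses to the camera's fixed intersection angle, which matches the geometric picture of Fig. 3.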
The absolute-calibration parameter determination and imaging-parameter adjustment of step 9) proceed as follows:
a) from the integrating-sphere radiometric calibration data, first compute the absolute calibration coefficients k0 and b0 at TDI stage 1, reference gain, and default integration time;
b) from the ratio q of the integration time computed in real time on orbit to the default integration time, obtain the absolute calibration coefficient at the current integration time, k1 = q·k0;
c) determine the absolute calibration coefficient k2 corresponding to 80%-saturated output at the nadir camera's entrance-pupil radiance;
d) from the value of k2/k1, select a suitable TDI stage count N (N must be less than or equal to k2/k1), then adjust the gain to satisfy the absolute-calibration-coefficient requirement.
Compared with the prior art, the advantages of the present invention are:
(1) It provides an accurate stereo-imaging model of the three-linear-array mapping camera; the entrance-pupil radiance calculation accounts for the combined influence of the solar elevation and azimuth and the camera elevation and azimuth.
(2) It provides the response differences of the three cameras under different imaging conditions and the corresponding parameter-adjustment scheme, guaranteeing consistent radiometric characteristics of the images output by the different cameras.
(3) It is the first on-orbit imaging-parameter setting method proposed for a three-linear-array stereo mapping camera; it can be applied directly to the on-orbit operation of stereo mapping cameras to guarantee image quality, and also to future agile-imaging cameras.
(4) The on-orbit imaging-parameter optimization for the TDI-CCD camera is convenient to implement in software, for example in Matlab or C.
Brief description of the drawings:
Fig. 1 is the flow chart of on-orbit imaging-parameter adjustment of the stereo mapping camera of the present invention;
Fig. 2 is a schematic diagram of the elevation- and azimuth-angle definitions of the present invention;
Fig. 3 is a schematic diagram of the observation angles of the stereo mapping camera of the present invention.
Embodiment
The structural composition and working principle of the present invention are further explained below with reference to the drawings.
As shown in Fig. 1, the steps of the parameter setting and adjustment method applied to a stereo mapping camera according to the present invention are as follows:
1) Determine the solar elevation angle, solar azimuth, camera observation azimuth, and camera observation elevation. The solar elevation angle is the angle between the sun-to-target line and the horizontal plane through the ground target; the solar azimuth is the angle, at the target, between due north and the projection of the sun-to-target line onto the Earth's surface; the camera observation azimuth is the angle between due north and the ground projection of the camera-to-target line; the camera observation elevation is the angle between the camera's optical axis and the satellite's local vertical.
Figs. 2 and 3 illustrate the direction of solar illumination when the forward-view camera images a surface target P. h_s denotes the solar elevation, the angle between the line joining the sun (treated as a point) to an arbitrary surface point P and the horizontal plane through P. ψ_s denotes the solar azimuth, the angle between due north at P and the projection onto the Earth's surface of the line joining the sun to P (azimuths are uniformly measured from due north as 0°, increasing counterclockwise from north through west over one full turn from 0° to 360°). The solar elevation and azimuth characterize the sun's position and determine the direction of incident sunlight at any surface point. ψ_v1 is the observation azimuth of the forward-view camera, the angle between due north and the ground projection of the line joining the forward-view camera to the target; h_v1, the angle between the forward-view camera's optical axis and the local vertical, is the camera's observation elevation.
2) From the solar declination ω and solar hour angle t at the current imaging instant of the stereo mapping camera, calculate the solar elevation h_s at every 10° of latitude in the range [−90°, 90°].
The specific formula for the solar elevation h_s is:

sin h_s = cos Φ cos ω cos t + sin Φ sin ω

where t ∈ [0°, 131400°], counted from 0:00 on day 1 to 24:00 on day 365 with t advancing 360° every 24 hours, and Φ is the latitude.
3) From the solar elevation h_s calculated in step 2), the declination ω, and the solar hour angle t, calculate the solar azimuth ψ_s at the different latitudes.
The specific formula for ψ_s is: [formula omitted in the source], where k = 0, 1, 2, …, 364.
4) From the optical-axis vectors of the three cameras on the mapping satellite (forward-view, rear-view, and nadir cameras) and the inclination of the satellite orbit, calculate the observation elevation h_v and observation azimuth ψ_v of each camera at every 10° of latitude in the range [−90°, 90°].
The specific formulas are as follows. Let the intersection angle between the forward-view camera and the nadir camera be φ_1, the intersection angle between the rear-view camera and the nadir camera be φ_2, the roll angle of the satellite be θ (θ > 0 denotes a roll to the east, θ < 0 a roll to the west), and the orbit inclination be δ. The observation elevations h_v of the three cameras are:

Forward-view camera: arccos(cos φ_1 cos θ)
Nadir camera: θ
Rear-view camera: arccos(cos φ_2 cos θ)

The observation azimuths ψ_v of the three cameras are given by the corresponding formulas [omitted in the source], with due north taken as 0°.
5) Compile statistics over satellite imagery and, by image inversion, obtain the real reflectance of the ground target at the four representative solar terms: the spring equinox, summer solstice, autumn equinox, and winter solstice.
6) From the solar elevation h_s, solar azimuth ψ_s, camera observation elevation h_v, camera observation azimuth ψ_v, and the real target reflectance ρ obtained in step 5), calculate the entrance-pupil radiance L of each camera for the different target reflectances, and hence the radiance differences among the three cameras at the different imaging instants.
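The patent's full entrance-pupil radiance calculation involves the atmospheric propagation path; the sketch below reduces it to a single-transmittance Lambertian model, L = ρ·E·sin(h_s)·τ/π, which is an assumption made here for illustration. The solar-constant value and the per-camera transmittances are likewise illustrative:

```python
import math

def entrance_pupil_radiance(reflectance, solar_elevation_deg, e_sun=1361.0, tau=1.0):
    """Simplified Lambertian entrance-pupil radiance (W m^-2 sr^-1):
    L = rho * E * sin(h_s) * tau / pi.  The atmospheric path effects the
    patent accounts for are collapsed into a single transmittance tau."""
    hs = math.radians(solar_elevation_deg)
    return reflectance * e_sun * math.sin(hs) * tau / math.pi

# Radiance difference between two cameras viewing the same target through
# different (hypothetical) atmospheric path transmittances:
L_fore = entrance_pupil_radiance(0.3, 45.0, tau=0.75)
L_nadir = entrance_pupil_radiance(0.3, 45.0, tau=0.80)
ratio = L_fore / L_nadir
```

The longer slant path of the tilted camera (lower τ here) yields the lower entrance-pupil radiance, which is exactly the per-camera difference step 6) quantifies.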
7) Determine the responsivity differences of the three cameras under identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extract the calibration data of the three cameras under identical imaging parameters and derive their responsivity differences under identical radiance.
8) Taking the nadir camera as the reference, calculate the response outputs of the forward- and rear-view cameras relative to the nadir camera at any imaging instant. The relative response output is determined as follows:
a) obtain the radiance difference of the forward- and rear-view cameras relative to the nadir camera from step 6);
b) obtain their responsivity difference relative to the nadir camera from step 7);
c) from the integration times computed in real time from GPS data, obtain their integration-time difference relative to the nadir camera;
d) multiply the radiance difference, responsivity difference, and integration-time difference from steps a), b), and c) cumulatively to obtain the final relative response output.
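Step d) above is a plain cumulative product of the three per-step ratios; the numeric values below are hypothetical, chosen only to show the shape of the calculation:

```python
def relative_response(radiance_ratio, responsivity_ratio, integration_ratio):
    """Step d): the forward- or rear-view camera's output relative to the nadir
    camera is the cumulative product of the ratios from steps a), b), and c)."""
    return radiance_ratio * responsivity_ratio * integration_ratio

# Hypothetical ratios: 6% less radiance, 5% higher responsivity, 10% shorter
# integration time relative to the nadir camera.
rel = relative_response(0.94, 1.05, 0.90)
```

The resulting factor (here about 0.89) is what step 9) compensates for when adjusting the forward- and rear-view cameras' TDI stages and gain.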
9) From the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determine the imaging parameters of the nadir camera at the different imaging instants; then, using the relative response from step 8), adjust the imaging parameters of the forward- and rear-view cameras. The principle is again to adjust the TDI (Time Delay Integration) stage count first and the camera gain second.
The absolute-calibration parameter determination and imaging-parameter adjustment proceed as follows:
a) from the integrating-sphere radiometric calibration data, first compute the absolute calibration coefficients k0 and b0 at TDI stage 1, reference gain (a 1:1 quantization of the analog signal), and default integration time (b0 of a mapping camera is generally small and may be neglected in the calculations below);
b) from the ratio q of the integration time computed in real time on orbit to the default integration time, obtain the absolute calibration coefficient at the current integration time, k1 = q·k0;
c) determine the absolute calibration coefficient k2 corresponding to 80%-saturated output at the nadir camera's entrance-pupil radiance;
d) from the value of k2/k1, select a suitable TDI stage count N: N must be less than or equal to k2/k1, and as close to it as possible; then adjust the gain to satisfy the absolute-calibration-coefficient requirement (where one gain adjustment cannot satisfy it, the gain is adjusted further; the required absolute calibration coefficient is determined by the application).
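A sketch of the adjustment logic in steps a)–d); the coefficient values are hypothetical, and b0 is neglected as the text allows:

```python
import math

def choose_tdi_and_gain(k0, t_current, t_default, k2):
    """Step 9) sketch: pick the TDI stage count N as the largest integer not
    exceeding k2/k1 (so N*k1 stays below the 80%-saturation coefficient k2),
    then a residual gain that closes the remaining gap.  b0 is neglected."""
    q = t_current / t_default
    k1 = q * k0                      # calibration coefficient at the current integration time
    n = max(1, math.floor(k2 / k1))  # TDI stages: <= k2/k1 and as close to it as possible
    gain = k2 / (n * k1)             # residual gain adjustment
    return n, gain

# Hypothetical numbers: halved integration time doubles the headroom.
n, gain = choose_tdi_and_gain(k0=2.0, t_current=1.0, t_default=2.0, k2=9.5)
```

Adjusting TDI stages first and gain second matches the stated principle: the integer stage count does the coarse matching and the gain the fine matching.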
Matters not described in detail in this specification belong to techniques well known to those skilled in the art.
Claims (5)
1. A parameter setting and adjustment method applied to a stereo mapping camera, characterized by the following steps:
1) determine the solar elevation angle, solar azimuth, camera observation azimuth, and camera observation elevation; the solar elevation angle is the angle between the sun-to-target line and the horizontal plane through the ground target; the solar azimuth is the angle, at the target, between due north and the projection of the sun-to-target line onto the Earth's surface; the camera observation azimuth is the angle between due north and the ground projection of the camera-to-target line; the camera observation elevation is the angle between the camera's optical axis and the satellite's local vertical;
2) from the solar declination ω and solar hour angle t at the current imaging instant of the stereo mapping camera, calculate the solar elevation h_s at every 10° of latitude in the range [−90°, 90°];
3) from the solar elevation h_s calculated in step 2), the declination ω, and the solar hour angle t, calculate the solar azimuth ψ_s at the different latitudes;
4) from the optical-axis vectors of the three cameras on the mapping satellite and the inclination of the satellite orbit, calculate the observation elevation h_v and observation azimuth ψ_v of each camera at every 10° of latitude in the range [−90°, 90°];
5) compile statistics over satellite imagery and, by image inversion, obtain the real reflectance of the ground target at the four representative solar terms: the spring equinox, summer solstice, autumn equinox, and winter solstice;
6) from the solar elevation h_s, solar azimuth ψ_s, camera observation elevation h_v, camera observation azimuth ψ_v, and the real target reflectance ρ obtained in step 5), calculate the entrance-pupil radiance L of each camera for the different target reflectances, and hence the radiance differences among the three cameras at the different imaging instants;
7) determine the responsivity differences of the three cameras under identical radiance and identical camera parameters: from the integrating-sphere radiometric calibration data, extract the calibration data of the three cameras under identical imaging parameters and derive their responsivity differences under identical radiance;
8) taking the nadir camera as the reference, calculate the response outputs of the forward- and rear-view cameras relative to the nadir camera at any imaging instant; the relative response output is determined as follows:
a) obtain the radiance difference of the forward- and rear-view cameras relative to the nadir camera from step 6);
b) obtain their responsivity difference relative to the nadir camera from step 7);
c) from the integration times computed in real time from GPS data, obtain their integration-time difference relative to the nadir camera;
d) multiply the radiance difference, responsivity difference, and integration-time difference from steps a), b), and c) cumulatively to obtain the final relative response output;
9) from the absolute calibration coefficients obtained from the integrating-sphere radiometric calibration data, determine the imaging parameters of the nadir camera at the different imaging instants; then, using the relative response from step 8), adjust the imaging parameters of the forward- and rear-view cameras, adjusting the TDI stage count first and the camera gain second.
2. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formula for the solar elevation h_s of step 2) is:

sin h_s = cos Φ cos ω cos t + sin Φ sin ω

where t ∈ [0°, 131400°], counted from 0:00 on day 1 to 24:00 on day 365 with t advancing 360° every 24 hours, and Φ is the latitude.
3. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formula for the solar azimuth ψ_s of step 3) is: [formula omitted in the source], where k = 0, 1, 2, …, 364.
4. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the specific formulas for the camera observation elevation h_v and observation azimuth ψ_v of step 4) are as follows: the intersection angle between the forward-view camera and the nadir camera is φ_1, the intersection angle between the rear-view camera and the nadir camera is φ_2, the roll angle of the satellite is θ, and the orbit inclination is δ; the observation elevations h_v of the three cameras are:

Forward-view camera: arccos(cos φ_1 cos θ)
Nadir camera: θ
Rear-view camera: arccos(cos φ_2 cos θ)

and the observation azimuths ψ_v of the three cameras are given by the corresponding formulas [omitted in the source], where due north is taken as 0°, θ > 0 denotes a roll to the east, and θ < 0 a roll to the west.
5. The parameter setting and adjustment method applied to a stereo mapping camera according to claim 1, characterized in that the absolute-calibration parameter determination and imaging-parameter adjustment of step 9) are as follows:
a) from the integrating-sphere radiometric calibration data, first compute the absolute calibration coefficients k0 and b0 at TDI stage 1, reference gain, and default integration time, the reference gain being a 1:1 quantization of the analog signal;
b) from the ratio q of the integration time computed in real time on orbit to the default integration time, obtain the absolute calibration coefficient at the current integration time, k1 = q·k0;
c) determine the absolute calibration coefficient k2 corresponding to 80%-saturated output at the nadir camera's entrance-pupil radiance.
Publication and priority
- Application CN201410026118.1A, filed 2014-01-21 (priority date 2014-01-21)
- Published as CN103776427A on 2014-05-07; granted as CN103776427B on 2015-11-04 (status: Active)
Legal events
- Publication; entry into substantive examination; patent granted.
- 2018-07-09: patent right transferred from Beijing Research Institute of Space Mechanical & Electrical Technology to Beijing spaceflight Creative Technology Co., Ltd.