CN110702228B - Edge radiation correction method for aviation hyperspectral image - Google Patents

Edge radiation correction method for aviation hyperspectral image Download PDF

Info

Publication number
CN110702228B
CN110702228B CN201910912897.8A CN201910912897A CN110702228B CN 110702228 B CN110702228 B CN 110702228B CN 201910912897 A CN201910912897 A CN 201910912897A CN 110702228 B CN110702228 B CN 110702228B
Authority
CN
China
Prior art keywords
sensor
correction
radiation
angle
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910912897.8A
Other languages
Chinese (zh)
Other versions
CN110702228A (en
Inventor
谭琨
牛超
王雪
杜培军
丁建伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
East China Normal University
Original Assignee
East China Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by East China Normal University filed Critical East China Normal University
Priority to CN201910912897.8A priority Critical patent/CN110702228B/en
Publication of CN110702228A publication Critical patent/CN110702228A/en
Application granted granted Critical
Publication of CN110702228B publication Critical patent/CN110702228B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J2003/283Investigating the spectrum computer-interfaced
    • G01J2003/2843Processing for eliminating interfering spectra

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an edge radiation correction method for an aviation hyperspectral image, belongs to the field of hyperspectral image correction, and is suitable for correcting edge radiation distortion of aviation hyperspectral data, eliminating the radiation brightness gradient phenomenon among different strips caused by the ground surface two-way reflection effect and the radiation attenuation effect, and realizing hyperspectral image edge correction and image stitching under the conditions of multiple strips and different imaging time. The method comprises the following steps: a. laboratory testing and radiometric calibration of the sensor; b. reading attitude parameters of a sensor and correcting exposure time; c. solving a radiation attenuation coefficient; d. solving a BRDF correction coefficient; e. and (5) edge radiation correction of the image. The edge radiation correction method based on the surface two-direction reflection effect and the radiation attenuation effect caused by the change of the visual angle, the attitude and the zenith angle of the sensor is constructed, the edge radiation correction is carried out on the image data, and the edge radiation distortion of the aviation hyperspectral image is eliminated.

Description

Edge radiation correction method for aviation hyperspectral image
Technical Field
The invention belongs to the field of hyperspectral image correction, and relates to an edge radiation correction method for an aerial hyperspectral image, which is suitable for correcting edge radiation distortion of aerial hyperspectral data, eliminating the radiation brightness gradient phenomenon among different strips caused by the ground surface two-way reflection effect and the radiation attenuation effect, and realizing the edge correction and image splicing of the hyperspectral image under multiple strips and different imaging times.
Background
The aviation hyperspectral remote sensing has high space and high spectral resolution, and plays a very important role in the aspect of regional ecological environment evaluation. However, in the imaging process of the aerial image, due to the influence of the visual angle, the irradiance, the bidirectional reflection distribution function, the radiation attenuation effect and the like of the sensor, the edge of the image has radiation distortion, so that the radiation of the same ground object between adjacent strips is inconsistent, the edge radiation correction needs to be carried out on the image, and accurate spectral data are provided for the construction of models for fine ground object classification, soil heavy metal inversion, vegetation physical and chemical parameter inversion and the like by using aerial hyperspectral data.
In the study of edge radiation correction, distortion caused by the Bidirectional Reflectance Distribution Function (BRDF) effect is a focus of attention. Typically, the impact of BRDF is eliminated by building empirical and semi-empirical models. The empirical model mainly considers the statistical characteristics of the image, establishes a brightness coefficient related to the visual angle through least square fitting, and corrects the brightness coefficient by using the minimum value. But is generally easily ignored for radiation attenuation effects, which are typically caused by radiation path differences due to changes in sensor attitude.
Disclosure of Invention
The invention aims to provide an edge radiation correction method for an aerial hyperspectral image, which eliminates the radiation brightness gradient phenomenon among different strips caused by the ground surface two-way reflection effect and the radiation attenuation effect and realizes the edge correction and image splicing of the hyperspectral image under multiple strips and different imaging time.
The specific technical scheme for realizing the purpose of the invention is as follows:
an edge radiation correction method for an aerial hyperspectral image comprises the following specific steps:
step 1: performing laboratory testing and radiometric calibration of sensors
The laboratory test content comprises wavelength calibration and sensitivity test; the wavelength calibration is carried out by using a monochromator to calibrate, so that the wavelength is accurately calibrated; the sensitivity test uses an integrating sphere to set different integrating sphere powers and integrating times, an integrating sphere spectrum curve under corresponding power and time is obtained when the sensor irradiates the integrating sphere, and the sensitivity test is completed by comparing the curve change; finally, acquiring the radiometric calibration coefficient of the sensor by using the energy level data acquired by the integrating sphere and the image data acquired by the sensor;
step 2: sensor attitude parameter reading and exposure time correction
Reading attitude information acquired by a sensor in real time in the flight process, and using the attitude information as known data for subsequent calculation; in addition, in the hyperspectral image acquisition process, under the influence of illumination intensity and the digital digit of the sensor, different exposure time is adopted at different time nodes, and the exposure time is corrected according to the ratio of the normalized exposure time to the set exposure time; the exposure time correction coefficient e is calculated as follows:
Figure BDA0002215228690000021
in the formula (I), the compound is shown in the specification,
Figure BDA0002215228690000022
setting the exposure time to be 13.72ms according to the actual experimental condition for normalizing the exposure time; t is exposure time set by different time nodes;
and step 3: radiation attenuation coefficient calculation
The influence of the flight attitude obtained in the step 2 on the radiation energy transmission path is compared and analyzed to obtain the pitch angle theta of the sensorpAngle of roll thetarAngle of view of sensor thetaiVarying radiation path difference Δ Hθi
Figure BDA0002215228690000023
Radiation path differences under different visual angles are constructed by selecting pos data of 500 lines, and a mean value is calculated to draw a fitting line. (ii) a The radiation path difference corresponding to the visual angle of the sensor forms a unitary quadratic linear correlation, and the sensor detects theta according to the Bougner-Lanmert transmission lawiThe radiation intensity after atmospheric attenuation at the viewing angle is:
Figure BDA0002215228690000024
wherein λ is the wavelength, Ls(lambda) is the emittance of the ground object, H is the flying height, Delta HθiCalculating the radiation path difference according to the attitude of the sensor; the correction term μ, which finally takes into account the radiation attenuation coefficient, is as follows:
Figure BDA0002215228690000025
in the formula, b0,b1,b2Is the model coefficient;
and 4, step 4: BRDF correction factor solution
A unitary quartic empirical model is established by introducing BRDF correction coefficients and radiation attenuation coefficients, and a fitting formula of average radiance values and the empirical model under different visual angles is established as follows:
Figure BDA0002215228690000026
in the formula [ theta ]iThe visual angle of the sensor is in the range of-17 degrees to 17 degrees;
Figure BDA0002215228690000027
is thetaiAverage radiance value at viewing angle, a0,a1,a2,a3,a4,b0,b1,b2Is the model coefficient; let the fitting model formula be f (theta)i) The BRDF correction coefficient c at different viewing angles is:
Figure BDA0002215228690000028
and 5: edge radiation correction of images
Adding the sun zenith angle and the sensor zenith angle in the correction function as a correctionA positive term, comprehensively considering the description of the directional reflection characteristics of the ground objects in the Hapke model and the Lommel-Seeliger function, and introducing a Lommel-Seeliger factor
Figure BDA0002215228690000031
The final correction function is obtained as follows:
Figure BDA0002215228690000032
wherein e is an exposure time correction coefficient, θiFrom the sensor perspective, czFor BRDF correction coefficients at different wave bands, Lzi) Is Z wave band thetaiThe corrected front radiance under the visual angle, alpha and beta are respectively a sun zenith angle and a sensor zenith angle;
Figure BDA0002215228690000033
the normalized sun zenith angle is 40 degrees, and the normalized sun zenith angle is obtained by calculating the average zenith angle of all the flight belts;
Figure BDA0002215228690000034
the normalized zenith angle of the sensor is 0 degree; and calculating the original radiance data line by line according to the correction function to obtain an image with edge radiation distortion eliminated.
The invention has the beneficial effects that:
the invention comprehensively considers the radiation distortion caused by the BRDF effect and the radiation path difference, and fits to obtain the corresponding relation between the radiance and the visual angle of the sensor, thereby realizing the elimination of the edge radiation distortion of the aviation hyperspectral image. The method has the advantages of small operand, high precision and the like, achieves good results in the aspect of edge radiation correction of the aerial high-spectrum image, eliminates the strip edge difference phenomenon of the high-spectrum image, and realizes seamless mosaic of a plurality of strips of high-spectrum images.
Drawings
FIG. 1 is a graph of wavelength scaling results;
FIG. 2 is a graph of sensitivity test results;
FIG. 3 is a radiometric calibration flow diagram;
FIG. 4 is a diagram of sensor pitch change attitude;
FIG. 5 is a diagram of roll angle variation of the sensor;
FIG. 6 is a fitting graph of radiation path difference corresponding to a sensor view angle;
FIG. 7 is a comparison graph of radiance curves of the same image points of the images before and after edge radiance correction;
FIG. 8 is a diagram of the result of full image stitching before edge radiance correction;
FIG. 9 is a graph of the result of image full stitching after edge radiance correction.
Detailed description of the preferred embodiments
The invention is described in detail below with reference to the accompanying drawings and examples.
Examples
In this embodiment, edge radiation correction of an image is performed by taking the processing of a HeadWall airborne hyperspectral sensor in a great victory mining area of inner Mongolia tin forest as an example, and the specific steps are as follows:
the method comprises the following steps: performing laboratory testing and radiometric calibration of sensors
The laboratory test content mainly comprises wavelength calibration, sensitivity test and the like. The wavelength calibration is carried out by using a monochromator for calibration, the central wavelength and the bandwidth of each wave band are obtained, and the wavelength is accurately calibrated by analyzing and comparing the output wavelength of the monochromator and the wave band corresponding to the hyperspectral sensor for linear fitting. The instrument sensitivity contrast test is to contrast the spectral curve of the sensor when irradiating the integrating sphere, and complete the sensitivity contrast test by setting the integrating sphere power and the integrating time. The wavelength calibration test result is as shown in figure 1, and the wavelength calibration of the sensor by the monochromator band by band can be seen through a chart, so that the wavelength information of the sensor is accurately obtained; the sensitivity test result is shown in figure 2, and it can be seen that the sensor has better response to different integration powers and integration times, and the sensitivity of the sensor meets the experimental requirements.
For radiometric calibration of the sensor, a standard radiation surface light source with high precision and stability is required, and an integrating sphere system is generally adopted. The integrating sphere system comprises an integrating sphere, a standard lamp and a controller, and because the image data acquired by the spectrometer can generate supersaturation, the output power and the integration time of the standard lamp need to be set. And performing linear fitting band by comparing energy level data of the integrating sphere, namely the relation between the input radiance value under corresponding power and integrating time and the output DN value of the spectrometer, so as to finish the radiometric calibration of the spectrometer. The specific radiometric calibration flow chart is as shown in figure 3, the integrating sphere system acquires energy level data, the sensor acquires image data and dark current information, and the acquisition of the radiometric calibration coefficient of the sensor is completed.
Step two: sensor attitude parameter reading and exposure time correction
And reading POS data acquired by the sensor in real time to acquire real-time attitude information of the sensor, and solving a correction coefficient. In addition, in the hyperspectral image acquisition process, different exposure times are adopted at different time nodes under the influence of illumination intensity and the digital digit of the instrument, and the correction of the exposure times is required. The exposure time correction coefficient e is calculated as follows:
Figure BDA0002215228690000041
step three: radiation attenuation coefficient calculation
And acquiring position and attitude information of the sensor at the moment of flight through the second step, wherein the change of the attitude of the sensor can cause the scanning center image point of the sensor to generate position offset with the off-board point, so that the distance of the sensor for receiving the earth surface reflected radiation energy is changed, radiation attenuation of different degrees occurs, and further edge radiation distortion is caused. The influence of the flight attitude of the airplane on the radiation energy transmission path is compared and analyzed, and the radiation path difference caused by the pitch angle is obtained
Figure BDA0002215228690000042
H is altitude, thetapIs the sensor pitch angle. The difference in radiation paths caused by the roll angle is
Figure BDA0002215228690000043
Wherein H isHigh, thetarIs the sensor roll angle and θ is the sensor view angle. Obtaining the final radiation path difference
Figure BDA0002215228690000044
The attitude change diagram of the sensor is shown in the attached figures 4 and 5;
radiation path differences at different viewing angles are constructed by selecting 500 lines of pos data, and a mean value drawing fit line is calculated, as shown in fig. 6. It can be seen that the radiation path difference corresponding to the visual angle of the sensor forms a unitary quadratic linear correlation, and the sensor detects theta according to the Bougner-Lanmert transmission lawiThe radiation intensity after atmospheric attenuation at the viewing angle is:
Figure BDA0002215228690000051
wherein λ is the wavelength, Ls(lambda) is the emittance of the ground object, H is the flying height, Delta HθiIs the radiation path difference calculated by the equation. The correction term μ, which finally takes into account the radiation attenuation coefficient, is as follows:
Figure BDA0002215228690000052
step four: BRDF correction factor solution
An empirical model is established by introducing a BRDF correction coefficient and a radiation attenuation coefficient, and a unitary quartic model function of radiance is established, wherein the formula is as follows:
Figure BDA0002215228690000053
where θ is the sensor viewing angle, taking the HeadWall a sensor as an example, the viewing angle is 34 °, and the viewing angle θ ranges from-17 ° to 17 °.
Figure BDA0002215228690000054
Is thetaiAverage radiance value at viewing angle, a0,a1,a2,a3,a4,b0,b1,b2Are model coefficients.
Let the fitting model formula be f (theta)i) The BRDF correction coefficient c under different visual angles can be obtained as follows:
Figure BDA0002215228690000055
step five: edge radiation correction of images
Adding a sun zenith angle and a sensor zenith angle as correction terms, comprehensively considering the description of the directional reflection characteristics of the ground object in a Hapke model and a Lommel-Seeliger function, and introducing a Lommel-Seeliger factor
Figure BDA0002215228690000056
The final correction function is obtained as follows:
Figure BDA0002215228690000057
wherein e is an exposure time correction coefficient, θiFrom the sensor perspective, czFor BRDF correction coefficients at different wave bands, Lzi) In order to correct the front radiance, alpha and beta are respectively a sun zenith angle and a sensor zenith angle;
Figure BDA0002215228690000058
is a normalized sun zenith angle of 40 degrees and is obtained by calculating the average zenith angle of all the air strips.
Figure BDA0002215228690000059
Is a normalized sensor zenith angle of 0 deg.. And calculating the original radiance data line by line according to the correction function to obtain an image with edge radiation distortion eliminated. The radiance curves of the adjacent strips with the same name image points before and after correction are compared with the radiance curve of the image points with the same name before and after correction shown in the attached figure 7, so that the radiance curves of the vegetation of two different land features and the sandy soil before and after correction are compared. It can be obviously seen that the radiance curve before correction has obvious difference, and after correction said difference phenomenonIs eliminated. Image splicing is carried out on each strip, a corrected image full-splicing result graph is obtained and is shown in an attached figure 8, a large amount of edge radiation distortion exists, and the strip phenomenon is obvious. The image full-stitching result after correction is shown in figure 9, and it can be seen that the correction method of the invention eliminates edge radiation distortion, eliminates the brightness gradient of the image before correction and the obvious brightness difference between the strips, and completes the seamless mosaic of the image.

Claims (1)

1. An edge radiation correction method for an aerial hyperspectral image is characterized by comprising the following specific steps:
step 1: performing laboratory testing and radiometric calibration of sensors
The laboratory test content comprises wavelength calibration and sensitivity test; the wavelength calibration is carried out by using a monochromator to calibrate, so that the wavelength is accurately calibrated; the sensitivity test uses an integrating sphere to set different integrating sphere powers and integrating times, an integrating sphere spectrum curve under corresponding power and time is obtained when the sensor irradiates the integrating sphere, and the sensitivity test is completed by comparing the curve change; finally, acquiring the radiometric calibration coefficient of the sensor by using the energy level data acquired by the integrating sphere and the image data acquired by the sensor;
step 2: sensor attitude parameter reading and exposure time correction
Reading attitude information acquired by a sensor in real time in the flight process, and using the attitude information as known data for subsequent calculation; in addition, in the hyperspectral image acquisition process, under the influence of illumination intensity and the digital digit of the sensor, different exposure time is adopted at different time nodes, and the exposure time is corrected according to the ratio of the normalized exposure time to the set exposure time; the exposure time correction coefficient e is calculated as follows:
Figure FDA0003032443660000011
in the formula (I), the compound is shown in the specification,
Figure FDA0003032443660000012
in order to normalize the exposure time, t is the exposure time set by different time nodes;
and step 3: radiation attenuation coefficient calculation
The influence of the flight attitude obtained in the step 2 on the radiation energy transmission path is compared and analyzed to obtain the pitch angle theta of the sensorpAngle of roll thetarAngle of view of sensor thetaiVarying radiation path difference Δ Hθi
Figure FDA0003032443660000013
Constructing radiation path differences under different visual angles by selecting 500 lines of pos data, and calculating a mean value to draw a fitting line; the radiation path difference corresponding to the visual angle of the sensor forms a unitary quadratic linear correlation, and the sensor detects theta according to the Bougner-Lanmert transmission lawiThe radiation intensity after atmospheric attenuation at the viewing angle is:
Figure FDA0003032443660000014
wherein λ is the wavelength, Ls(lambda) is the emittance of the ground object, H is the flying height, Delta HθiCalculating the radiation path difference according to the attitude of the sensor; the correction term μ, which finally takes into account the radiation attenuation coefficient, is as follows:
Figure FDA0003032443660000015
in the formula, b0,b1,b2Is the model coefficient;
and 4, step 4: BRDF correction factor solution
A unitary quartic empirical model is established by introducing BRDF correction coefficients and radiation attenuation coefficients, and a fitting formula of average radiance values and the empirical model under different visual angles is established as follows:
Figure FDA0003032443660000021
in the formula [ theta ]iThe visual angle of the sensor is in the range of-17 degrees to 17 degrees;
Figure FDA0003032443660000022
is thetaiAverage radiance value at viewing angle, a0,a1,a2,a3,a4,b0,b1,b2Is the model coefficient; after the model coefficients are obtained, the model coefficients can be obtained
Figure FDA0003032443660000023
Is called as fitting model formula f (theta)i) The BRDF correction coefficient c at different viewing angles is:
Figure FDA0003032443660000024
and 5: edge radiation correction of images
Adding a sun zenith angle and a sensor zenith angle into a correction function as correction terms, comprehensively considering the description of the directional reflection characteristics of the ground objects in a Hapke model and a Lommel-Seeliger function, and introducing a Lommel-Seeliger factor
Figure FDA0003032443660000025
The final correction function is obtained as follows:
Figure FDA0003032443660000026
wherein e is an exposure time correction coefficient, θiFrom the sensor perspective, czFor BRDF correction coefficients at different wave bands, Lzi) Is Z wave band thetaiFront radiance correction at viewing angleDegree, alpha and beta are respectively a sun zenith angle and a sensor zenith angle;
Figure FDA0003032443660000027
the normalized sun zenith angle is 40 degrees, and the normalized sun zenith angle is obtained by calculating the average zenith angle of all the flight belts;
Figure FDA0003032443660000028
the normalized zenith angle of the sensor is 0 degree; and calculating the original radiance data line by line according to the correction function to obtain an image with edge radiation distortion eliminated.
CN201910912897.8A 2019-09-25 2019-09-25 Edge radiation correction method for aviation hyperspectral image Active CN110702228B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910912897.8A CN110702228B (en) 2019-09-25 2019-09-25 Edge radiation correction method for aviation hyperspectral image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910912897.8A CN110702228B (en) 2019-09-25 2019-09-25 Edge radiation correction method for aviation hyperspectral image

Publications (2)

Publication Number Publication Date
CN110702228A CN110702228A (en) 2020-01-17
CN110702228B true CN110702228B (en) 2021-06-25

Family

ID=69196796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910912897.8A Active CN110702228B (en) 2019-09-25 2019-09-25 Edge radiation correction method for aviation hyperspectral image

Country Status (1)

Country Link
CN (1) CN110702228B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112215765A (en) * 2020-09-08 2021-01-12 北京农业智能装备技术研究中心 Robot vision color correction method and device under agricultural natural light environment
CN114792327B (en) * 2022-06-23 2022-11-04 中国科学院空天信息创新研究院 Image processing method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102879094A (en) * 2012-09-28 2013-01-16 北京航空航天大学 Impact analysis method of imaging spectrometer radiometric calibration precision on data quality
CN108132220A (en) * 2017-12-25 2018-06-08 中国林业科学研究院资源信息研究所 The BRDF normalization methods of the airborne push-broom type Hyperspectral imaging in forest zone
CN109556715A (en) * 2018-11-19 2019-04-02 中国国土资源航空物探遥感中心 A kind of more air strips image Radiometric Correction Methods of Airborne Hyperspectral
CN109974854A (en) * 2019-03-18 2019-07-05 石河子大学 A kind of radiation correction method of frame width formula FPI high spectrum image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11010639B2 (en) * 2018-02-19 2021-05-18 Raytheon Company In-scene multi-angle surface-specific signature generation and exploitation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102879094A (en) * 2012-09-28 2013-01-16 北京航空航天大学 Impact analysis method of imaging spectrometer radiometric calibration precision on data quality
CN108132220A (en) * 2017-12-25 2018-06-08 中国林业科学研究院资源信息研究所 The BRDF normalization methods of the airborne push-broom type Hyperspectral imaging in forest zone
CN109556715A (en) * 2018-11-19 2019-04-02 中国国土资源航空物探遥感中心 A kind of more air strips image Radiometric Correction Methods of Airborne Hyperspectral
CN109974854A (en) * 2019-03-18 2019-07-05 石河子大学 A kind of radiation correction method of frame width formula FPI high spectrum image

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
《Seamless Mosaicking of Multi-strip Airborne Hyperspectral Images Based on Hapke Model》;Yu J. 等;《 PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON SENSING AND IMAGING.Springer》;20180919;第285-292页 *
机载多光谱数据的亮度梯度效应;陈健 等;《遥感信息》;20180630;第33卷(第3期);第1-6页 *

Also Published As

Publication number Publication date
CN110702228A (en) 2020-01-17

Similar Documents

Publication Publication Date Title
CN102901516B (en) A kind of multispectral image radiation correction method based on absolute radiometric calibration
CN112051222A (en) River and lake water quality monitoring method based on high-resolution satellite image
KR101702187B1 (en) Device and method for calibration of high resolution electro optical satellite
CN103438900B (en) The collaborative absolute radiation calibration of three line scanner camera image and correction method
CN109883957B (en) MODIS image-based apparent reflectivity model construction method, system and calibration method
CN112798013B (en) Method for verifying on-orbit absolute radiometric calibration result of optical load
CN109974854B (en) Radiation correction method for frame-type FPI (field programmable Gate array) hyperspectral image
CN111415309A (en) High-resolution remote sensing image atmospheric correction method based on minimum reflectivity method
CN110987821A (en) Hyperspectral rapid atmospheric correction parameterization method
CN114279567B (en) On-orbit absolute radiation calibration method for micro-nano hyperspectral satellite constellation
CN110702228B (en) Edge radiation correction method for aviation hyperspectral image
CN104318550A (en) Eight-channel multi-spectral imaging data processing method
CN106017678A (en) Thermal infrared high spectral remote sensing data on-track spectral calibration method
CN107656289A (en) Spaceborne optics load absolute radiation calibration method and system based on ground spoke brightness
CN108120510A (en) A kind of in-orbit absolute radiation calibration method of optical sensor based on reflection mirror array
CN114219994A (en) Ocean optical satellite radiometric calibration method based on air-sea cooperative observation
CN115187481A (en) Airborne push-broom hyperspectral image radiation disturbance correction method
KR101010265B1 (en) Method for Calibration of COMS using Desert and Ocean
Markelin et al. Radiometric calibration and characterization of large-format digital photogrammetric sensors in a test field
CN112665829A (en) Inter-band calibration method for optical remote sensing satellite
CN109900361B (en) Atmospheric radiation correction method suitable for aviation hyperspectral image
CN113029977B (en) Automatic cross radiometric calibration method for wide-field-angle multispectral sensor
Kuusk et al. Measured spectral bidirectional reflection properties of three mature hemiboreal forests
KR20100028337A (en) Method for calibration of coms throught cloud using cloud target
CN111735538B (en) Airborne area array staring type hyperspectral image illumination correction method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant