CN115452167A - Satellite remote sensor cross calibration method and device based on invariant pixels
- Publication number: CN115452167A
- Application number: CN202211028600.XA
- Authority: CN (China)
- Prior art keywords: remote sensor, invariant, pixel, image pair, calibrated
- Legal status: Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/90—Testing, inspecting or checking operation of radiation pyrometers
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/23—Testing, monitoring, correcting or calibrating of receiver elements
Abstract
The invention provides a satellite remote sensor cross calibration method and device based on invariant pixels. The method comprises the following steps: determining a sequence of image pairs; obtaining the apparent reflectivity of each pixel in every image pair and inputting it into an invariant pixel detection model to obtain the invariant pixels of each image pair; obtaining the spectral matching factors of the remote sensor to be calibrated and the reference remote sensor for spectral matching, and determining the spectrally corrected apparent reflectivity of the reference remote sensor for the invariant pixels in each image pair; and performing orthogonal regression between the spectrally corrected apparent reflectivity of the reference remote sensor and the corresponding apparent reflectivity of the remote sensor to be calibrated for the invariant pixels in each image pair to determine the cross calibration coefficients. The invention automatically detects invariant pixel targets in a scene, calculates spectral matching factors to correct the spectral response difference between the two sensors, establishes a linear fit of the apparent reflectivity of the invariant pixel samples, and obtains the cross calibration results and the corresponding long time series of cross calibration coefficients.
Description
Technical Field
The invention relates to the technical field of remote sensor calibration, in particular to a satellite remote sensor cross calibration method and device based on an invariant pixel.
Background
With the rapid development of informatization and globalization and the continuous progress of space remote sensing technology, remote sensing applications have gradually penetrated every field of human activity, and accurate calibration of remote sensors is an important precondition for quantitative remote sensing applications.
There are several radiometric calibration methods for remote sensors. Absolute radiometric calibration requires the absolute radiometric value of an observation target to be obtained in advance; it is difficult to implement and hard to apply to the recalibration of historical satellite data. On-orbit calibration relies on onboard calibration equipment; many satellites either lack such equipment or are limited by its level of development, so the calibration accuracy is low or on-orbit calibration cannot be carried out at all.
Cross calibration is an effective and practical on-orbit vicarious calibration method: taking a sensor with high absolute calibration accuracy as the reference, the sensor to be calibrated is cross-calibrated by having both sensors observe the same target simultaneously. However, existing cross calibration methods suffer from difficult selection of invariant targets, a single type of surface characteristic of the observed target, a small dynamic range of reflectivity, and a low calibration frequency.
Disclosure of Invention
The invention provides a satellite remote sensor cross calibration method and device based on invariant pixels, which are intended to overcome the defects of the prior art in remote sensor cross calibration: invariant targets are difficult to select, the dynamic range of the target reflectivity is small, and the calibration frequency is low. The invention can effectively increase the cross calibration frequency between sensors without manually selecting invariant targets, while the large number of invariant pixels effectively widens the covered reflectivity range, making the method applicable to large-scene data.
The invention provides a satellite remote sensor cross calibration method based on an invariant pixel, which comprises the following steps:
determining a sequence of image pairs, wherein each image pair comprises multispectral images obtained by respectively observing the same observation scene by a remote sensor to be calibrated and a reference remote sensor at the same time on the same day;
acquiring the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into an invariant pixel detection model to obtain an invariant pixel in each image pair;
acquiring spectrum matching factors of a remote sensor to be calibrated and a reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and performing orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated to determine a cross calibration coefficient.
According to the satellite remote sensor cross calibration method based on the invariant pixel provided by the invention, determining the sequence of image pairs comprises:
acquiring multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene during simultaneous satellite overpasses at the same time on the same day;
preprocessing the multispectral image acquired by the remote sensor to be calibrated and the multispectral image acquired by the reference remote sensor to obtain a target image pair; and
determining the sequence of image pairs based on the target image pairs acquired over multiple days.
According to the satellite remote sensor cross calibration method based on the invariant pixels, the preprocessing comprises resolution resampling, rasterization and invalid pixel elimination.
According to the satellite remote sensor cross calibration method based on the invariant pixel provided by the invention, the elimination of invalid pixels comprises:
eliminating cloud-contaminated pixels, water body target pixels, and pixels with a satellite observation zenith angle greater than or equal to 30 degrees.
According to the satellite remote sensor cross calibration method based on the invariant pixel provided by the invention, the invariant pixel detection model is established based on the iteratively reweighted multivariate alteration detection (IR-MAD) method.
According to the satellite remote sensor cross calibration method based on the invariant pixel, provided by the invention, the method for obtaining the apparent reflectivity of a single pixel in each image pair, inputting the apparent reflectivity into an invariant pixel detection model and outputting the invariant pixel of each image pair comprises the following steps:
acquiring a detection result of a single pixel in each image pair and calculating the apparent reflectivity of the pixel;
inputting the apparent reflectivity of the pixel of the single pixel in each image pair into an invariant pixel detection model, constructing an MAD variable and constructing an observation value of the single pixel in each image pair based on the MAD variable;
determining invariant pixels of each image pair based on observations of individual pixels in each image pair and an invariant probability decision threshold.
According to the satellite remote sensor cross calibration method based on the invariant pixel, the spectrum matching factor of the remote sensor to be calibrated and the spectrum matching factor of the reference remote sensor are obtained, and the method comprises the following steps:
acquiring the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor;
and determining the spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor based on the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor.
According to the satellite remote sensor cross calibration method based on the invariant pixel provided by the invention, performing orthogonal regression between the spectrally corrected apparent reflectivity of the reference remote sensor and the apparent reflectivity of the remote sensor to be calibrated for the invariant pixels of each image pair to determine the cross calibration coefficients comprises:
establishing a linear fitting relationship between the spectrally corrected apparent reflectivity of the reference remote sensor and the apparent reflectivity of the remote sensor to be calibrated for the invariant pixels of each image pair and performing orthogonal regression; and
determining the cross calibration coefficients based on the slope and intercept of the orthogonal regression.
According to the satellite remote sensor cross calibration method based on the invariant pixel provided by the invention, after the cross calibration coefficients are determined, the method further comprises:
performing long time series cross calibration on each channel of the long time series data of the remote sensor to be calibrated; and
determining the long time series cross calibration coefficients based on the long time series cross calibration results.
The invention also provides a satellite remote sensor cross calibration device based on the invariant pixel, which comprises the following components:
the acquisition module is used for determining an image pair sequence, and each image pair comprises multispectral images obtained by respectively observing the same observation scene by the remote sensor to be calibrated and the reference remote sensor at the same time on the same day;
the invariant pixel detection module is used for acquiring the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into the invariant pixel detection model to acquire the invariant pixels in each image pair;
the spectrum matching module is used for acquiring spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and the regression module is used for performing orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel element in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated to determine a cross calibration coefficient.
Unlike traditional cross calibration methods, the invariant-pixel-based satellite remote sensor cross calibration method and device of the invention do not depend on a manually selected invariant calibration site under the satellite overpass and are not limited by the demanding conditions of synchronous ground observation, so the cross calibration frequency between sensors can be effectively increased. Invariant pixel targets are detected automatically, which is beneficial for processing long time series data and for recalibrating historical satellite data. The detected invariant pixel targets are spatially discontinuous; although the surface spatial characteristics of the targets cannot be fully analyzed, the large number of invariant pixel samples effectively widens the covered reflectivity range, makes the method applicable to large-scene data, and overcomes the problems of the small reflectivity range and single sample type of the target sites observed by traditional cross calibration methods. The calibration results can be used in research on quantitative remote sensing retrieval products and for long-term monitoring of the radiometric response characteristics of remote sensors.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for cross-calibrating a satellite remote sensor based on an invariant pixel according to an embodiment of the present invention;
FIG. 2 is a second schematic flowchart of a cross calibration method for a satellite remote sensor based on invariant pixels according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of FY-3B/VIRR, MERSI channel spectral response functions, and SCIAMACHY hyperspectral samples provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of an orthogonal regression result of the apparent reflectivity of an invariant pixel provided in an embodiment of the present invention;
FIG. 5 is a schematic diagram of a long time sequence of relative calibration slopes of VIRR provided by an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a satellite remote sensor cross calibration device based on an invariant pixel provided in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
Existing cross calibration methods fall mainly into two categories. The first is the SNO (simultaneous nadir overpass) cross calibration method, whose main steps are:
1) Acquiring the geographical positions and observation times of the intersection points of the two satellite orbits through orbit prediction;
2) Based on the time and geographical position of the predicted orbit intersections, performing pixel matching on the observation data to select data for cross calibration, including matching of time, observation geometry and spatial position;
3) Performing spectral matching according to the spectral response difference of the similar channels of the two remote sensors;
4) Calculating the calibration coefficients by regression analysis.
The other method is a site cross calibration method, which mainly comprises the following steps:
1) Selecting a proper invariant field as an observation target;
2) Acquiring invariant site observation data matched with observation conditions of two remote sensors and projecting the invariant site observation data to the same geographical grid;
3) Acquiring the simulated apparent reflectivity or apparent radiance of the reference sensor using a radiative transfer model and spectral matching;
4) Calculating the calibration coefficients by regression analysis.
From these two existing cross calibration methods, three problems of cross calibration can be identified:
Invariant target selection is difficult. SNO cross calibration must use an orbit prediction model to find the orbital intersection points of the two satellites, and usable cross calibration data can only be obtained within these limited intersections through strict pixel matching; site cross calibration likewise requires an invariant calibration site to be selected manually, which consumes manpower and material resources.
The observed target has a single type of surface characteristic and a small dynamic range of reflectivity. For example, when an invariant desert site is used for site cross calibration, the visible-band reflectivity of the desert site is only about 0.2-0.3; however, the radiometric performance of a remote sensor is somewhat target dependent, i.e., the calibration coefficient changes with the dynamics of the target radiation, so a large number of target samples covering as much of the dynamic range of the remote sensor as possible is required to obtain a high-accuracy calibration result.
The calibration frequency is low. For both SNO cross calibration and site cross calibration, the difficulty of manually selecting invariant targets leads to a low cross calibration frequency, so continuous radiometric calibration cannot be carried out and the goal of long-term monitoring of the radiometric performance of the remote sensor cannot be achieved.
In view of these problems, an embodiment of the invention provides a satellite remote sensor cross calibration method based on invariant pixels, which realizes cross calibration of remote sensors and overcomes the difficulty of selecting invariant targets, the small dynamic range of target reflectivity and the low calibration frequency encountered in remote sensor cross calibration.
The method for cross-calibrating a satellite remote sensor based on invariant pixels according to the present invention is described below with reference to fig. 1 to 5, and as shown in fig. 1, the method at least includes the following steps:
101, determining a sequence of image pairs, wherein each image pair comprises multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene at the same time on the same day;
102, acquiring the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into an invariant pixel detection model to obtain the invariant pixels in each image pair;
103, acquiring spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and 104, performing orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated to determine a cross calibration coefficient.
In step 101, it should be noted that the image pair sequence comprises a plurality of image pairs, each consisting of simultaneous multispectral images of the same target scene collected by the two remote sensors during a simultaneous satellite overpass. Two image pairs in the sequence may be two sets of multispectral images acquired at the same time of day for the same observation scene on two adjacent days, two sets acquired at different times for the same observation scene on the same day, or two sets with different dates, acquisition times and observation scenes. Building the image pair sequence requires satellite data acquired over a number of consecutive days, with the continuous acquisition period lying within the orbit regression cycle of the satellite.
In addition, each multispectral image contains satellite data in a plurality of channel bands. When an image pair is selected, the two multispectral images of the same target scene at the same time phase must be channel matched so that channels in similar bands correspond to each other one by one.
For step 102, it should be noted that the detection result of a single pixel in each image pair is generally the radiometric measurement of the remote sensor, i.e., the digital number (DN) count. The apparent reflectivity of each pixel can be calculated from this detection result. The invariant pixel detection model is a pre-established model that screens invariant pixels out of an input image pair according to the apparent reflectivity. For each single-day image pair the model outputs the invariant pixels of that pair; after the image pair sequence has been fed into the model pair by pair, the sequence of invariant pixels corresponding to the different image pairs is obtained.
For step 103, it should be noted that there is a spectral response difference between the corresponding channels of the remote sensor to be calibrated and the reference remote sensor, i.e., for the same entrance pupil radiation the two remote sensors produce different measurements. Spectral matching therefore needs to be performed on the corresponding channels to correct the calibration error caused by the spectral response difference.
In this embodiment, obtaining the spectral matching factors of the remote sensor to be calibrated and the reference remote sensor requires hyperspectral instrument observations as hyperspectral samples; these are convolved with the channel spectral response functions of the remote sensor to be calibrated and of the reference remote sensor to obtain the entrance pupil radiance of each sensor, and the factors are then obtained by establishing a relational expression between the entrance pupil radiances of the matched channels of the two sensors.
For step 104, it should be noted that, for any single image pair, the invariant pixels obtained from the invariant pixel detection model could be processed through steps 103 and 104 to obtain cross calibration coefficients. However, because the scene used for the statistical analysis is large, the result from a single-day image pair contains only the part of the scene that satisfies the observation zenith angle constraint. To include as many invariant pixel samples as possible in a single regression, this embodiment collects the image pair sequence, merges the invariant pixels detected over several consecutive days, and fits them together. This merging is justified by the short-term stability of the radiometric response of the remote sensor; it helps to widen the reflectivity range of the pixel samples and effectively improves the regression quality and the calibration accuracy.
The invariant-pixel-based satellite remote sensor cross calibration method of the embodiment of the invention provides an effective solution to the problems of existing cross calibration methods regarding invariant target selection, the limited dynamic range of the observed target reflectivity, and continuous calibration monitoring of long time series remote sensor data. Unlike traditional cross calibration methods, it does not depend on a manually selected invariant calibration site under the satellite overpass and is not limited by the demanding conditions of synchronous ground observation, so the cross calibration frequency between sensors can be effectively increased. Invariant pixel targets are detected automatically, which is beneficial for processing long time series data and for recalibrating historical satellite data. The detected invariant pixel targets are spatially discontinuous; although the surface spatial characteristics of the targets cannot be fully analyzed, the large number of invariant pixel samples effectively widens the covered reflectivity range, makes the method applicable to large-scene data, and overcomes the problems of the small reflectivity range and single sample type of the target sites observed by traditional cross calibration methods. The calibration results can be used in research on quantitative remote sensing retrieval products and for long-term monitoring of the radiometric response characteristics of remote sensors.
It will be appreciated that determining the sequence of image pairs comprises:
acquiring multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene during simultaneous satellite overpasses at the same time on the same day;
preprocessing the multispectral image acquired by the remote sensor to be calibrated and the multispectral image acquired by the reference remote sensor to obtain a target image pair; and
determining the sequence of image pairs based on the target image pairs acquired over multiple days.
It should be noted that, because the spatial resolutions of different remote sensors and of different channels differ, the data dimensions are inconsistent and subsequent statistical analysis cannot be carried out directly. The multispectral images acquired by the remote sensor to be calibrated and by the reference remote sensor therefore need to be preprocessed, and the preprocessed image pair is the target image pair. Target image pairs over multiple consecutive days are collected to form the image pair sequence.
It will be appreciated that the preprocessing comprises resolution resampling, rasterization and culling of invalid pixels.
It should be noted that, during preprocessing, the corresponding channel bands of the two multispectral images in the same image pair must be matched one by one, because subsequent processing operates on the image pair within the same channel band, and the DN value of each multispectral image of each image pair is obtained for each channel band. In the embodiment of the invention the satellite data of each image pair are resampled to a common resolution, which makes channel band matching possible. The satellite remote sensing data corresponding to the multispectral images are projected onto the same geographic grid, and the rasterized data are assembled into a satellite data set. The geographic grid is a data structure that divides geographic space into regular cells, each of which is assigned the corresponding attribute values representing the satellite remote sensing data.
Moreover, the DN values of the different channels acquired by the two remote sensors contain invalid data. Because the invariant pixel detection model is built on statistical principles, the two sets of input data must have consistent dimensions, so the invalid pixels in the two images of each image pair need to be removed in a corresponding manner.
It is understood that the elimination of invalid pixels comprises:
eliminating cloud-contaminated pixels, water body target pixels, and pixels with a satellite observation zenith angle greater than or equal to 30 degrees.
It should be noted that when the invariant pixel detection model is used to detect invariant pixels in a scene, changing pixels are removed gradually during the algorithm iterations. However, to keep the algorithm efficient, obviously unstable pixels in the scene, i.e., cloud-contaminated pixels covering targets such as cloud and dust, are removed before the algorithm of the invariant pixel detection model is run. Although water bodies and oceans have stable reflectivity characteristics, their reflectance spectra differ greatly from those of land and the signals of some channel bands are weak, so water body target pixels are removed using the land/sea mask data contained in the data.
In addition, even after the invalid points and the cloud, dust and ocean water pixels have been removed from the image pair sequence, millions of pixel samples remain, many of which have an excessively large satellite observation zenith angle. An excessively large observation zenith angle degrades the spatial resolution of the pixel and the radiometric accuracy of the detection, so to improve the accuracy with which the invariant pixel detection model identifies invariant pixels, only pixel samples with a satellite observation zenith angle smaller than 30 degrees are used in the subsequent statistical analysis.
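As an illustration only (not part of the claimed method), the screening described above can be sketched with numpy, assuming the resampled DN stacks of the two sensors and the cloud, land/sea and zenith-angle fields are already co-registered on the same geographic grid; all names are illustrative:

```python
import numpy as np

def select_valid_pixels(dn_cal, dn_ref, cloud_mask, sea_mask, sat_zenith,
                        max_zenith_deg=30.0):
    """Keep pixels that are valid in every channel of both images, cloud-free,
    over land, and observed at a satellite zenith angle below the limit."""
    # dn_cal, dn_ref: (n_channels, rows, cols) DN stacks on the same geographic grid
    valid = np.isfinite(dn_cal).all(axis=0) & np.isfinite(dn_ref).all(axis=0)
    valid &= ~cloud_mask                  # drop cloud / dust contaminated pixels
    valid &= ~sea_mask                    # drop water-body targets via the land/sea mask
    valid &= sat_zenith < max_zenith_deg  # drop large observation zenith angles
    # Flatten to (n_channels, n_valid_pixels) so both inputs keep consistent dimensions
    return dn_cal[:, valid], dn_ref[:, valid]
```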
It can be understood that the invariant pixel detection model is established based on the iteratively reweighted multivariate alteration detection (IR-MAD) method.
It should be noted that, with the progress of computer and information technology and the wide application of statistical analysis methods in scientific research, remote sensing researchers have gradually applied mathematical methods to the processing and analysis of satellite data. In the prior art, image differencing and band ratioing are suitable for analyzing single-channel images but not for multi-channel satellite remote sensor data. Principal component analysis (PCA) can integrate the change information of the individual channels but cannot eliminate the correlation between the channels of different remote sensors. Multivariate alteration detection (MAD), which is invariant under linear scaling, can eliminate both the correlation within the channels of a sensor at the same time phase and the correlation between different channels of the two sensors. On this basis, the embodiment of the invention uses the IR-MAD method to construct an invariant pixel detection model for scenes observed simultaneously by different remote sensors and applies it to remote sensor cross calibration.
It can be understood that, in the embodiment of the invention, the invariant pixel detection model is constructed by the IR-MAD method to obtain the invariant pixels of each image pair. Two n-channel multispectral images of the same scene collected at the same time by the remote sensor to be calibrated and the reference remote sensor are analyzed and processed with the invariant pixel detection model, which comprises at least the following steps:
step 201, acquiring a detection result of a single pixel in each image pair and calculating the apparent reflectivity of the pixel;
step 202, inputting the apparent reflectivity of the pixel of a single pixel in each image pair into an invariant pixel detection model, constructing an MAD variable, and constructing an observed value of the single pixel in each image pair based on the MAD variable;
and step 203, determining the invariant pixel of each image pair based on the observation value and the invariant probability decision threshold of the single pixel in each image pair.
In step 201, the apparent reflectivity (TOA reflectance) is the reflectance at the top of the atmosphere; its value combines the contributions of the surface reflectance and of the atmosphere. The apparent reflectivity of a pixel is calculated from the remote sensor channel DN value using the fixed calibration coefficients of the initial satellite operation period, i.e., the fixed calibration slope slope_i and intercept bias_i of the i-th channel, the DN value DN_i obtained by that channel, the Sun-Earth distance correction factor d², and the solar zenith angle θ_s.
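The conversion can be sketched as below; this is a minimal illustration assuming the conventional form in which the fixed launch coefficients convert DN to a reflectance factor that is then normalised by d² and cos θ_s (the exact formula is not reproduced in this text, and all names are illustrative):

```python
import numpy as np

def toa_reflectance(dn, slope, bias, d2, sun_zenith_deg):
    """Apparent (TOA) reflectance from DN counts using the fixed launch
    calibration slope/intercept; d2 is the Sun-Earth distance correction."""
    return (slope * dn + bias) * d2 / np.cos(np.deg2rad(sun_zenith_deg))
```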
In step 202, it should be noted that, after the pixel apparent reflectivity of each pixel in an image pair has been input into the invariant pixel detection model, the MAD variables must first be constructed. The construction of the MAD variables comprises the following steps:
Step 2021, representing the apparent reflectivities converted from the DN values of the two multispectral images of an image pair by the vectors F = (F_1 … F_n)^T and G = (G_1 … G_n)^T, respectively, and combining all spectral bands linearly to obtain the canonical variables U and V, as shown in formula 1:

U = a^T F, V = b^T G    (formula 1)

wherein n is the total number of matched channels of the current image pair, and a and b are constant coefficient vectors, which can be obtained by solving the coupled generalized eigenvalue equations of formula 2:

Σ_fg Σ_gg^(-1) Σ_gf a = ρ² Σ_ff a, Σ_gf Σ_ff^(-1) Σ_fg b = ρ² Σ_gg b    (formula 2)

wherein ρ is the correlation coefficient of the canonical variables U and V, and Σ_ff, Σ_gg, Σ_fg, Σ_gf are the covariance matrices of the image vectors F and G.
Step 2022, obtaining the MAD variables from the differences of the canonical variables U and V, as shown in formula 3:

MAD_i = U_i - V_i    (formula 3)

wherein MAD_i denotes the MAD variable of channel i of the remote sensor.
Step 2023, obtaining an observation value Z based on the linear scale invariance of the MAD variables, as shown in formula 4:

Z = Σ_{i=1}^{n} (MAD_i / σ_MAD_i)²    (formula 4)

wherein σ_MAD_i is the standard deviation of the MAD variable of channel i. Z, the sum of squares of the standardized MAD variables, approximately follows a chi-square (χ²) distribution with n degrees of freedom; the smaller the value of Z, the higher the probability that the pixel is invariant.
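The MAD construction of formulas 1-4 can be implemented compactly with numpy/scipy as sketched below; the function accepts per-pixel weights so that it can be reused inside the IR-MAD iteration, and the names and normalisation details are illustrative assumptions rather than the patent's reference implementation:

```python
import numpy as np
from scipy.linalg import eigh

def mad_statistic(F, G, weights=None):
    """Canonical variates U, V, MAD variables and the Z statistic for two
    co-registered n-channel reflectance samples of shape (n_channels, n_pixels)."""
    n, npix = F.shape
    w = np.ones(npix) if weights is None else weights
    w = w / w.sum()
    Fc = F - np.average(F, axis=1, weights=w)[:, None]
    Gc = G - np.average(G, axis=1, weights=w)[:, None]
    Sff, Sgg, Sfg = (Fc * w) @ Fc.T, (Gc * w) @ Gc.T, (Fc * w) @ Gc.T
    # Coupled generalized eigenvalue problem of canonical correlation analysis (formula 2)
    rho2, A = eigh(Sfg @ np.linalg.solve(Sgg, Sfg.T), Sff)
    B = np.linalg.solve(Sgg, Sfg.T @ A)
    A /= np.sqrt(np.sum(A * (Sff @ A), axis=0))        # normalise so var(U_i) = 1
    B /= np.sqrt(np.sum(B * (Sgg @ B), axis=0))        # normalise so var(V_i) = 1
    U, V = A.T @ Fc, B.T @ Gc                          # canonical variables (formula 1)
    V *= np.sign(np.sum(U * V * w, axis=1))[:, None]   # make corr(U_i, V_i) positive
    rho = np.sqrt(np.clip(rho2, 0.0, 1.0))
    mad = U - V                                        # MAD variables (formula 3)
    var_mad = np.maximum(2.0 * (1.0 - rho), 1e-12)
    Z = np.sum(mad ** 2 / var_mad[:, None], axis=0)    # chi-square-like statistic (formula 4)
    return mad, Z, rho
```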
In step 203, it should be noted that, for the two n-channel multispectral images of the same scene acquired simultaneously by the two remote sensors in any image pair, the difference between the two images is ideally caused only by the instrument response and by random effects such as noise and atmospheric fluctuation; from the viewpoint of the central limit theorem, the MAD variables would then follow an ideal normal distribution. In practice, however, the MAD variables associated with change observations deviate more or less from the multivariate normal distribution, and in the presence of change the sensitivity of the MAD transformation is improved by successive iterations that establish an increasingly reliable no-change background.
Because a single MAD transformation rarely achieves the ideal invariant pixel detection result, when the sample means and the covariance matrices are estimated, the sample data are weighted with the invariant probability determined in the previous iteration: pixels with a higher invariant probability receive larger weights, the MAD variables of the next iteration are determined by canonical correlation analysis, and a better invariant pixel detection result is obtained after several iterations.
Specifically, step 203 comprises the following substeps:
Step 2031, calculating an invariant probability weight based on the observation value.
It should be noted that in each iteration the invariant probability weight Pr can be determined from the chi-square test of the observation value Z, as shown in formula 5:

Pr = 1 - P(χ²_n ≤ Z)    (formula 5)
for the IR-MAD algorithm, three iteration stop thresholds, namely a typical variable correlation coefficient change threshold, a maximum iteration number and a minimum NCPs number, need to be set, so that the iteration processing of a certain image pair is stopped when the algorithm is optimized and converged and the small probability detection fails, and the automatic operation of the algorithm is ensured. When the difference value of the change of the correlation coefficient of the representative variables of the two times is smaller than the threshold value of the change of the correlation coefficient of the representative variables, the algorithm is considered to be converged and iteration is stopped; meanwhile, in order to ensure that enough samples of the invariant pixels are used for subsequent cross-scaling regression analysis, a minimum threshold value of the number of the NCPs needs to be set, and when the number of the NCPs after a certain iteration is smaller than the value, the iteration is stopped.
Step 2032, setting an invariant probability decision threshold, and determining invariant pixels of each image pair based on the observed value of a single pixel in each image pair and the invariant probability decision threshold.
It should be noted that, because this embodiment seeks pixels with a high probability of being invariant, once the number n of matched channel groups is determined, several confidence levels of the invariant probability are tested and analyzed. Weighing the computational efficiency of the algorithm against the invariant pixel detection quality, a suitable invariant probability decision threshold t is selected; when the observation value Z of a pixel sample is smaller than t, the confidence that the sample is an invariant pixel is considered high enough for it to be used in the subsequent cross calibration analysis.
Therefore, in this embodiment, the pixel samples satisfying the condition of formula 6 are selected as no-change pixels (NCPs):

Z ≤ t    (formula 6)

wherein t is the p-quantile of the chi-square distribution with n degrees of freedom, i.e., the invariant pixel probability decision threshold, and p is the probability that the chi-square test value is less than or equal to t.
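The threshold of formula 6 can be taken directly from the chi-square quantile function, as in this small sketch (names illustrative):

```python
from scipy.stats import chi2

def ncp_threshold(n_channels, confidence=0.95):
    """Formula 6 threshold: keep pixels with Z <= t, where t is the lower
    (1 - confidence) quantile of the chi-square distribution with n degrees of
    freedom.  For six matched channel pairs, chi2.ppf(0.05, 6) is about 1.635,
    the value used later in the embodiment."""
    return chi2.ppf(1.0 - confidence, df=n_channels)
```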
In addition, the NCPs are selected purely by data statistics, without prior knowledge of the surface; their spatial locations correspond to features that do not change between the two images of a pair, but these locations vary from image pair to image pair with differences in the radiometric information.
It can be understood that obtaining the spectral matching factors of the remote sensor to be calibrated and the reference remote sensor comprises:
obtaining entrance pupil radiance of a remote sensor to be calibrated and entrance pupil radiance of a reference remote sensor;
and determining the spectral matching factors of the remote sensor to be calibrated and the reference remote sensor based on the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor.
It should be noted that, in the embodiment of the invention, hyperspectral instrument observations are used as hyperspectral samples and are convolved with the channel spectral response functions of the remote sensor to be calibrated and of the reference remote sensor, respectively, to obtain the entrance pupil radiance of the corresponding remote sensor, as shown in formula 7:

R_i_sensor = ∫ R_h(λ) f_i_sensor(λ) dλ / ∫ f_i_sensor(λ) dλ    (formula 7)

wherein R_h is the radiance of the hyperspectral sample, f_i_sensor is the spectral response function of channel i of the remote sensor to be calibrated or of the reference remote sensor, and R_i_sensor is the convolved entrance pupil radiance of that remote sensor.
Under the same surface, atmospheric and observation geometry conditions, the ratio of the incident radiation of the matched channels of the remote sensor to be calibrated and of the reference remote sensor is called the spectral matching factor (SBAF). A relational expression is established between the entrance pupil radiances of the matched channels of the remote sensor to be calibrated and the reference remote sensor, as shown in formula 8, giving the SBAF coefficients of the corresponding channels of the two remote sensors:

R_i_CAL = A_i,j × R_j_REF + B_i,j    (formula 8)

wherein R denotes radiance, A and B are the spectral matching factors, and i and j are the serial numbers of the matched channels of the remote sensor to be calibrated (CAL) and the reference remote sensor (REF), respectively, so that A_i,j and B_i,j denote the spectral matching factors between channel i of the remote sensor to be calibrated and channel j of the reference remote sensor. The SBAF of each matched channel is calculated by least-squares regression fitting.
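The least-squares fit of formula 8 over the set of hyperspectral samples can be sketched as follows; the second helper, for the case where one channel of the sensor to be calibrated is matched against two reference channels, is an illustrative generalisation and not taken from the patent text:

```python
import numpy as np

def fit_sbaf(r_cal, r_ref):
    """Least-squares fit of A, B in r_cal = A * r_ref + B over hyperspectral samples."""
    A, B = np.polyfit(r_ref, r_cal, deg=1)
    return A, B

def fit_sbaf_multi(r_cal, r_ref_channels):
    """Variant for a channel matched against several reference channels:
    r_cal = sum_k A_k * r_ref_k + B, solved by ordinary linear least squares."""
    X = np.column_stack(list(r_ref_channels) + [np.ones(len(r_cal))])
    coeffs, *_ = np.linalg.lstsq(X, r_cal, rcond=None)
    return coeffs[:-1], coeffs[-1]        # (A coefficients, B)
```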
It will be appreciated that performing orthogonal regression between the spectrally corrected apparent reflectivity of the reference remote sensor and the apparent reflectivity of the remote sensor to be calibrated for the invariant pixels of each image pair to determine the cross calibration coefficients comprises:
establishing a linear fitting relationship between the spectrally corrected apparent reflectivity of the reference remote sensor and the apparent reflectivity of the remote sensor to be calibrated for the invariant pixels of each image pair and performing orthogonal regression; and
determining the cross calibration coefficients based on the slope and intercept of the orthogonal regression.
It should be noted that, for the invariant pixels obtained by IR-MAD detection, the apparent reflectivity ρ_j_REF detected in channel j of the reference remote sensor is converted with the spectral matching factors to obtain the spectrally corrected apparent reflectivity ρ_i_REF under the spectral response of channel i, as shown in formula 9:

ρ_i_REF = A_i,j × ρ_j_REF + B_i,j    (formula 9)

For the invariant pixels of any image pair obtained by IR-MAD detection, a linear fitting relationship is established between the spectrally corrected apparent reflectivity ρ_i_REF of channel i obtained by spectral matching and the apparent reflectivity ρ_i_CAL detected by the remote sensor to be calibrated, and orthogonal regression is performed as shown in formula 10 to obtain the cross calibration coefficients:

ρ_i_REF = a × ρ_i_CAL + b    (formula 10)

wherein a is the calibration slope and b is the calibration intercept. Because ρ_i_CAL is the apparent reflectivity calculated with the fixed calibration slope of the initial launch period, the calibration coefficients a and b are relative calibration results with respect to the fixed launch calibration coefficients.
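Unlike ordinary least squares, the orthogonal regression of formula 10 treats both reflectance series as noisy; one compact way to compute it is through the principal axis of the two-dimensional point cloud, as in the illustrative sketch below (equal error variances assumed for the two sensors):

```python
import numpy as np

def orthogonal_regression(rho_cal, rho_ref):
    """Total least squares fit rho_ref = a * rho_cal + b minimising the
    perpendicular distances of the invariant-pixel points to the line."""
    x, y = np.asarray(rho_cal, float), np.asarray(rho_ref, float)
    cov = np.cov(x, y)
    _, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    vx, vy = eigvecs[:, -1]                # principal axis of the point cloud
    a = vy / vx
    b = y.mean() - a * x.mean()
    return a, b                            # calibration slope and intercept
```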
It is understood that, after the cross calibration coefficients have been determined, the method further comprises:
performing long time series cross calibration on each channel of the long time series data of the remote sensor to be calibrated; and
determining the long time series cross calibration coefficients based on the long time series cross calibration results.
It should be noted that long time series cross calibration is carried out on each channel of the remote sensor to be calibrated; it is used to monitor the long-term radiometric response of the remote sensor channels and to determine the trend of the long time series cross calibration coefficients.
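Putting the pieces together for the long time series, a plausible driver loop (using the sketches above as assumed helpers, with the multi-day merging discussed earlier already applied to the inputs) could look like this:

```python
def long_term_slopes(windowed_samples):
    """Per-channel time series of relative calibration slopes.  Each element of
    `windowed_samples` is (date, rho_cal, rho_ref_corr): invariant-pixel apparent
    reflectances of one multi-day window, keyed by channel, with the reference
    already spectrally corrected; `orthogonal_regression` is the sketch above."""
    series = {}
    for date, rho_cal, rho_ref_corr in windowed_samples:
        for ch in rho_cal:                                   # iterate channel keys
            a, b = orthogonal_regression(rho_cal[ch], rho_ref_corr[ch])
            series.setdefault(ch, []).append((date, a, b))
    return series
```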
A specific example of the method of the invention is given below:
The Fengyun-3B satellite (FY-3B) is the second satellite of China's second-generation polar-orbiting meteorological satellite series and carries 11 remote sensing instruments, among which the Visible and Infrared Radiometer (VIRR) and the Medium Resolution Spectral Imager (MERSI) are two of the main payloads. FY-3B/VIRR has 10 channel bands in total, covering the spectral range 0.43-12.5 μm with 7 visible/near-infrared bands and 3 infrared emission bands, and the sub-satellite spatial resolution is 1.1 km. VIRR is mainly used to detect and identify cloud information, monitor surface vegetation cover, acquire land and sea surface information, and monitor total atmospheric water vapour. FY-3B/MERSI has 19 reflective solar bands (0.41-2.13 μm) and 1 thermal infrared band (11.25 μm), with sub-satellite spatial resolutions of 250 m and 1000 m. Channels 1, 2, 6, 7, 8 and 9 of VIRR are similar in band setting to channels 3, 4, 6, 1, 10/11 and 2 of MERSI, respectively (channel 8 of VIRR corresponds to channels 10 and 11 of MERSI); the specific band indices are listed in Table 1. MERSI is therefore used as the reference remote sensor and VIRR as the remote sensor to be calibrated, and cross calibration of VIRR is carried out on the 6 groups of corresponding channels.
As shown in fig. 2, in this embodiment, same-time-phase image data pairs of the remote sensor to be calibrated (VIRR) and the reference remote sensor (MERSI) are screened and preprocessed; the matched channel data of the two remote sensors are combined linearly to construct the MAD variables; the IR-MAD invariant pixel detection model analyzes the MAD variables to obtain the invariant pixel samples in the scene; and, taking the detection results of the reference remote sensor as the radiometric reference, the invariant pixels are spectrally matched and corrected and then orthogonally regressed against the detection results of the remote sensor to be calibrated to obtain the cross calibration coefficients.
TABLE 1 FY3B/VIRR, MERSI spectral band index
It can be understood that, for step 101, the observation scene of the selected image pair sequence is the northwest region of China, with central coordinates 91° E, 39° N. The terrain consists mainly of plateau, basin and mountain, with elevations above 1 km; the landscape is dominated by loess plateau, Gobi, desert and desert steppe. The region has a continental climate and, influenced by the Himalaya mountains to the south, is cold and dry in winter and hot in summer with little rainfall. The seasonal variation of the surface reflectivity in the study area is small, and the area meets the criteria of an ideal cross calibration observation region: low aerosol content, low water vapour content and a high probability of clear sky.
As shown in table 1, the resolutions of the VIRR and MERSI channels differ, covering three spatial resolutions of 250 m, 1 km and 1.1 km. The detection data of every channel are therefore resampled to a 1 km spatial resolution and projected onto the same geographic grid; a single scene is 1400 × 3200 pixels, and the time range is from January 21, 2011 to November 14, 2018.
It can be understood that the DN values of the simultaneous multispectral images of VIRR and MERSI acquired in step 101 contain some invalid points in every channel. Because every channel value of each pixel used in the statistical analysis must be valid, pixels containing invalid values are removed.
Because cloud is the main source of obvious change in the observed scene and the L1 data of VIRR and MERSI contain no cloud mask, preliminary cloud identification is performed with a threshold discrimination method using the VIRR channel data. The reflectance ratio of channel 1 (0.58-0.68 μm) to channel 2 (0.84-0.89 μm) of VIRR effectively separates cloudy and clear-sky areas, and the ratio of channel 2 (0.84-0.89 μm) to channel 6 (1.55-1.64 μm) better separates cloud layers from high-reflectivity ground objects. Ocean water body pixels are removed using the land/sea mask (landseamask) data in the VIRR and MERSI products. The field-of-view scan range of both VIRR and MERSI is ±55.4°, and pixels with a satellite observation zenith angle smaller than 30° are selected for the subsequent statistical analysis.
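A rough illustration of this threshold-based pre-screening is given below; the two band ratios follow the description above, but the numerical thresholds are placeholders chosen only for illustration and are not specified in this text:

```python
import numpy as np

def rough_cloud_mask(r_ch1, r_ch2, r_ch6, ratio21=(0.9, 1.1), ratio26_min=1.0):
    """Flag likely cloudy pixels from VIRR reflectances: clouds are spectrally
    flat between ch1 (0.58-0.68 um) and ch2 (0.84-0.89 um) but much brighter in
    ch2 than in ch6 (1.55-1.64 um).  Threshold values are illustrative only."""
    q21 = r_ch2 / np.clip(r_ch1, 1e-6, None)
    q26 = r_ch2 / np.clip(r_ch6, 1e-6, None)
    return (q21 > ratio21[0]) & (q21 < ratio21[1]) & (q26 > ratio26_min)
```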
It can be understood that, for step 102, the pixel apparent reflectivity is calculated from the channel DN values using the fixed calibration coefficients of the initial operation period of the FY-3B satellite, giving same-time-phase apparent reflectivity images of the study scene.
It can be understood that, for step 102, after data screening and preprocessing, the two multispectral images of northwest China collected at the same time by VIRR and MERSI retain 6 groups of matched-channel apparent reflectivity data of the sample pixels for the IR-MAD analysis, i.e., n = 6.
In this embodiment, IR-MAD analysis was run on several groups of image data pairs to test and select suitable iteration stopping parameters. During iterations 0-10 the number of invariant pixels decreases rapidly, the decrease becomes gradually gentler during iterations 10-20, and the algorithm tends to converge after 20-25 iterations, at which point the change of the canonical correlation coefficients between two successive iterations is less than 0.001. The algorithm is therefore considered converged, and iteration is stopped, when the change of the canonical correlation coefficients between two iterations is less than 0.001, and the maximum number of iterations is set to 30. Meanwhile, to ensure that enough invariant pixel samples remain for the subsequent cross calibration regression analysis, the minimum number of NCPs is set to 400, and iteration also stops when the number of NCPs after an iteration falls below this value.
In this embodiment, invariant pixels with an invariant probability of at least 90% are required. With n = 6 degrees of freedom (6 groups of matched channels), invariant probability confidence levels of 90%, 92.5%, 95% and 97.5% were tested and analyzed; weighing the computational efficiency of the algorithm against the invariant pixel detection quality, t = 1.635 is selected as the invariant probability decision threshold. When the observation value Z < t = 1.635, the confidence that the pixel sample is an invariant pixel is higher than 95%, and the sample can be used in the subsequent cross calibration analysis.
It can be understood that, for step 203, a relational expression is established between the entrance pupil radiances of the matched channels of VIRR and MERSI to obtain the SBAF coefficients of the corresponding channels of the two remote sensors, specifically:

R_i_VIRR = A_i,j × R_j_MERSI + B_i,j    (formula 11)

R_8_VIRR = A_8,10 × R_10_MERSI + A_8,11 × R_11_MERSI + B_i,j    (formula 12)

wherein R denotes radiance, A and B are the spectral matching factors, and i and j are the serial numbers of the matched channels of VIRR and MERSI, respectively. Formula 11 is used for a single matched channel pair of VIRR and MERSI, and formula 12 is used for channel 8 of VIRR together with channels 10 and 11 of MERSI. The SBAF of each matched channel is calculated by least-squares regression fitting, and the resulting spectral matching factors of VIRR and MERSI are shown in table 2.
TABLE 2 FY-3B VIRR and MERSI spectral match factor (SBAF)
It can be understood that for step 203, the apparent reflectivity ρ of MERSI j channel is detected for the invariant pixel acquired by IR-MAD detection j_MERSI Converting by using a spectrum matching factor to obtain the spectrum correction apparent reflectivity rho under the i-channel spectral response i_MERSI 。
It can be understood that, for step 204, for the invariant pixel obtained by IR-MAD detection in any image pair, the i-channel spectrum obtained by spectrum matching is corrected for the apparent reflectivity ρ i_MERSI Apparent reflectance detection result ρ from VIRR i_VIRR And establishing a linear fitting relation as shown in a formula 12, thereby obtaining a cross scaling coefficient.
ρ_i_MERSI = a × ρ_i_VIRR + b    (Formula 13)
wherein a is the calibration slope and b is the calibration intercept.
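The orthogonal regression used to estimate a and b treats the two reflectivity series symmetrically. A minimal total-least-squares sketch via the principal direction of the 2 × 2 covariance matrix (illustrative only; the patented method is not tied to this particular formulation):

```python
# Orthogonal (total least squares) regression for the linear fit above:
# the slope a and intercept b minimize perpendicular distances.
import numpy as np

def orthogonal_regression(rho_virr, rho_mersi_corr):
    """Returns (a, b) such that rho_mersi_corr ~ a * rho_virr + b."""
    x = np.asarray(rho_virr, dtype=float)
    y = np.asarray(rho_mersi_corr, dtype=float)
    xm, ym = x.mean(), y.mean()
    cov = np.cov(x - xm, y - ym)              # 2x2 covariance matrix
    eigval, eigvec = np.linalg.eigh(cov)
    vx, vy = eigvec[:, np.argmax(eigval)]     # principal (major-axis) direction
    a = vy / vx
    b = ym - a * xm
    return a, b
```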
It should be noted that the nominal orbital repeat cycle of the Fengyun-3 (FY-3) satellite is 5.5 days, so the invariant pixels detected on five consecutive days are combined for the orthogonal regression analysis. FIG. 4 shows the matching-channel orthogonal regression results for each group of invariant pixels from April 8 to April 12, 2011. The linear fit of the invariant pixels of each channel is good, and the dynamic range of the TOA apparent reflectivity is large.
Fig. 5 shows the long-time-series trend of the relative calibration slope of each VIRR channel from 2011 to November 14, 2018, which is used to monitor the long-term radiometric response of the remote sensor channels. The method is built on mathematical statistics; when a single-day image pair is analyzed, the identification of invariant pixels is affected by cloud cover, water vapor and aerosol content, and extreme weather conditions (such as rainstorm, snowstorm, and sand or dust), which can make the TOA regression of the invariant pixels poor. Therefore, regression quality indicators such as the correlation coefficient and the residuals of each channel's regression result can be used to remove data that do not meet the expected quality from the long time series. For some data pairs only part of the channels are disturbed; because the method eliminates the correlation among the channels of the same sensor at the same time and the correlation among different channels of the different sensors, the calibration analysis of each channel is independent, so the results of the valid channels of such data pairs can be retained. The long-time-series results show that the method of the present invention can achieve almost uninterrupted automatic cross calibration.
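The quality screen described above can be sketched as a simple filter over the daily regression records. The record fields and the thresholds below are illustrative assumptions, not values specified by the invention:

```python
# Illustrative long-time-series quality screen: drop daily regression
# results whose correlation coefficient or residual statistics fall
# outside expected bounds before trend analysis.
R_MIN = 0.95        # assumed minimum acceptable correlation coefficient
RESID_MAX = 0.02    # assumed maximum acceptable RMS residual (reflectance units)

def filter_daily_results(results):
    """results: list of dicts with keys 'date', 'channel', 'slope', 'r', 'rms_resid'."""
    return [rec for rec in results
            if rec["r"] >= R_MIN and rec["rms_resid"] <= RESID_MAX]
```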
The satellite remote sensor cross calibration device based on the invariant pixel provided by the invention is described below, and the satellite remote sensor cross calibration device based on the invariant pixel described below and the satellite remote sensor cross calibration method based on the invariant pixel described above can be referred to correspondingly.
As shown in fig. 6, an embodiment of the present invention discloses a satellite remote sensor cross calibration apparatus based on invariant pixels, including:
the acquisition module 601 is used for determining a sequence of image pairs, wherein each image pair comprises multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene at the same time on the same day;
an invariant pixel detection module 602, configured to obtain an apparent reflectivity of a single pixel in each image pair and input the apparent reflectivity into an invariant pixel detection model, so as to obtain an invariant pixel in each image pair;
the spectrum matching module 603 is used for acquiring spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and the regression module 604 is configured to perform orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated, so as to determine a cross calibration coefficient.
The invariant-pixel-based satellite remote sensor cross calibration device provided by the embodiment of the invention addresses the calibration problem of optical remote sensing imagers and the need to recalibrate historical data by providing a remote sensor cross-calibration approach based on intelligent selection of invariant pixels. Using simultaneous scene images acquired by the two remote sensors, the invariant pixel targets in a scene are detected automatically by the invariant pixel detection model after data screening and preprocessing, a spectrum matching factor is calculated to correct the spectral response difference between the two sensors, and a linear fitting relationship of the apparent reflectivity of the invariant pixel samples is established to obtain the cross-calibration result and its long-time-series calibration trend.
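How the four modules fit together can be illustrated with the following orchestration sketch; the callables passed in (acquisition, invariant pixel detection, spectrum matching, regression) are hypothetical placeholders for the modules 601-604 described above, not the patented implementation:

```python
# Illustrative orchestration of the four modules over a sequence of dates.
def cross_calibrate(dates, acquire_pair, detect_invariant, spectral_match, regress):
    """acquire_pair(date) -> (rho_cal, rho_ref) apparent-reflectivity arrays, or None."""
    coefficients = []
    for date in dates:
        pair = acquire_pair(date)                     # acquisition module 601
        if pair is None:
            continue
        rho_cal, rho_ref = pair
        mask = detect_invariant(rho_cal, rho_ref)     # invariant pixel detection module 602
        rho_ref_corr = spectral_match(rho_ref[mask])  # spectrum matching module 603
        a, b = regress(rho_cal[mask], rho_ref_corr)   # regression module 604
        coefficients.append((date, a, b))
    return coefficients
```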
It is understood that determining the sequence of image pairs in the acquisition module 601 includes:
acquiring multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene at the same time on the same day, when the two satellites pass overhead simultaneously;
and preprocessing the multispectral image acquired by the remote sensor to be calibrated and the multispectral image acquired by the reference remote sensor to obtain a target image pair.
Based on the target image pairs acquired over multiple days, a sequence of image pairs is determined.
It will be appreciated that the pre-processing includes resolution resampling, rasterization and culling of invalid pixels.
It is to be understood that eliminating invalid pixels includes:
eliminating cloud-contaminated pixels, water-body target pixels, and pixels with a satellite observation zenith angle greater than or equal to 30 degrees.
It can be understood that the invariant pixel detection model is established based on the iteratively reweighted multivariate alteration detection (IR-MAD) method.
It is understood that invariant pixel detection module 602 comprises:
acquiring a detection result of a single pixel in each image pair and calculating the apparent reflectivity of the pixel;
inputting the apparent reflectivity of the pixel of a single pixel in each image pair into an invariant pixel detection model, constructing an MAD variable and constructing an observed value of the single pixel in each image pair based on the MAD variable;
and determining the invariant pixel in each image pair based on the observed value and the invariant probability decision threshold of the single pixel in each image pair.
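The observed value Z used by module 602 is commonly formed as the sum of the squared MAD components standardized by their variances, which is then compared against the chi-square decision threshold discussed earlier; a minimal sketch under that assumption:

```python
# Illustrative construction of the per-pixel observed value Z from the MAD
# variates (assumed precomputed) and the invariant decision against the
# chi-square threshold t.
import numpy as np

def observed_value(mad, mad_var):
    """mad: (n_pixels, n_channels) MAD variates; mad_var: (n_channels,) variances."""
    return np.sum((mad ** 2) / mad_var, axis=1)

def invariant_mask(mad, mad_var, t=1.635):
    """True where the pixel is accepted as invariant (Z below the threshold t)."""
    return observed_value(mad, mad_var) < t
```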
It can be understood that, the spectrum matching module 603 obtains the spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor, including:
obtaining entrance pupil radiance of a remote sensor to be calibrated and entrance pupil radiance of a reference remote sensor;
and determining spectral matching factors of the remote sensor to be calibrated and the reference remote sensor based on the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor.
It is to be understood that the regression module 604 includes:
establishing a linear fitting relation between the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel of each image pair and the apparent reflectivity of the remote sensor to be calibrated of the invariant pixel of the corresponding image pair for orthogonal regression;
and determining the cross calibration coefficient based on the slope and the intercept of the orthogonal regression.
It is to be understood that the regression module 604 further includes:
performing long-time-series cross calibration on each channel of the long-time-series data of the remote sensor to be calibrated;
and determining the long-time-series cross calibration coefficient based on the long-time-series cross calibration results.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A satellite remote sensor cross calibration method based on invariant pixels is characterized by comprising the following steps:
determining a sequence of image pairs, wherein each image pair comprises multispectral images obtained by respectively observing the same observation scene by a remote sensor to be calibrated and a reference remote sensor at the same time on the same day;
obtaining the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into an invariant pixel detection model to obtain an invariant pixel in each image pair;
acquiring spectrum matching factors of a remote sensor to be calibrated and a reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of an invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and performing orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel element in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated to determine a cross calibration coefficient.
2. The invariant pixel-based satellite remote sensor cross-calibration method of claim 1, wherein the determining a sequence of image pairs comprises:
acquiring multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene at the same time on the same day, when the two satellites pass overhead simultaneously;
preprocessing the multispectral image acquired by the remote sensor to be calibrated and the multispectral image acquired by the reference remote sensor to obtain a target image pair;
determining the sequence of image pairs based on the target image pairs acquired over multiple days.
3. The method of claim 2, wherein the preprocessing comprises resolution resampling, rasterization, and culling of invalid pixels.
4. The satellite remote sensor cross-calibration method based on invariant pixels of claim 3, wherein said eliminating invalid pixels comprises:
and eliminating cloud pollution pixels, water body target pixels and pixels with satellite observation zenith angles larger than or equal to 30 degrees.
5. The satellite remote sensor cross calibration method based on invariant pixels of claim 1, wherein the invariant pixel detection model is established based on the iteratively reweighted multivariate alteration detection (IR-MAD) method.
6. The satellite remote sensor cross calibration method based on invariant pixels of claim 5, wherein said obtaining the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into the invariant pixel detection model to obtain the invariant pixel in each image pair comprises:
acquiring the detection result of a single pixel in each image pair and calculating the apparent reflectivity of the pixel;
inputting the apparent reflectivity of the pixel of the single pixel in each image pair into an invariant pixel detection model, constructing an MAD variable and constructing an observation value of the single pixel in each image pair based on the MAD variable;
and determining the invariant pixel in each image pair based on the observed value and the invariant probability decision threshold of the single pixel in each image pair.
7. The satellite remote sensor cross calibration method based on invariant pixels of claim 1, wherein the obtaining of the spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor comprises:
acquiring the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor;
and determining the spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor based on the entrance pupil radiance of the remote sensor to be calibrated and the entrance pupil radiance of the reference remote sensor.
8. The satellite remote sensor cross-calibration method based on invariant pixel according to claim 1, wherein said performing orthogonal regression on the spectrum corrected apparent reflectivity of the reference remote sensor of the invariant pixel of each image pair and the apparent reflectivity of the remote sensor to be calibrated of the invariant pixel of the corresponding image pair to determine the cross-calibration coefficient comprises:
establishing a linear fitting relationship between the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel of each image pair and the apparent reflectivity of the remote sensor to be calibrated of the invariant pixel of the corresponding image pair for orthogonal regression;
and determining the cross calibration coefficient based on the slope and intercept of the orthogonal regression.
9. The satellite remote sensor cross calibration method based on invariant pixels of claim 1, wherein said determining the cross calibration coefficient further comprises:
performing long-time-series cross calibration on each channel of the long-time-series data of the remote sensor to be calibrated;
and determining the long-time-series cross calibration coefficient based on the long-time-series cross calibration results.
10. A satellite remote sensor cross calibration device based on invariant pixels, characterized by comprising:
the acquisition module is used for determining a sequence of image pairs, wherein each image pair comprises multispectral images obtained by the remote sensor to be calibrated and the reference remote sensor respectively observing the same observation scene at the same time on the same day;
the invariant pixel detection module is used for acquiring the apparent reflectivity of a single pixel in each image pair and inputting the apparent reflectivity into the invariant pixel detection model to acquire the invariant pixel in each image pair;
the spectrum matching module is used for acquiring spectrum matching factors of the remote sensor to be calibrated and the reference remote sensor, performing spectrum matching on the apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair based on the spectrum matching factors, and determining the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair;
and the regression module is used for performing orthogonal regression on the spectrum correction apparent reflectivity of the reference remote sensor of the invariant pixel in each image pair and the apparent reflectivity of the corresponding remote sensor to be calibrated to determine a cross calibration coefficient.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202211028600.XA | 2022-08-25 | 2022-08-25 | Satellite remote sensor cross calibration method and device based on invariant pixel
Publications (1)
Publication Number | Publication Date
---|---
CN115452167A (en) | 2022-12-09
Family
ID=84298684
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211028600.XA Pending CN115452167A (en) | 2022-08-25 | 2022-08-25 | Satellite remote sensor cross calibration method and device based on invariant pixel |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115452167A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115615938A (en) * | 2022-12-14 | 2023-01-17 | 天津中科谱光信息技术有限公司 | Water quality analysis method and device based on reflection spectrum and electronic equipment |
CN115615938B (en) * | 2022-12-14 | 2023-03-28 | 天津中科谱光信息技术有限公司 | Water quality analysis method and device based on reflection spectrum and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 