CN116738734A - Regularization constraint-based water transparency fusion calculation method and system


Info

Publication number
CN116738734A
Authority
CN
China
Prior art keywords: transparency, inversion, fusion, satellite, calculation
Prior art date: 2023-06-19
Legal status
Granted
Application number
CN202310727937.8A
Other languages
Chinese (zh)
Other versions
CN116738734B (en)
Inventor
王毅
项杰
周树道
关吉平
赵世军
金赛花
Current Assignee
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date: 2023-06-19
Filing date: 2023-06-19
Publication date: 2023-09-12
Application filed by National University of Defense Technology
Priority to CN202310727937.8A
Publication of CN116738734A
Application granted
Publication of CN116738734B
Legal status: Active
Anticipated expiration


Classifications

    • G06F 30/20: Design optimisation, verification or simulation (G06F 30/00, Computer-aided design [CAD])
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated (G01N 21/00, investigating or analysing materials by the use of optical means)
    • G01N 21/59: Transmissivity
    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization (G06F 17/10, complex mathematical operations)
    • G06F 18/25: Fusion techniques (G06F 18/00, pattern recognition; G06F 18/20, analysing)
    • G01N 2021/1793: Remote sensing
    • Y02A 90/30: Assessment of water resources (Y02A 90/00, technologies having an indirect contribution to adaptation to climate change)

Abstract

The application provides a regularization constraint-based water transparency fusion calculation method and system. The method comprises the following steps: detecting a target water body area with a plurality of detection satellites to obtain a plurality of sets of satellite multichannel detection data; obtaining measured data of the target water body area; acquiring detection data true values for all detection satellites based on all the satellite multichannel detection data; calculating a plurality of inversion transparency values of the target water body area with an inversion algorithm, using the measured data and all the satellite multichannel detection data; performing regularized fusion calculation on all the inversion transparency values and the corresponding detection data true values to obtain a plurality of single-satellite fusion measured values; and taking the plurality of single-satellite fusion measured values as detection data and repeating the inversion algorithm calculation step and the regularized fusion calculation step to obtain a multi-satellite fusion measured value. The transparency inversion calculation method is therefore applicable to measurement and calculation scenarios involving different data types.

Description

Regularization constraint-based water transparency fusion calculation method and system
Technical Field
The application belongs to the technical field of remote sensing data processing, and particularly relates to a regularization constraint-based water transparency fusion calculation method and system.
Background
Transparency is a basic parameter describing the optical properties of a water body. It is related to the composition and concentration of chlorophyll, suspended matter and yellow substance in the water, and is closely linked to the solar radiation at the water surface, the physicochemical properties of the water body, meteorological conditions and the like. The most direct way to obtain the spatio-temporal distribution of water transparency is to measure it at fixed stations from ships at regular times, but this approach only yields transparency at the measurement points and cannot capture seawater transparency characteristics over large spatial and temporal scales. Remote sensing, as a newer observation means, can provide the large-scale spatio-temporal distribution of ocean parameters. In recent years, with the rapid development of remote sensing technology, in particular the development of ocean-colour sensors and the improving accuracy of inversion algorithms, numerous water transparency remote sensing products have become available. However, because of differences in the on-orbit operation and observation parameters of different remote sensing payloads, the influence of cloud, weather and climate conditions on water transparency products, and the lack of a unified standard for verification and evaluation, the popularization and application of water transparency products remain limited.
At the same time, data fusion techniques have gained wide acceptance in many disciplines and have developed considerably in recent years. In one prior-art application, single-band, differential-spectrum and multi-band transparency estimation models are established from satellite hyperspectral remote sensing reflectance and transparency data, and their accuracy is evaluated. Measured hyperspectral remote sensing data are then used to simulate the MERIS and HJ bands, and band-combination transparency models are built and assessed. Finally, HJ data are used to invert transparency and the influencing factors are analysed. These steps can effectively improve the inversion accuracy of transparency based on hyperspectral, high-resolution data. However, many ocean-colour sensors or satellite data sets are not hyperspectral; they may be of other types, such as multispectral data, so the measurement and calculation scenarios to which prior-art transparency inversion methods apply are limited.
Disclosure of Invention
The application provides a regularization constraint-based water transparency fusion calculation method and system, which aim to address the limited range of measurement and calculation scenarios to which existing water transparency methods apply.
In a first aspect, the present application provides a regularization constraint-based water transparency fusion calculation method, which includes the following steps:
detecting a target water body area through a plurality of detection satellites to obtain a plurality of satellite multichannel detection data;
obtaining measured data of the target water body area;
acquiring detection data true values of all the detection satellites based on all the satellite multichannel detection data;
according to the measured data and all the satellite multichannel detection data, respectively calculating a plurality of inversion transparency of the target water body region by using an inversion algorithm;
carrying out regularized fusion calculation by combining all inversion transparency and all corresponding detection data true values to obtain a plurality of single satellite fusion measured values;
and taking the single-satellite fusion measured values as detection data, and repeating the inversion algorithm calculation step and the regularized fusion calculation step to obtain multi-satellite fusion measured values.
Optionally, the calculating the multiple inversion transparency of the target water body area according to the measured data and all the satellite multichannel detection data and using an inversion algorithm includes the following steps:
taking any one of the satellite multichannel detection data as target satellite multichannel detection data;
according to the multichannel detection data of the target satellite, an absorption coefficient and a backscattering coefficient are calculated;
combining the absorption coefficient, the backscattering coefficient and the measured data, and calculating inversion transparency of the target water body region by using an inversion algorithm, wherein the measured data comprises a water body refractive index, a water body reflection coefficient, a water body surface reflectivity and a water body transmissivity;
and selecting one of the rest satellite multichannel detection data as the target satellite multichannel detection data at will, and repeating the calculation steps until inversion calculation is performed on all the satellite multichannel detection data to obtain a plurality of inversion transparency.
Optionally, the calculation formula of the inversion transparency is as follows:
wherein: z is Z d Representing the inversion transparency, a representing the absorption coefficient, b b Representing the backscattering coefficient ρ p Representing the water surface reflectivity, alpha representing the water refractive index, beta representing the water reflection coefficient, C e Represents a contrast threshold, and f represents the water transmittance.
Optionally, the satellite multichannel detection data includes real-time satellite detection data and historical satellite detection data of historical K days, and the regularized fusion calculation is performed by combining all the inversion transparency and all the corresponding true detection data values to obtain a plurality of single-satellite fusion measurement values, including the following steps:
taking any inversion transparency as a target inversion transparency, wherein the target inversion transparency comprises a real-time inversion transparency and the historical inversion transparency of the past M days, and M is less than K;
calculating to obtain an observation covariance matrix according to the real-time inversion transparency and M historical inversion transparency;
calculating a background field according to N historical inversion transparency values and the detection data true value, wherein M is less than N and N is less than K;
calculating a background covariance matrix by combining the detection data true value and the background field;
carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix to obtain a single satellite fusion measured value corresponding to the target inversion transparency;
and selecting one of the rest inversion transparency as the target inversion transparency, and repeating the calculation steps until all inversion transparency are subjected to regularized fusion calculation to obtain a plurality of single satellite fusion measured values.
Optionally, the regularized fusion calculation is performed by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix, so as to obtain a single satellite fusion measurement value corresponding to the target inversion transparency, which comprises the following steps:
constructing a cost function by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix;
deriving the cost function based on a preset fusion precision threshold value so that the cost function takes a minimum value;
simplifying and transposing the cost function after derivation to obtain a single satellite fusion measurement value calculation formula;
and calculating to obtain the single-satellite fusion measured value corresponding to the target inversion transparency by using the single-satellite fusion measured value calculation formula.
Optionally, the expression of the cost function is as follows:
wherein the cost function is expressed in terms of the single satellite fusion measurement value, the background covariance matrix B, the observation covariance matrix R, the background field and the real-time inversion transparency, with i = 1, 2, …, N, where i denotes the i-th channel of the detection satellite and N denotes the maximum number of channels of the detection satellite.
Optionally, the single satellite fusion measurement value is calculated as follows:
optionally, after the regularized fusion calculation is performed by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix, the method further comprises the following steps of:
counting the calculation time of the single satellite fusion measured value;
judging whether the calculated time exceeds a preset time threshold value or not;
and if the calculation time exceeds the time threshold, reconfiguring the calculation grid of the regularized fusion calculation, and recalculating the single satellite fusion measured value after the calculation grid is reconfigured.
Optionally, after the regularized fusion calculation is performed by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix, the method further comprises the following steps of:
identifying a null value region in the single satellite fusion measured value, and calculating the area of the null value region;
judging whether the area of the region exceeds a preset area threshold value or not;
if the area of the region exceeds the area threshold, recalculating to obtain a corrected background field according to N+T historical inversion transparency and the true value of the detection data, wherein T is more than 0 and less than (K-N);
calculating a corrected background covariance matrix by combining the detected data true value and the corrected background field;
and carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the corrected background field and the corrected background covariance matrix to obtain a corrected single satellite fusion measured value corresponding to the target inversion transparency.
In a second aspect, the present application also provides a regularization constraint-based water transparency fusion computing system, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as described in the first aspect when executing the computer program.
The beneficial effects of the application are as follows:
the regularization constraint-based water transparency fusion calculation method provided by the application comprises the following steps: detecting a target water body area through a plurality of detection satellites to obtain a plurality of satellite multichannel detection data; obtaining measured data of a target water body area; acquiring detection data true values of all detection satellites based on all satellite multichannel detection data; respectively calculating a plurality of inversion transparency of the target water body region by using an inversion algorithm according to the measured data and all satellite multichannel detection data; carrying out regularized fusion calculation by combining all inversion transparency and all corresponding detection data true values to obtain a plurality of single-satellite fusion measured values; and taking the plurality of single-satellite fusion measured values as detection data, and repeating the inversion algorithm calculation step and the regularized fusion calculation step to obtain the multi-satellite fusion measured values. In the water transparency calculation process based on data fusion, the multi-channel detection data of a single satellite can be subjected to data fusion, fusion measured values of a plurality of satellites can be used as observed values, the data fusion calculation process is continued, and finally multi-channel fusion measured data of a multi-source satellite can be obtained. Compared with a measurement scene which is simply calculated by inversion based on hyperspectral satellite data, the method can be also suitable for measurement calculation scenes with other types of data such as multispectral data. The adopted data fusion method belongs to the pixel level, and has higher inversion precision in the water transparency inversion calculation process.
Drawings
FIG. 1 is a flow chart of a regularization constraint-based water transparency fusion calculation method.
FIG. 2 is a fusion diagram of single satellite fusion measurement data visualization in the present application.
Fig. 2 (a) is a schematic diagram of background field visualization.
FIG. 2 (b) is a visual representation of real-time inversion transparency.
Fig. 2 (c) is a schematic diagram of visualization of single satellite fusion measurement data after data fusion.
Detailed Description
The application discloses a regularization constraint-based water transparency fusion calculation method.
Referring to fig. 1, the regularization constraint-based water transparency fusion calculation method specifically includes the following steps:
s101, detecting a target water body area through a plurality of detection satellites to obtain a plurality of satellite multichannel detection data.
The detection data obtained by the detection satellites can be hyperspectral data or multispectral data. The satellite multichannel detection data are secondary satellite data, and each detection satellite corresponds to exactly one set of satellite multichannel detection data. The term "multichannel" only indicates that a detection satellite may provide several data bands; a single data band is also possible.
S102, obtaining measured data of a target water body area.
The measured data of the target water body area can be obtained by in-situ ship measurements in the target water body area, and generally include the water refractive index, the water reflection coefficient, the water surface reflectivity, the water transmittance and the like.
S103, acquiring detection data true values of all detection satellites based on all satellite multichannel detection data.
The satellite multichannel detection data is subjected to quality control processing based on the satellite type of the detection satellite corresponding to the satellite multichannel detection data, so that a data true value is obtained and is used as the detection data true value of the detection satellite.
S104, respectively calculating a plurality of inversion transparency of the target water body region according to the measured data and all satellite multichannel detection data by using an inversion algorithm.
For each set of satellite multichannel detection data, the inversion transparency of the target water body area is calculated with an inversion algorithm in combination with the measured data. Inversion algorithms include single-channel, dual-channel and multi-channel algorithms. The single-channel inversion algorithm is simple and feasible but has relatively low accuracy; dual-channel or multi-channel inversion algorithms require more data processing but yield more accurate transparency estimates.
S105, regularized fusion calculation is carried out by combining all inversion transparency and all corresponding detection data true values, and a plurality of single-satellite fusion measured values are obtained.
All inversion transparency data and the corresponding detection data true values can be preprocessed, including removing outliers, filling missing values and normalization. Regularized fusion calculation is then performed on the preprocessed data to obtain a plurality of single-satellite fusion measured values. The regularized fusion calculation can use methods such as weighted averaging, least squares or a neural network, chosen according to the actual situation.
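As an illustration only, the following sketch shows one way the preprocessing and a simple weighted-average fusion could be implemented; the function names, the 3-sigma outlier rule and the inverse-variance weights are assumptions made for this example and are not prescribed by the method described above.

```python
import numpy as np


def preprocess(field: np.ndarray) -> np.ndarray:
    """Remove outliers (3-sigma rule), fill missing values with the mean,
    and min-max normalize a 2-D transparency field."""
    field = field.astype(float).copy()
    mean, std = np.nanmean(field), np.nanstd(field)
    field[np.abs(field - mean) > 3 * std] = np.nan   # outlier removal
    field[np.isnan(field)] = np.nanmean(field)       # missing-value filling
    lo, hi = field.min(), field.max()
    return (field - lo) / (hi - lo + 1e-12)          # normalization


def weighted_fusion(inversions: list, variances: list) -> np.ndarray:
    """Fuse several preprocessed inversion-transparency fields with
    inverse-variance weights (one simple option among the listed methods)."""
    weights = np.array([1.0 / v for v in variances])
    weights /= weights.sum()
    stacked = np.stack([preprocess(f) for f in inversions])
    return np.tensordot(weights, stacked, axes=1)    # weighted-average field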
S106, taking the plurality of single-satellite fusion measured values as detection data, and repeating the inversion algorithm calculation step and the regularized fusion calculation step to obtain a multi-satellite fusion measured value.
The plurality of single-satellite fusion measured values can be used as the satellite multichannel detection data of step S104, and steps S104 and S105 are executed again; this yields the multi-satellite fusion measured value obtained after fusing the multichannel data of the multi-source satellites, which is the final measured value of the water transparency of the target water body area derived from the multi-source satellites.
The implementation principle of the embodiment is as follows:
A target water body area is detected with a plurality of detection satellites to obtain a plurality of sets of satellite multichannel detection data; measured data of the target water body area are obtained; detection data true values for all detection satellites are acquired based on all the satellite multichannel detection data; a plurality of inversion transparency values of the target water body area are calculated with an inversion algorithm, using the measured data and all the satellite multichannel detection data; regularized fusion calculation is performed on all the inversion transparency values and the corresponding detection data true values to obtain a plurality of single-satellite fusion measured values; and the plurality of single-satellite fusion measured values are taken as detection data, and the inversion calculation step and the regularized fusion calculation step are repeated to obtain a multi-satellite fusion measured value. In this data-fusion-based water transparency calculation, the multichannel detection data of a single satellite are first fused, the resulting fusion measured values of several satellites are then used as observations, and the data fusion calculation is continued, finally yielding multichannel fused measurement data from multi-source satellites. Compared with measurement scenarios in which inversion is calculated solely from hyperspectral satellite data, the method is also applicable to measurement and calculation scenarios with other data types, such as multispectral data. The data fusion method adopted operates at the pixel level and achieves higher inversion accuracy in the water transparency inversion calculation.
In one embodiment, step S104, namely, calculating a plurality of inversion transparency of the target water body region according to the measured data and all satellite multichannel detection data by using an inversion algorithm, specifically includes the following steps:
taking any one satellite multichannel detection data as target satellite multichannel detection data;
according to the multichannel detection data of the target satellite, an absorption coefficient and a backscattering coefficient are obtained through calculation;
combining the absorption coefficient, the backscattering coefficient and the measured data, and calculating by using an inversion algorithm to obtain inversion transparency of the target water body region, wherein the measured data comprises a water body refractive index, a water body reflection coefficient, a water body surface reflectivity and a water body transmissivity;
and selecting one of the remaining satellite multichannel detection data as target satellite multichannel detection data at will, and repeating the calculation steps until all satellite multichannel detection data are subjected to inversion calculation to obtain a plurality of inversion transparency.
In this embodiment, the absorption coefficient and the backscattering coefficient may be calculated from the target satellite multichannel detection data using an inversion algorithm. One specific approach is to calculate the total reflectance of the water body from the reflectances of the different bands and then derive the absorption coefficient and the backscattering coefficient from it.
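As a rough illustration of such a split, the sketch below recovers the ratio u = b_b / (a + b_b) from below-surface remote-sensing reflectance using the widely used Gordon quadratic r_rs ≈ g0·u + g1·u²; the coefficients g0 and g1 and the placeholder absorption value are assumptions for this example, and the patent's own coefficient formulas are not reproduced here.

```python
import numpy as np

G0, G1 = 0.0949, 0.0794      # Gordon quadratic coefficients (assumed here)
A_PLACEHOLDER = 0.06         # illustrative total absorption [1/m], not from the patent


def split_iops(rrs: np.ndarray, a: float = A_PLACEHOLDER):
    """Solve r_rs = G0*u + G1*u**2 for u = b_b/(a + b_b), then split into
    (a, b_b) given an assumed absorption coefficient a."""
    u = (-G0 + np.sqrt(G0 ** 2 + 4.0 * G1 * rrs)) / (2.0 * G1)
    b_b = u * a / np.maximum(1.0 - u, 1e-6)
    return a, b_b
```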
Using the absorption coefficient, the backscattering coefficient and the measured data, the inversion transparency of the target water body area is then calculated with an inversion algorithm. First, an inversion formula for water transparency is constructed in advance, as follows:
wherein: Z_d represents the inversion transparency, C_r represents the apparent contrast of the target, C_0 represents the inherent contrast, c represents the beam attenuation coefficient, and K_d represents the downwelling irradiance attenuation coefficient.
For fully diffuse light, the downwelling irradiance attenuation coefficient and the upwelling irradiance attenuation coefficient are approximately equal, and the distribution function of the downwelling light field satisfies δ_d(z) = 2, so the inversion formula can be rewritten as follows:
wherein: Z_d represents the inversion transparency, a represents the absorption coefficient, b_b represents the backscattering coefficient, ρ_p represents the water surface reflectivity, α represents the water refractive index, β represents the water reflection coefficient, C_e represents the contrast threshold, and R represents the optical depth of the water body.
Combining the above inversion formulas, the final calculation formula for the inversion transparency is obtained as follows:
wherein: Z_d represents the inversion transparency, a represents the absorption coefficient, b_b represents the backscattering coefficient, ρ_p represents the water surface reflectivity, α represents the water refractive index, β represents the water reflection coefficient, C_e represents the contrast threshold, and f represents the water transmittance.
In one embodiment, the satellite multichannel detection data includes real-time satellite detection data and historical satellite detection data of historical K days, and step S105 is to perform regularized fusion calculation by combining all inversion transparency and all corresponding detection data true values to obtain a plurality of single-satellite fusion measurement values, and specifically includes the following steps:
taking any inversion transparency as a target inversion transparency, wherein the target inversion transparency comprises a real-time inversion transparency and the historical inversion transparency of the past M days, and M is less than K;
calculating according to the real-time inversion transparency and the M historical inversion transparency to obtain an observation covariance matrix;
calculating a background field according to the N historical inversion transparency values and the detection data true value, wherein M is less than N and N is less than K;
calculating by combining the true value of the detection data and the background field to obtain a background covariance matrix;
carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix to obtain a single satellite fusion measured value corresponding to the target inversion transparency;
and selecting one of the remaining inversion transparency as a target inversion transparency, and repeating the calculation steps until all inversion transparency are subjected to regularized fusion calculation to obtain a plurality of single-satellite fusion measured values.
In this embodiment, the real-time satellite detection data are the satellite detection data of the day on which the data fusion is performed, and the historical satellite detection data comprise all historical satellite detection data of the K days preceding that date, i.e. the set of the K days of data closest to the fusion date. K can be chosen in the interval [30, 60], with K = 40 preferred. The historical inversion transparency of the past M days refers to the inversion transparency data obtained by inversion calculation from the historical satellite detection data of those M days combined with the measured data; it is likewise the set of the M calendar days of inversion transparency closest to the fusion date, with M = 10 preferred.
The observation covariance matrix calculated from the real-time inversion transparency and the M historical inversion transparency values can be obtained as follows. The M historical inversion transparency values are averaged day by day, or weighted-averaged with weights set according to the quality of the observation data. The day-by-day averages are then averaged over the M-day period to obtain an M-day mean result, again using simple or weighted averaging. The real-time satellite detection data are fused with the M-day mean result to obtain the final transparency result of that day's data fusion; the fusion can use a simple average, a weighted average or a more complex method, chosen according to the actual situation. The observation covariance matrix reflects the uncertainty of the observation data and can finally be calculated with statistical methods, for example from the standard deviation or variance of the inversion transparency data, for use in subsequent data processing and analysis.
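A minimal sketch of such a computation is given below; treating each grid cell as a variable and each of the M days as a sample of the observation error is an assumption made for this example, since the text above only states that standard-deviation or variance statistics are used.

```python
import numpy as np


def observation_covariance(hist_fields: np.ndarray) -> np.ndarray:
    """hist_fields: array of shape (M, H, W) holding M daily-averaged
    historical inversion-transparency fields. Returns the (H*W, H*W)
    sample covariance matrix R of the anomalies about the M-day mean."""
    m = hist_fields.shape[0]
    flat = hist_fields.reshape(m, -1)                  # one row per day
    anomalies = flat - flat.mean(axis=0, keepdims=True)
    return anomalies.T @ anomalies / (m - 1)           # sample covariance
```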
The N historical inversion transparency values are the set of the N calendar days of historical inversion transparency closest to the fusion date, with N = 30 preferred. The background field calculated from the N historical inversion transparency values and the detection data true values can be obtained as follows. The N historical inversion transparency values and the corresponding detection data true values are preprocessed, including outlier removal, missing-value filling and normalization, to reduce noise and errors. The background field is then calculated from the preprocessed historical inversion transparency data and detection data true values using statistical methods, machine learning methods and the like; for example, it can be computed from statistical features of the historical data such as the mean, variance and correlation coefficients, or a background-field model can be trained with machine learning methods such as regression models or neural networks on the characteristics of the historical data and used for prediction.
The background covariance matrix is calculated by combining the detection data true values with the background field, for example as follows. The detection data true values are compared with the background field to obtain the background field deviation, which represents the difference between the true detection data and the background field and reflects the uncertainty of the background field. From this deviation, the background covariance matrix can be calculated using statistical methods and matrix operations, for example from statistical features of the background field deviation such as its mean, variance and correlation coefficients.
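The sketch below shows one way these two quantities could be formed; taking the background as the mean of the N historical fields and B as the sample covariance of the truth-minus-background deviations are assumptions consistent with, but not dictated by, the description above.

```python
import numpy as np


def background_and_B(hist_fields: np.ndarray, truth_fields: np.ndarray):
    """hist_fields: (N, H, W) historical inversion-transparency fields.
    truth_fields: (N, H, W) corresponding detection-data true values.
    Returns the background field (H, W) and the background covariance
    matrix B of shape (H*W, H*W)."""
    background = hist_fields.mean(axis=0)
    dev = (truth_fields - background).reshape(truth_fields.shape[0], -1)
    dev = dev - dev.mean(axis=0, keepdims=True)       # centre the deviations
    B = dev.T @ dev / (dev.shape[0] - 1)              # sample covariance
    return background, B
```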
Referring to fig. 2, fig. 2 is a fusion diagram of single satellite fusion measurement data visualization, in which fig. 2 (a) is a background field visualization diagram, fig. 2 (b) is a visualization diagram of real-time inversion transparency, and fig. 2 (c) is a visualization diagram of single satellite fusion measurement data after data fusion.
In one embodiment, the step of performing regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix to obtain a single satellite fusion measurement value corresponding to the target inversion transparency specifically comprises the following steps:
constructing a cost function by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix;
deriving a cost function based on a preset fusion precision threshold value so that the cost function takes a minimum value;
simplifying and transposing the cost function after derivation to obtain a single satellite fusion measurement value calculation formula;
and calculating by using a single-satellite fusion measurement value calculation formula to obtain the single-satellite fusion measurement value corresponding to the target inversion transparency.
In this embodiment, the expression of the cost function is as follows:
wherein the cost function is expressed in terms of the single satellite fusion measurement value, the background covariance matrix B, the observation covariance matrix R, the background field and the real-time inversion transparency, with i = 1, 2, …, N, where i denotes the i-th channel of the detection satellite corresponding to the real-time inversion transparency and N denotes the maximum number of channels of that detection satellite.
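A plausible reconstruction of the cost function, following the standard variational-fusion form implied by the quantities listed above and using assumed symbols (x for the single satellite fusion measurement value, x_b for the background field, y_i for the real-time inversion transparency of channel i), is:

```latex
% Assumed standard variational (3D-Var-like) cost function; the notation x, x_b, y_i
% is introduced here for illustration and does not come from the patent text.
J(x) \;=\; \tfrac{1}{2}\,(x - x_b)^{\mathrm T} B^{-1} (x - x_b)
\;+\; \tfrac{1}{2}\sum_{i=1}^{N} (x - y_i)^{\mathrm T} R^{-1} (x - y_i)
```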
The preset fusion precision threshold can be taken as 2%. The cost function is then differentiated so that it attains its minimum, giving the following condition:
where δ represents a derivative symbol.
Simplifying the derivative formula to obtain the following simplified formula:
the above simplified formula is transposed to obtain the following transposed formula:
Finally, the transposed formula is rearranged to obtain the single satellite fusion measured value calculation formula, as follows:
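An assumed reconstruction of this derivation and of the resulting fusion value, shown for a single observation y and using the same illustrative notation as above, is:

```latex
% Assumed derivation: set the gradient of J to zero, simplify, and rearrange.
\delta J = B^{-1}(x - x_b) + R^{-1}(x - y) = 0
\;\Longrightarrow\;
x = \left(B^{-1} + R^{-1}\right)^{-1}\left(B^{-1} x_b + R^{-1} y\right)
  = x_b + B\,(B + R)^{-1}\,(y - x_b)
```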
in one embodiment, the regularized fusion calculation is performed by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix in the step, and the method specifically comprises the following steps after obtaining the single satellite fusion measured value corresponding to the target inversion transparency:
counting the calculation time of the single satellite fusion measured value;
judging whether the calculated time exceeds a preset time threshold;
if the calculation time exceeds the time threshold, the calculation grid for regularized fusion calculation is reconfigured, and single satellite fusion measurement values are recalculated after the calculation grid is reconfigured.
In this embodiment, because the amount of data to be processed in the fusion is large in practical applications, one iteration of the optimization algorithm generally needs two to three days to complete a fusion, so the preset time threshold can be set to 3 days. The calculation time of the single satellite fusion measured value is counted and compared with the preset time threshold; if it does not exceed the threshold, the single satellite fusion measured value is obtained normally. If the calculation time exceeds the time threshold, the computation grid of the regularized fusion calculation must be reconfigured according to factors such as spatial resolution, temporal resolution and data density, and the single satellite fusion measured value is recalculated on the reconfigured grid.
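Purely as an illustration of this guard, the sketch below times one fusion run and coarsens the grid before retrying; the 3-day threshold comes from the text above, while the block-mean coarsening and the factor of 2 are assumptions.

```python
import time
import numpy as np

TIME_THRESHOLD_S = 3 * 24 * 3600   # 3-day threshold from the description above


def fuse_with_guard(fuse_fn, field: np.ndarray, factor: int = 2):
    """Run fuse_fn on the field; if it takes too long, coarsen the grid
    by block-averaging and recompute on the coarser grid."""
    start = time.time()
    result = fuse_fn(field)
    if time.time() - start > TIME_THRESHOLD_S:
        h, w = field.shape
        trimmed = field[:h - h % factor, :w - w % factor]
        coarse = trimmed.reshape(h // factor, factor,
                                 w // factor, factor).mean(axis=(1, 3))
        result = fuse_fn(coarse)   # recalculate on the reconfigured grid
    return result
```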
In one embodiment, the regularized fusion calculation is performed by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix in the step, and the method specifically comprises the following steps after obtaining the single satellite fusion measured value corresponding to the target inversion transparency:
identifying a null region in the single satellite fusion measured value, and calculating the area of the null region;
judging whether the area of the region exceeds a preset area threshold value;
if the area of the region exceeds the area threshold value, recalculating to obtain a corrected background field according to N+T historical inversion transparency and the true value of the detection data, wherein T is more than 0 and less than (K-N);
calculating by combining the true value of the detection data and the corrected background field to obtain a corrected background covariance matrix;
and carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the corrected background field and the corrected background covariance matrix to obtain a corrected single satellite fusion measured value corresponding to the target inversion transparency.
In this embodiment, a null region is a region of the data image represented by the single satellite fusion measured value that contains no data after processing such as quality control and cloud detection. The area threshold may be set to 10 km × 10 km, since 10 km × 10 km is the largest region that the fusion can effectively fill. If the fusion cannot effectively fill such a region, additional historical inversion transparency values (those of T further days) are added, a corrected background field is recalculated, and regularized fusion calculation is finally performed by combining the real-time inversion transparency, the observation covariance matrix, the corrected background field and the corrected background covariance matrix to obtain the corrected single satellite fusion measured value corresponding to the target inversion transparency, with T = 5 preferred.
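A minimal sketch of the null-region check is given below; counting all empty (NaN) pixels as a single region and assuming a known grid-cell size are simplifications made for this example, since the text does not specify how the region area is computed.

```python
import numpy as np

AREA_THRESHOLD_KM2 = 10 * 10    # 10 km x 10 km threshold from the description


def needs_background_correction(fused: np.ndarray, cell_km: float = 1.0) -> bool:
    """Return True if the total empty area in the fused field exceeds the
    area threshold, i.e. the background field should be recomputed with
    T additional days of historical inversion transparency."""
    null_area_km2 = np.isnan(fused).sum() * cell_km ** 2
    return null_area_km2 > AREA_THRESHOLD_KM2
```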
The application also discloses a regularization constraint-based water transparency fusion computing system, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the regularization constraint-based water transparency fusion computing method in all the embodiments is realized when the processor executes the computer program.
The implementation principle of the embodiment is as follows:
Through invocation of the program, a plurality of detection satellites can be used to detect the target water body area and obtain a plurality of sets of satellite multichannel detection data; measured data of the target water body area are obtained; detection data true values for all detection satellites are acquired based on all the satellite multichannel detection data; a plurality of inversion transparency values of the target water body area are calculated with an inversion algorithm, using the measured data and all the satellite multichannel detection data; regularized fusion calculation is performed on all the inversion transparency values and the corresponding detection data true values to obtain a plurality of single-satellite fusion measured values; and the plurality of single-satellite fusion measured values are taken as detection data, and the inversion calculation step and the regularized fusion calculation step are repeated to obtain a multi-satellite fusion measured value. In this data-fusion-based water transparency calculation, the multichannel detection data of a single satellite are first fused, the resulting fusion measured values of several satellites are then used as observations, and the data fusion calculation is continued, finally yielding multichannel fused measurement data from multi-source satellites. Compared with measurement scenarios in which inversion is calculated solely from hyperspectral satellite data, the method is also applicable to measurement and calculation scenarios with other data types, such as multispectral data. The data fusion method adopted operates at the pixel level and achieves higher inversion accuracy in the water transparency inversion calculation.
Those of ordinary skill in the art will appreciate that: the discussion of any of the embodiments above is merely exemplary and is not intended to suggest that the scope of protection of the application is limited to these examples; the technical features of the above embodiments or in the different embodiments may also be combined within the idea of the application, the steps may be implemented in any order and there are many other variations of the different aspects of one or more embodiments of the application as above, which are not provided in detail for the sake of brevity.
One or more embodiments of the present application are intended to embrace all such alternatives, modifications and variations as fall within the broad scope of the present application. Accordingly, any omissions, modifications, equivalents, improvements and others which are within the spirit and principles of the one or more embodiments of the application are intended to be included within the scope of the application.

Claims (10)

1. A regularization constraint-based water transparency fusion calculation method is characterized by comprising the following steps:
detecting a target water body area through a plurality of detection satellites to obtain a plurality of satellite multichannel detection data;
obtaining measured data of the target water body area;
acquiring detection data true values of all the detection satellites based on all the satellite multichannel detection data;
according to the measured data and all the satellite multichannel detection data, respectively calculating a plurality of inversion transparency of the target water body region by using an inversion algorithm;
carrying out regularized fusion calculation by combining all inversion transparency and all corresponding detection data true values to obtain a plurality of single satellite fusion measured values;
and taking the single-satellite fusion measured values as detection data, and repeating the inversion algorithm calculation step and the regularized fusion calculation step to obtain multi-satellite fusion measured values.
2. The regularization constraint-based water transparency fusion computing method of claim 1 wherein the computing the plurality of inversion transparency values for the target water region from the measured data and all of the satellite multi-channel probe data using an inversion algorithm, respectively, comprises the steps of:
taking any one of the satellite multichannel detection data as target satellite multichannel detection data;
according to the multichannel detection data of the target satellite, an absorption coefficient and a backscattering coefficient are calculated;
combining the absorption coefficient, the backscattering coefficient and the measured data, and calculating inversion transparency of the target water body region by using an inversion algorithm, wherein the measured data comprises a water body refractive index, a water body reflection coefficient, a water body surface reflectivity and a water body transmissivity;
and selecting one of the rest satellite multichannel detection data as the target satellite multichannel detection data at will, and repeating the calculation steps until inversion calculation is performed on all the satellite multichannel detection data to obtain a plurality of inversion transparency.
3. The regularization constraint-based water transparency fusion computing method of claim 2, wherein the inversion transparency is computed as:
wherein: z is Z d Representing the inversion transparency, a representing the absorption coefficient, b b Representing the backscattering coefficient ρ p Representing the water surface reflectivity, alpha representing the water refractive index, beta representing the water reflection coefficient, C e Represents a contrast threshold, and f represents the water transmittance.
4. The regularization constraint-based water transparency fusion computing method of claim 1 wherein the satellite multichannel probe data includes real-time satellite probe data and historical satellite probe data for historical K days, the regularization fusion computing combining all the inversion transparency and all corresponding true values of the probe data to obtain a plurality of single satellite fusion measurements includes the steps of:
taking any inversion transparency as a target inversion transparency, wherein the target inversion transparency comprises a real-time inversion transparency and the historical inversion transparency of the past M days, and M is less than K;
calculating to obtain an observation covariance matrix according to the real-time inversion transparency and M historical inversion transparency;
calculating a background field according to N historical inversion transparency values and the detection data true value, wherein M is less than N and N is less than K;
calculating a background covariance matrix by combining the detection data true value and the background field;
carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix to obtain a single satellite fusion measured value corresponding to the target inversion transparency;
and selecting one of the rest inversion transparency as the target inversion transparency, and repeating the calculation steps until all inversion transparency are subjected to regularized fusion calculation to obtain a plurality of single satellite fusion measured values.
5. The regularization constraint-based water transparency fusion computing method of claim 4 wherein the performing regularized fusion computing in combination with the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix to obtain a single satellite fusion measurement corresponding to the target inversion transparency comprises the steps of:
constructing a cost function by combining the real-time inversion transparency, the observation covariance matrix, the background field and the background covariance matrix;
deriving the cost function based on a preset fusion precision threshold value so that the cost function takes a minimum value;
simplifying and transposing the cost function after derivation to obtain a single satellite fusion measurement value calculation formula;
and calculating to obtain the single-satellite fusion measured value corresponding to the target inversion transparency by using the single-satellite fusion measured value calculation formula.
6. The regularization constraint-based water transparency fusion computing method of claim 5, wherein the cost function is expressed as follows:
wherein the cost function is expressed in terms of the single satellite fusion measurement value, the background covariance matrix B, the observation covariance matrix R, the background field and the real-time inversion transparency, with i = 1, 2, …, N, where i denotes the i-th channel of the detection satellite and N denotes the maximum number of channels of the detection satellite.
7. The regularization constraint-based water transparency fusion computing method of claim 6, wherein the single satellite fusion measurement value is computed as:
8. The regularization constraint-based water transparency fusion computing method of claim 4 further comprising the steps of, after said performing regularized fusion computing in combination with said real-time inversion transparency, said observation covariance matrix, said background field, and said background covariance matrix, obtaining single satellite fusion measurements corresponding to said target inversion transparency:
counting the calculation time of the single satellite fusion measured value;
judging whether the calculated time exceeds a preset time threshold value or not;
and if the calculation time exceeds the time threshold, reconfiguring the calculation grid of the regularized fusion calculation, and recalculating the single satellite fusion measured value after the calculation grid is reconfigured.
9. The regularization constraint-based water transparency fusion computing method of claim 4 further comprising the steps of, after said performing regularized fusion computing in combination with said real-time inversion transparency, said observation covariance matrix, said background field, and said background covariance matrix, obtaining single satellite fusion measurements corresponding to said target inversion transparency:
identifying a null value region in the single satellite fusion measured value, and calculating the area of the null value region;
judging whether the area of the region exceeds a preset area threshold value or not;
if the area of the region exceeds the area threshold, recalculating to obtain a corrected background field according to N+T historical inversion transparency and the true value of the detection data, wherein T is more than 0 and less than (K-N);
calculating a corrected background covariance matrix by combining the detected data true value and the corrected background field;
and carrying out regularized fusion calculation by combining the real-time inversion transparency, the observation covariance matrix, the corrected background field and the corrected background covariance matrix to obtain a corrected single satellite fusion measured value corresponding to the target inversion transparency.
10. A regularization constraint-based water transparency fusion computing system comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-9 when executing the computer program.
CN202310727937.8A (filed 2023-06-19, priority 2023-06-19): Regularization constraint-based water transparency fusion calculation method and system. Status: Active. Granted as CN116738734B.

Priority Applications (1)

Application Number / Priority Date / Filing Date / Title
CN202310727937.8A / 2023-06-19 / 2023-06-19 / Regularization constraint-based water transparency fusion calculation method and system


Publications (2)

Publication Number / Publication Date
CN116738734A: 2023-09-12
CN116738734B: 2024-04-09

Family

ID=87916551

Family Applications (1)

Application Number / Title / Priority Date / Filing Date
CN202310727937.8A (Active, granted as CN116738734B) / Regularization constraint-based water transparency fusion calculation method and system / 2023-06-19 / 2023-06-19

Country Status (1)

Country Link
CN (1) CN116738734B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110095437A (en) * 2019-05-20 2019-08-06 中国人民解放军国防科技大学 Regional seawater transparency real-time calculation method
US20200025613A1 (en) * 2018-03-27 2020-01-23 Flying Gybe Inc. Hyperspectral sensing system and processing methods for hyperspectral data
CN112345499A (en) * 2020-10-07 2021-02-09 大连理工大学 International boundary lake transparency inversion method based on multi-source remote sensing satellite
CN112904375A (en) * 2021-01-18 2021-06-04 中国人民解放军国防科技大学 Novel multi-source satellite altimeter fusion method
CN113344043A (en) * 2021-05-21 2021-09-03 北京工业大学 River turbidity monitoring method based on self-organizing multi-channel deep learning network
CN113762467A (en) * 2021-08-12 2021-12-07 生态环境部卫星环境应用中心 Method for obtaining near-ground ozone concentration based on ultraviolet and visible hyperspectrum
CN114894719A (en) * 2022-03-16 2022-08-12 自然资源部国土卫星遥感应用中心 Satellite water transparency inversion method based on wave band remote sensing reflectivity ratio
CN115204618A (en) * 2022-06-22 2022-10-18 中国气象科学研究院 CCMVS regional carbon source convergence inversion evaluation system
CN115345238A (en) * 2022-08-17 2022-11-15 中国人民解放军61741部队 Method and device for generating seawater transparency fusion data
CN115561176A (en) * 2022-10-13 2023-01-03 中电莱斯信息系统有限公司 Water quality inversion method based on feature adaptive operation and machine learning fusion
CN115830473A (en) * 2023-02-20 2023-03-21 江苏省生态环境监测监控有限公司 Water quality inversion method and system based on satellite remote sensing and automatic monitoring


Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Y. Wang and S. Zhou, "Advance Algorithms of Secchi depth Remote Sensing", 2022 International Conference on Computer Engineering and Artificial Intelligence (ICCEAI), 17 August 2022 *
任尚书, 周树道 et al., "Design of a photometric seawater transparency measuring instrument based on a micro/nano array", Advances in Marine Science, vol. 37, no. 1, 15 January 2019
何贤强, 潘德炉, 毛志华, 朱乾坤, "Model study on retrieving seawater transparency from SeaWiFS data", Acta Oceanologica Sinica (Chinese edition), no. 05, 13 September 2004
何贤强, 潘德炉, 黄二辉, 赵艳玲, "Satellite remote sensing monitoring of transparency in the China seas", Engineering Sciences (China), no. 09, 30 September 2004
崔建勇, 刘晓东, 岳增友, 李连伟, "Fusion of multi-source ocean remote sensing chlorophyll data", Remote Sensing Information, no. 03, 20 June 2020
林志贵, 徐立中, 沈祖诒, 黄凤辰, "Application of information fusion in water environment monitoring", Automation in Water Resources and Hydrology, no. 02, 25 June 2003
王晓菲, 张亭禄, 田林, 施英妮, "Remote sensing inversion and fusion of seawater transparency in the Northwest Pacific", Periodical of Ocean University of China (Natural Science Edition), no. 12, 15 December 2016
王燕红, 陈义兰, 周兴华, 杨磊, 付延光, "Shallow-water depth inversion around islands and reefs from remote sensing based on a polynomial regression model", Acta Oceanologica Sinica, no. 03, 15 March 2018

Also Published As

Publication number Publication date
CN116738734B (en) 2024-04-09


Legal Events

Code / Title
PB01 / Publication
SE01 / Entry into force of request for substantive examination
GR01 / Patent grant