CN112364289B - Method for extracting water body information through data fusion - Google Patents


Info

Publication number
CN112364289B
CN112364289B (application CN202011201586.XA)
Authority
CN
China
Prior art keywords
water body
data
image data
coefficient
preprocessing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011201586.XA
Other languages
Chinese (zh)
Other versions
CN112364289A (en)
Inventor
柯樱海
吕明苑
李小娟
洪剑明
郭琳
朱丽娟
王展鹏
宫辉力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capital Normal University
Original Assignee
Capital Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capital Normal University
Priority to CN202011201586.XA priority Critical patent/CN112364289B/en
Publication of CN112364289A publication Critical patent/CN112364289A/en
Application granted granted Critical
Publication of CN112364289B publication Critical patent/CN112364289B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06Energy or water supply
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features


Abstract

The invention discloses a method for extracting water body information through data fusion, comprising the following steps: S1, acquiring optical image data and SAR observation image data covering the study area; S2, preprocessing the optical image data of S1; S3, obtaining the surface reflectance value of each pixel from the preprocessing result of S2, computing water body indices by band calculation, and selecting the water body index map with the best extraction effect; S4, preprocessing the SAR observation image data of S1 to obtain a backscattering coefficient map of the ground objects; and S5, establishing a relationship between the water body index map determined in S3 and the backscattering coefficient map obtained in S4, and performing data fusion to obtain the water body information. The fusion method can acquire water body information scientifically, accurately and quickly, capture the spatio-temporal variation of the water body, and provide a theoretical basis for the effective utilization and reasonable planning of water resources.

Description

Method for extracting water body information through data fusion
Technical Field
The invention relates to the field of water resource management, in particular to a method for extracting water body information through data fusion.
Background
Water is the source of life, and water resources play a particularly important role in the development of a country and society. Traditional water body information acquisition mainly relies on analyzing data collected by manual field monitoring and hydrological monitoring stations. Although field monitoring data are the most accurate, the monitoring period is long, the cost is high, and the monitoring range is limited. In addition, water bodies change strongly with the seasons and vary markedly within a year, which makes it difficult to obtain long-time-series, large-area water body information.
Remote sensing technology is applied ever more widely to water body monitoring. Remote sensing images offer wide coverage, a short revisit period, the ability to acquire water body information with high accuracy and high spatio-temporal resolution, no limitation by the terrain of the study area, and no damage to the study site. At present, water extraction techniques can be classified by image type into optical image extraction, SAR image extraction, and extraction by fusing the two. Optical images carry rich spectral band information, but they are affected by weather conditions: under cloudy or thunderstorm conditions the image quality is poor and water body information cannot be extracted. SAR offers all-weather, day-and-night monitoring, can penetrate cloud and fog, and is not affected by weather, but water extraction from SAR is easily disturbed by terrain and speckle noise, so the accuracy of the extraction result is low.
To address these problems, the invention provides a method for extracting water body information through data fusion. Through data fusion, the method establishes a relationship between the water body index computed from the optical image and the backscattering coefficient obtained by preprocessing the SAR image, thereby overcoming both the weather limitation of optical images and the high noise of SAR images, and yielding water body information with high accuracy and high spatio-temporal resolution. With this method the water body can be identified and located, the spatial position of the water body and the spatio-temporal variation of the water area can be obtained, a basis can be provided for monitoring and managing water resources, and decision-making services can be provided for departments such as water conservancy and resource and environment administration.
Disclosure of Invention
The invention aims to fuse two kinds of remote sensing images to obtain water body information with high spatio-temporal resolution and to characterize the spatio-temporal variation of the water body. To this end it provides a method for extracting water body information through data fusion that effectively fuses the two data sources to obtain high-accuracy water body information.
In order to achieve the purpose, the method for extracting the water body information through data fusion provided by the invention comprises the following steps:
s1, acquiring optical image data and SAR observation image data covering the study area; to obtain good optical image data, the weather on the day of optical acquisition should be clear;
s2, performing radiometric calibration, atmospheric correction and geometric correction preprocessing on the optical image data of S1. Radiometric calibration establishes a quantitative relationship between the spectral radiance of the ground objects and the DN value obtained by the sensor, and mainly relies on the formula:
L=M×Q+A
where L is the converted radiance value, M the image gain, Q the satellite payload observation (DN value), and A the image bias. The purpose of atmospheric correction is to eliminate the influence of gases and aerosols in the atmosphere and obtain the true reflectance and radiance of the ground objects.
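As an illustration, the calibration formula above is applied per pixel. The sketch below uses illustrative placeholder gain and bias values, not actual sensor metadata:

```python
# Minimal sketch of the radiometric-calibration step L = M*Q + A.
# The gain (M) and bias (A) values are illustrative placeholders,
# not actual Landsat metadata values.
import numpy as np

def dn_to_radiance(dn, gain, bias):
    """Convert raw DN values (Q) to radiance: L = M*Q + A."""
    return gain * dn.astype(np.float64) + bias

dn = np.array([[120, 340], [560, 1023]], dtype=np.uint16)
radiance = dn_to_radiance(dn, gain=0.01, bias=-5.0)
print(radiance[0, 0])  # 0.01*120 - 5.0 = -3.8
```

In practice the gain and bias come from the per-band metadata shipped with the image product.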
S3, acquiring a surface reflectance value on each pixel based on the preprocessing result in the S2, acquiring a water body index through wave band calculation, and selecting a water body index graph with the best extraction effect;
s4, performing radiometric calibration, terrain correction, geocoding and filtering preprocessing on the SAR observation image data of S1 to obtain a backscattering coefficient map of the ground objects. The purpose of radiometric calibration in this step is to convert the intensity values of the image into backscattering coefficients. The basic formula for radiometric calibration is:
σ0 = 10·lg(D²/K) + 10·lg(sin θ)
where σ0 is the backscattering coefficient, D is the DN value of the original image, K is the calibration constant of the SAR image, and θ is the incidence angle. Because SAR images in a side-looking geometry, phenomena such as foreshortening, shadow and layover occur, and these phenomena introduce errors into the water body extraction result. The purpose of geocoding is to transform the image data from the coordinate system of the SAR image (the slant-range coordinate system) to a more general reference coordinate system, typically a geographic coordinate system containing geographic coordinates.
S5, establishing a relationship between the water body index map determined in S3 and the backscattering coefficient map obtained in S4 by a sliding-window regression fitting method, and performing data fusion to obtain the water body information.
Preferably, the atmospheric correction in S2 adopts the FLAASH method.
Preferably, the water body indices in S3, namely the selected normalized difference water index NDWI, modified normalized difference water index MNDWI, and automated water extraction indices AWEInsh and AWEIsh, are each binarized using the Otsu threshold method; the extraction results are verified against Google Earth imagery, and the water body index map with the best extraction effect is selected.
Preferably, in S4 the RD model is selected for terrain correction and SRTM DEM data are used for geocoding, so that the SAR data are converted from the slant-range coordinate system to a geographic coordinate system and phenomena such as foreshortening and layover are removed. The speckle noise inherent in SAR images also strongly interferes with water body extraction; the method removes it with Lee filtering. To better distinguish water from non-water, the pixel values of the two preprocessed images are finally converted from a linear scale to a logarithmic scale (dB).
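The Lee filtering mentioned above can be sketched as follows. This is a minimal classic Lee filter on a single backscatter band, with a crude global noise estimate and an illustrative 7×7 window; it is not the exact implementation used by any particular SAR processing package:

```python
# Minimal sketch of a Lee speckle filter, assuming a square sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Classic Lee filter: blend each pixel with its local mean using an
    adaptive weight derived from local variance vs. estimated noise."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = np.maximum(sq_mean - mean * mean, 0.0)   # local variance
    noise_var = var.mean()                          # crude global noise estimate
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)
```

Homogeneous areas (low local variance) are pulled toward the window mean, while edges and strong scatterers (high variance) are mostly preserved.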
Preferably, ENVI or SNAP software can be selected for preprocessing the optical image data in S2, and one of SNAP, ENVI/Sarscape, PIE-SAR and PoLSAR software can be selected for preprocessing the SAR observation image data in S4.
Preferably, the method further comprises performing accuracy verification on the data fusion result of S5, the verification indices being the overall accuracy, the user's accuracy, the producer's accuracy and the Kappa coefficient.
Compared with the prior art, the invention has the beneficial effects that:
(1) The method is based on optical and SAR image data and uses a sliding-window regression fitting method to establish the correlation between the water body index and the backscattering coefficient and extract water body information. It effectively overcomes the weather limitation of optical images and the heavy speckle noise of SAR images in water extraction, so that both kinds of remote sensing data are used more fully.
(2) The invention can extract water body information with high accuracy and high spatio-temporal resolution and capture the long-time-series spatio-temporal variation of the water body. In addition, remote sensing images are large and cover wide areas, so large-scale water body information can be acquired efficiently and quickly at low cost. Grasping the spatio-temporal distribution and variation of the water body and establishing a water resource database provide a theoretical basis for water conservancy departments, resource management departments and others to use water resources effectively and plan them reasonably.
Drawings
FIG. 1 is a schematic flow chart of a method for extracting water body information through data fusion according to the present invention;
FIG. 2 is a water body information accuracy verification diagram extracted by the method of the present invention;
FIG. 3 is a graph of water results extracted using the method of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
The invention provides a method for extracting water body information through data fusion, which comprises the following specific steps:
s1, acquiring two optical image data covering the research area with numbers L1And L2The acquisition times are respectively marked as tL1And tL2Two Synthetic Aperture Radar (SAR) observation image data with serial numbers S1And S2The acquisition times are respectively tS1And tS2Wherein L is1、S1And S2As experimental data, L2As authentication data, tL1And tS1,tL2And tS2The smaller the time interval, the better. The weather condition of the optical image on the day needs to be sunny.
S2, preprocess the two optical images L1 and L2 with radiometric calibration, atmospheric correction and geometric correction. The invention adopts the FLAASH method for atmospheric correction. L1 is selected as the reference image and the other images are geometrically corrected against it; the control points used for geometric correction are uniformly distributed over the whole image, and the error is controlled within 0.5 pixel.
S3, calculate the water body indices. From the optical preprocessing result, obtain the surface reflectance value of each pixel and compute the water body indices by band calculation. The invention selects four widely used water body indices: the normalized difference water index NDWI, the modified normalized difference water index MNDWI, and the automated water extraction indices AWEInsh and AWEIsh, calculated respectively as:
NDWI = (G − NIR) / (G + NIR)
MNDWI = (G − MIR) / (G + MIR)
AWEInsh=4×(G-SWIR1)-(0.25×NIR+2.75×SWIR2)
AWEIsh=B+2.5×G-1.5×(NIR+SWIR1)-0.25×SWIR2
where G is the green band, B the blue band, NIR the near-infrared band, MIR the mid-infrared band, and SWIR1 and SWIR2 the two shortwave-infrared bands.
The Otsu threshold method is selected for binary classification, dividing the ground objects of the study area into water and non-water. The extraction results are overlaid on Google Earth imagery for verification, and the water body index map with the best extraction effect is selected.
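As an illustration, the index calculation and Otsu binarization can be sketched as below. The reflectance bands are synthetic stand-ins for real imagery, and the small Otsu implementation is a generic histogram version, not the exact routine of any particular software:

```python
# Sketch of water-index computation plus Otsu binary classification.
# The 'green' and 'nir' arrays are synthetic stand-ins for reflectance bands.
import numpy as np

def ndwi(green, nir):
    """Normalized difference water index: (G - NIR) / (G + NIR)."""
    return (green - nir) / (green + nir + 1e-12)

def otsu_threshold(values, nbins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values.ravel(), bins=nbins)
    prob = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(prob)                      # weight of class below threshold
    w1 = 1.0 - w0                             # weight of class above threshold
    cum_mean = np.cumsum(prob * centers)
    mu0 = cum_mean / np.maximum(w0, 1e-12)
    mu1 = (cum_mean[-1] - cum_mean) / np.maximum(w1, 1e-12)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

rng = np.random.default_rng(0)
green = rng.uniform(0.05, 0.40, (50, 50))     # synthetic reflectance bands
nir = rng.uniform(0.00, 0.50, (50, 50))
index = ndwi(green, nir)
water_mask = index > otsu_threshold(index)    # True = water, False = non-water
```

The same binarization is applied to each candidate index map before comparing their extraction accuracy.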
S4, preprocess the SAR images S1 and S2 with radiometric calibration, terrain correction, geocoding and filtering. Radiometric calibration converts the intensity values of the image into backscattering coefficients. The basic formula for radiometric calibration is:
σ0 = 10·lg(D²/K) + 10·lg(sin θ)
where σ0 is the backscattering coefficient, D is the DN value of the original image, K is the calibration constant of the SAR image, and θ is the incidence angle. Because SAR images in a side-looking geometry, phenomena such as foreshortening, layover and shadow occur, and these introduce errors into the water extraction result. The invention selects the RD (Range-Doppler) model for terrain correction. The purpose of geocoding is to transform the image data from the coordinate system of radar imaging (the slant-range coordinate system) to a more general reference coordinate system, typically a geographic coordinate system containing geographic coordinates. The invention uses SRTM DEM data for terrain-corrected geocoding, which both converts the SAR data from the slant-range coordinate system to a geographic coordinate system and removes foreshortening and layover. The speckle noise inherent in SAR images also strongly interferes with water body extraction; the filtering algorithm adopted here is Lee filtering. To better distinguish water from non-water, the pixel values of the two images after radiometric calibration, terrain correction, geocoding and filtering are converted from a linear scale to a logarithmic scale (dB).
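A hedged sketch of the calibration and linear-to-dB conversion: the formula form σ0 = 10·lg(D²/K) + 10·lg(sin θ) is one common convention, and the calibration constant and incidence angle below are illustrative, not values taken from any actual SAR product:

```python
# Sketch of DN -> backscattering coefficient (dB), assuming the classic form
# sigma0(dB) = 10*lg(D^2 / K) + 10*lg(sin(theta)).
# K and the incidence angle are illustrative placeholders.
import numpy as np

def dn_to_sigma0_db(dn, k, incidence_deg):
    """Convert DN values to backscattering coefficients in dB."""
    d = np.asarray(dn, dtype=np.float64)
    theta = np.deg2rad(incidence_deg)
    return 10.0 * np.log10(d * d / k) + 10.0 * np.log10(np.sin(theta))

sigma0_db = dn_to_sigma0_db([100.0, 200.0], k=1000.0, incidence_deg=90.0)
print(sigma0_db[0])  # 10*lg(10000/1000) + 10*lg(1) = 10.0
```

Real products supply K (or a calibration look-up table) and a per-pixel incidence angle in their annotation files.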
S5, fuse the water body index map of the optical image with the backscattering coefficient map of the SAR image. First, the backscattering coefficient maps obtained from S1 and S2 are resampled by the nearest-neighbor method so that the two data sets have the same spatial resolution. The water body index map extracted from L1 is used as the dependent-variable layer Y, and the backscattering coefficients of S1 under different polarization modes, together with derived values obtained by correlation calculation (products, squares, etc.), form n layers used as the independent-variable layers X (X1, X2, … Xn). The invention uses a regression fitting method with a sliding window of size (2m+1) × (2m+1), m ≥ 1, and establishes a linear relationship between Y and each of the independent variables X1, X2, … Xn. The specific process is as follows:
In the jth sliding window, the dependent-variable block Yj is:
Yj = [ y(p,q) ],  p, q = 1, …, 2m+1  (the (2m+1)×(2m+1) block of Y values in the window)
In the jth sliding window of the independent-variable layers X1, X2, … Xn, the corresponding independent-variable block Xij (1 ≤ i ≤ n) is:
Xij = [ xi(p,q) ],  p, q = 1, …, 2m+1
Yj and Xij are flattened into column vectors yj and xij respectively:

yj = ( y(1,1), y(1,2), …, y(2m+1,2m+1) )^T,  xij = ( xi(1,1), xi(1,2), …, xi(2m+1,2m+1) )^T
assuming that k is a coefficient layer and C is a constant term layer, according to a fitting equation:
yj = k1j·x1j + k2j·x2j + … + knj·xnj + Cj
Finally, for each independent variable a coefficient layer is obtained whose values sit at the window centres; the coefficient of the ith independent variable fitted in the jth window is kij, and the ith coefficient layer is assembled as:

ki = ( ki1, ki2, …, kij, … )
Similarly, the constant-term layer holds one fitted constant per window centre, the value for the jth window being Cj:

C = ( C1, C2, …, Cj, … )
Finally, n coefficient layers k (k1, k2, … kn) and n constant-term layers C (C1, C2, … Cn) are obtained, and the coefficient of determination R² layer and the significance p layer of the fitting equation in each window can be calculated to evaluate the goodness of fit of the equation;
The backscattering coefficients of S2 under different polarization modes, together with the corresponding derived values, form n layers used as the input independent-variable layers X′ (X1′, X2′, … Xn′), and the formula is:
Y′=kX′+C
and calculating to obtain a fitting value Y'.
S6, verify the accuracy of the fusion result. The fitted Y′ is binarized with the Otsu threshold method, dividing the ground objects into water and non-water. Based on Google Earth imagery, accuracy verification is performed on the water extraction result of the optical image L2 and on that of the fused image Y′; taking the water-index extraction result of L2 as the reference, the fusion result is verified with the following indices: overall accuracy, user's accuracy, producer's accuracy and the Kappa coefficient.
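The four verification indices can be computed from a 2×2 confusion matrix. The sketch below uses toy water masks; "producer's accuracy" here corresponds to what the translation calls "product precision":

```python
# Sketch of the verification metrics: overall, user's and producer's
# accuracy for the water class, plus the Kappa coefficient.
import numpy as np

def accuracy_metrics(reference, predicted):
    """Compute metrics from binary water / non-water masks."""
    ref = np.asarray(reference).ravel().astype(bool)
    pred = np.asarray(predicted).ravel().astype(bool)
    tp = np.sum(ref & pred)        # water mapped as water
    tn = np.sum(~ref & ~pred)      # non-water mapped as non-water
    fp = np.sum(~ref & pred)       # commission error
    fn = np.sum(ref & ~pred)       # omission error
    n = tp + tn + fp + fn
    overall = (tp + tn) / n
    users = tp / (tp + fp)         # of mapped water, fraction truly water
    producers = tp / (tp + fn)     # of true water, fraction mapped
    # expected chance agreement for Kappa
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    kappa = (overall - pe) / (1.0 - pe)
    return overall, users, producers, kappa

o, u, p, kp = accuracy_metrics([1, 1, 0, 0], [1, 0, 0, 0])
# o = 0.75, u = 1.0, p = 0.5, kp = 0.5
```

In practice the reference mask comes from the Google Earth interpretation or the L2 water-index result, sampled at validation points.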
It should be noted that the optical images may be processed with either ENVI or SNAP, and the radar images with one of the commonly used packages SNAP, ENVI/SARscape, PIE-SAR, PolSAR, etc. The calculations of steps S2, S3 and S4 above can be carried out in software such as ENVI, SNAP or PIE-SAR, and the calculation of S5 in ArcGIS, Matlab, Python, C++, etc.
The feasibility of the invention is further proved by applying the method provided by the invention to practical cases.
According to the scheme, two scenes of Landsat 8 OLI optical image data (acquired 2019-07-30 and 2019-08-15) and two scenes of Sentinel-1 SAR image data (acquired 2019-08-02 and 2019-08-14) covering the Zhangkou area were obtained. Using the data fusion method of the invention, the two data sets were fused to obtain high-accuracy water body information. The main steps are as follows:
the method comprises the following steps: and preprocessing the Landsat8OIL data by utilizing ENVI software, wherein the preprocessing process comprises radiometric calibration, atmospheric correction and geometric correction. Carrying out radiation calibration by using a calibration file carried by Landsat 8; atmospheric correction and selection of FLAASH tools; and selecting control points for geometric correction, wherein the control points are uniformly distributed on the image, and the correction error is controlled in 0.5 pixel. Through data preprocessing, Landsat8 surface reflectivity data are obtained.
Step two: perform band calculation on the Landsat 8 surface reflectance data to obtain the water body index images (NDWI, MNDWI, AWEInsh and AWEIsh). In Matlab, the four water body index images are binarized with the Otsu threshold method, dividing the ground objects into water and non-water. Accuracy verification of the classification results against Google Earth imagery shows that MNDWI gives the highest water extraction accuracy.
Step three: preprocess the Sentinel-1 data with SNAP software; the preprocessing comprises radiometric calibration, terrain correction, geocoding and filtering. The calibration parameters are taken from the Sentinel-1 data, the RD (Range-Doppler) model is selected for terrain correction with SRTM DEM as the terrain-correction data, Lee filtering (filter window: 7 × 7) is selected for filtering, and finally the backscattering coefficient values are converted from linear scale to dB. The data preprocessing finally yields backscattering coefficient maps in the VV and VH polarization modes (numbered VV and VH), with a spatial resolution of 20 m.
Step four: resample the backscattering coefficient maps under VV and VH polarization by the nearest-neighbor method in ArcGIS software; the resolution after resampling is 30 m. By raster calculation, VV × VV, VH × VH and VV × VH are computed, yielding three images numbered VVVV, VHVH and VVVH.
The MNDWI water body index map of 15 August 2019 is used as the dependent variable Y, and the VV, VH, VVVV, VHVH and VVVH layers of 14 August 2019 as the independent variables X. Following the sliding-window regression fitting method of the invention, Y and X are regression-fitted under different sliding windows, finally yielding the coefficient layers, the constant-term layers, the coefficient of determination R² layer and the significance p layer. The VV, VH, VVVV, VHVH and VVVH layers of 2 August 2019 are used as the input independent variables X′, and the obtained coefficient and constant-term layers are applied through Y′ = kX′ + C, finally producing the data fusion result Y′ for 2 August 2019. The fusion result is binarized with the Otsu threshold method and verified for accuracy against Google Earth imagery. In addition, it is compared with the MNDWI water body index map of 30 July 2019 and evaluated by overall accuracy, user's accuracy, producer's accuracy and the Kappa coefficient. The size of the sliding window is adjusted through this accuracy evaluation until the accuracy reaches its maximum.
The above description covers only a preferred embodiment of the invention, but the scope of the invention is not limited thereto; any equivalent substitution or modification that a person skilled in the art could readily conceive within the technical scope disclosed by the invention, according to its technical solution and inventive concept, shall fall within the scope of the invention.

Claims (6)

1. A method for extracting water body information through data fusion is characterized by comprising the following steps:
s1, acquiring optical image data and SAR observation image data covering the study area: two scenes of optical image data covering the study area, numbered L1 and L2, with acquisition times tL1 and tL2 respectively, and two scenes of SAR observation image data, numbered S1 and S2, with acquisition times tS1 and tS2 respectively;
S2, performing data preprocessing of radiometric calibration, atmospheric correction and geometric correction on the optical image data in the S1;
s3, acquiring the surface reflectance value of each pixel based on the preprocessing result of S2, obtaining water body indices through band calculation, and selecting the water body index map with the best extraction effect;
s4, carrying out radiometric calibration, terrain correction, geocoding and filtering pretreatment on the SAR observation image data in the S1 to obtain a backscattering coefficient map of the ground object;
s5, establishing a relationship between the water body index map determined in S3 and the backscattering coefficient map obtained in S4 by a sliding-window regression fitting method, and performing data fusion to obtain the water body information;
first, the backscattering coefficient maps obtained from S1 and S2 are resampled by the nearest-neighbor method so that the two data sets have the same spatial resolution; the water body index map extracted from L1 is used as the dependent-variable layer Y, and the backscattering coefficients of S1 under different polarization modes, together with the values derived from them by multiplication or squaring, form n layers used as the independent-variable layers X (X1, X2, … Xn); the regression fitting method is adopted with a sliding window of size (2m+1) × (2m+1), m ≥ 1; a linear relationship is established between Y and each of the independent variables X1, X2, … Xn; the specific process is as follows:
in the jth sliding window, the dependent-variable block Yj is:
Yj = [ y(p,q) ],  p, q = 1, …, 2m+1  (the (2m+1)×(2m+1) block of Y values in the window)
in the jth sliding window of the independent-variable layers X1, X2, … Xn, the corresponding independent-variable block Xij (1 ≤ i ≤ n) is:
Xij = [ xi(p,q) ],  p, q = 1, …, 2m+1
Yj and Xij are flattened into column vectors yj and xij respectively:

yj = ( y(1,1), y(1,2), …, y(2m+1,2m+1) )^T,  xij = ( xi(1,1), xi(1,2), …, xi(2m+1,2m+1) )^T
assuming that k is a coefficient layer and C is a constant term layer, according to a fitting equation:
yj = k1j·x1j + k2j·x2j + … + knj·xnj + Cj
finally, for each independent variable a coefficient layer is obtained whose values sit at the window centres; the coefficient of the ith independent variable fitted in the jth window is kij, and the ith coefficient layer is assembled as:

ki = ( ki1, ki2, …, kij, … )
similarly, the constant-term layer holds one fitted constant per window centre, the value for the jth window being Cj:

C = ( C1, C2, …, Cj, … )
finally obtaining n coefficient layers k (k1, k2, … kn) and n constant-term layers C (C1, C2, … Cn), and calculating for each window the coefficient of determination R² layer and the significance p layer of the fitting equation to evaluate the goodness of fit of the equation;
the backscattering coefficients of S2 under different polarization modes, together with the corresponding derived values, form n layers used as the input independent-variable layers X′ (X1′, X2′, … Xn′), and the formula is:
Y′=kX′+C
the fitted value Y′ is then obtained by calculation;
performing accuracy verification on the fusion result: the fitted Y′ is binarized with the Otsu threshold method, dividing the ground objects into water and non-water; based on Google Earth imagery, accuracy verification is performed on the water extraction result of the optical image L2 and on that of the fused image Y′.
2. The method for extracting water body information through data fusion as claimed in claim 1, wherein the atmosphere correction preprocessing in the S2 adopts a FLAASH method.
3. The method for extracting water body information through data fusion according to claim 1, wherein for the water body indices in S3 the Otsu threshold method is used to binarize the selected normalized difference water index NDWI, modified normalized difference water index MNDWI, and automated water extraction indices AWEInsh and AWEIsh; the extraction results are verified against Google Earth imagery, and the water body index map with the best extraction effect is selected.
4. The method for extracting water body information through data fusion according to claim 1, wherein in step S4 the RD model is selected for terrain correction, SRTM DEM data are selected for geocoding, and Lee filtering is adopted to suppress speckle noise.
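Claim 4 names Lee filtering for speckle suppression. The basic (non-refined) Lee filter blends each pixel with its local mean, weighted by the ratio of local signal variance to total variance. A numpy-only sketch follows; `lee_filter` is an illustrative helper with a crude global noise-variance estimate, not the patent's implementation (SAR toolboxes such as SNAP provide production versions).

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lee_filter(img, size=5, noise_var=None):
    """Basic Lee speckle filter: out = mean + w * (img - mean),
    with w = var / (var + noise_var) computed per pixel."""
    img = img.astype(float)
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    windows = sliding_window_view(padded, (size, size))  # (H, W, size, size)
    mean = windows.mean(axis=(-2, -1))
    var = windows.var(axis=(-2, -1))
    if noise_var is None:
        noise_var = float(np.mean(var))  # crude global estimate
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)
```

Homogeneous areas (low local variance) are pushed toward the local mean, while edges (high local variance) keep their original values, which is why Lee filtering reduces speckle without badly blurring water boundaries.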
5. The method of claim 1, wherein ENVI or SNAP software is selected for preprocessing the optical image data in S2, and one of SNAP, ENVI/SARscape, PIE-SAR and PolSAR software is selected for preprocessing the SAR observation image data in S4.
6. The method for extracting water body information through data fusion as claimed in claim 1, further comprising performing accuracy verification on the data fusion result in S5, wherein the verified indexes are the overall accuracy, the user's accuracy, the producer's accuracy and the Kappa coefficient, respectively.
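The four indexes in claim 6 are the usual confusion-matrix statistics. A small sketch for the binary water / non-water case (labels 0 = non-water, 1 = water; `accuracy_report` is an illustrative name): rows of the matrix hold reference labels and columns hold mapped labels, so user's accuracy divides the diagonal by column totals and producer's accuracy by row totals.

```python
import numpy as np

def accuracy_report(y_true, y_pred):
    """Overall accuracy, user's accuracy, producer's accuracy and Kappa
    from a 2-class confusion matrix (rows: reference, cols: mapped)."""
    cm = np.zeros((2, 2), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    n = cm.sum()
    oa = np.trace(cm) / n
    users = np.diag(cm) / cm.sum(axis=0)       # commission, per mapped class
    producers = np.diag(cm) / cm.sum(axis=1)   # omission, per reference class
    pe = float((cm.sum(axis=0) * cm.sum(axis=1)).sum()) / n ** 2  # chance agreement
    kappa = (oa - pe) / (1.0 - pe)
    return oa, users, producers, kappa
```

Kappa discounts the agreement expected by chance (pe), so it is more conservative than overall accuracy when one class dominates the scene.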
CN202011201586.XA 2020-11-02 2020-11-02 Method for extracting water body information through data fusion Active CN112364289B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011201586.XA CN112364289B (en) 2020-11-02 2020-11-02 Method for extracting water body information through data fusion

Publications (2)

Publication Number Publication Date
CN112364289A CN112364289A (en) 2021-02-12
CN112364289B true CN112364289B (en) 2021-08-13

Family

ID=74514175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011201586.XA Active CN112364289B (en) 2020-11-02 2020-11-02 Method for extracting water body information through data fusion

Country Status (1)

Country Link
CN (1) CN112364289B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113129318B (en) * 2021-04-25 2021-11-16 水利部信息中心 Method for calculating water storage capacity of stagnant flood area by utilizing SAR (synthetic aperture radar) image

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109087378A (en) * 2018-09-11 2018-12-25 首都师范大学 Image processing method and system

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
GB201601660D0 (en) * 2016-01-29 2016-03-16 Global Surface Intelligence Ltd System and method for earth observation and analysis
CN107862255B (en) * 2017-10-23 2020-12-22 交通运输部科学研究院 Wetland information extraction and ecological sensitivity evaluation method based on microwave remote sensing and optical remote sensing technology
CN108303044B (en) * 2018-02-01 2020-02-21 苏州市农业科学院 Leaf area index obtaining method and system
CN108613933A (en) * 2018-06-13 2018-10-02 中南林业科技大学 Forest land arid space-time dynamic monitoring method based on multi-sources RS data fusion
CN109300133B (en) * 2018-11-19 2020-10-23 珠江水利委员会珠江水利科学研究院 Urban river network area water body extraction method
CN110097101B (en) * 2019-04-19 2022-09-13 大连海事大学 Remote sensing image fusion and coastal zone classification method based on improved reliability factor
CN111475950B (en) * 2020-04-09 2022-11-29 首都师范大学 Method for simulating rainfall flood of concave overpass


Similar Documents

Publication Publication Date Title
CN109581372B (en) Ecological environment remote sensing monitoring method
CN102565778B (en) Relative radiometric correction method for automatically extracting pseudo-invariant features for remote sensing image
Marquez et al. Intra-hour DNI forecasting based on cloud tracking image analysis
CN109993237B (en) Water body rapid extraction method and system based on high-resolution satellite optical remote sensing data
CN109974665B (en) Aerosol remote sensing inversion method and system for short-wave infrared data lack
CN111368817B (en) Method and system for quantitatively evaluating thermal effect based on earth surface type
CN111024618A (en) Water quality health monitoring method and device based on remote sensing image and storage medium
CN108256186B (en) Pixel-by-pixel atmospheric correction method for online calculation lookup table
CN113205475A (en) Forest height inversion method based on multi-source satellite remote sensing data
CN114022783A (en) Satellite image-based water and soil conservation ecological function remote sensing monitoring method and device
CN114092835B (en) Normalized vegetation index data space-time fusion method based on different space-time resolutions
CN103544477A (en) Improved linear spectral mixture model based vegetation coverage estimation method
CN112733596A (en) Forest resource change monitoring method based on medium and high spatial resolution remote sensing image fusion and application
CN113744249B (en) Marine ecological environment damage investigation method
CN110569797A (en) earth stationary orbit satellite image forest fire detection method, system and storage medium thereof
CN114564767A (en) Under-cloud surface temperature estimation method based on sun-cloud-satellite observation geometry
CN112329790B (en) Quick extraction method for urban impervious surface information
CN116519557B (en) Aerosol optical thickness inversion method
CN109671038A (en) One kind is based on the classified and layered relative radiometric correction method of pseudo- invariant features point
CN112364289B (en) Method for extracting water body information through data fusion
CN114112906B (en) Water body feature extraction system based on unmanned aerial vehicle low altitude remote sensing and local topography
Zhang et al. A back propagation neural network-based radiometric correction method (BPNNRCM) for UAV multispectral image
CN110909821B (en) Method for carrying out high-space-time resolution vegetation index data fusion based on crop reference curve
CN115546658B (en) Night cloud detection method combining quality improvement and CNN improvement of data set
CN116682024A (en) Rapid cloud detection method based on four-band remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant