CN115761480A - Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar - Google Patents


Info

Publication number
CN115761480A
Authority
CN
China
Prior art keywords
alpha
area
image
burned area
burned
Prior art date
Legal status
Pending
Application number
CN202211339312.6A
Other languages
Chinese (zh)
Inventor
张瑞
沙马阿各
展润青
王婷
包馨
欧阳晓莹
胡金龙
庞家泰
段金亮
洪瑞凯
Current Assignee
Southwest Jiaotong University
Original Assignee
Southwest Jiaotong University
Priority date
Filing date
Publication date
Application filed by Southwest Jiaotong University
Priority to CN202211339312.6A
Publication of CN115761480A

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention relates to the technical field of burned area extraction, in particular to a burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar, which comprises the following steps. Step S1: performing radiometric calibration, multilooking, image registration, refined Lee filtering and geocoding preprocessing on single-look complex (SLC) image data. Step S2: computing the backscattering coefficient ratio, H-A-Alpha decomposition and gray-level co-occurrence matrix from the pre- and post-fire SLC data sets to obtain data sets reflecting changes in backscattering intensity, polarization decomposition and texture features, respectively. Step S3: selecting a random forest as the machine learning model and extracting the burned area by combining the various types of feature change parameters of step S2. Step S4: evaluating the accuracy of the polarimetric SAR burned-area extraction by taking the optically extracted burned area as the reference standard. The invention can better extract the burned area.

Description

Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar
Technical Field
The invention relates to the technical field of burned area extraction, in particular to a burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar.
Background
Nature produces many kinds of disasters, and fire is among the most destructive to forest ecosystems. The hazards of forest fires include burning trees, burning understory plant resources, harming wild animals, causing soil erosion, causing air pollution and threatening the safety of people's lives and property. The traditional field-survey method for measuring forest fire burned areas involves a heavy workload and high consumption of manpower and material resources; moreover, some forest areas have complex terrain, are difficult to reach and have poor transport and communication conditions. The traditional measuring method is therefore unsuitable for extracting and locating fire areas in large forest regions. Satellite remote sensing senses objects from a distance, enables large-scale macroscopic monitoring of the ground and can extract forest areas damaged by fire over wide regions.
For forest fire monitoring based on optical remote sensing, the image data are often disturbed by cloud and smoke cover during acquisition. Microwave remote sensing, by contrast, has wavelengths long enough to penetrate cloud layers and smoke and reach the ground surface, providing reliable raw data for forest fire monitoring. In addition, Synthetic Aperture Radar (SAR) image data have high spatial resolution, which facilitates the identification and extraction of burned areas. Previous studies have demonstrated the feasibility of using radar measurements to estimate forest fire effects. SAR image data therefore have great potential for forest fire monitoring and fire-area mapping.
A series of preliminary verification studies on polarization decomposition coefficients, interferometric coherence, backscattering intensity and the like has shown that SAR backscatter sensitivity is an important factor affecting image classification. However, SAR backscattering intensity is a composite reflection of parameters such as soil moisture, biomass, roughness and dielectric constant; extracting the burned area from a single type of backscattering intensity change feature alone is easily affected by various environmental factors and has certain limitations. Combining radar waves of different polarization modes for comprehensive analysis allows the attribute changes of forest land before and after a fire to be judged more accurately.
Disclosure of Invention
The invention provides a burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar, addressing the fact that optical satellite data sources are easily affected by objective environmental factors such as cloud, which reduces the timeliness of burned-area extraction. The method takes multi-polarization SAR images, which are unaffected by cloud and fog, as the data source, fully combines their scattering intensity, polarization decomposition and texture feature change parameters as the input features of the random forest (RF), and finally realizes the extraction of the burned area.
The burned area extraction method based on the remote sensing image of the dual-polarized synthetic aperture radar comprises the following steps:
step S1: performing radiometric calibration, multilooking, image registration, refined Lee filtering and geocoding preprocessing on single-look complex (SLC) image data;
step S2: computing the backscattering coefficient ratio, H-A-Alpha decomposition and gray-level co-occurrence matrix from the pre- and post-fire SLC data sets to obtain data sets reflecting changes in backscattering intensity, polarization decomposition and texture features, respectively;
step S3: selecting a random forest as the machine learning model and extracting the burned area by combining the various types of feature change parameters of step S2;
step S4: evaluating the accuracy of the polarimetric SAR burned-area extraction by taking the optically extracted burned area as the reference standard.
Preferably, the specific steps of step S2 are:
step S21: the burned area is highlighted using the radar burn ratio RBR, i.e. the ratio of the backscattering coefficients before and after the fire, calculated as follows:
RBR_xy = γ_xy^post / γ_xy^pre        (1)
where xy denotes the polarization mode, vertical polarization VV or cross polarization VH; γ_pre denotes the Sigma-naught or Gamma-naught value of the normalized backscattering coefficient before the fire; and γ_post denotes the corresponding Sigma-naught or Gamma-naught value after the fire;
step S22: the dual-polarization H-A-Alpha decomposition method defines the entropy H, the scattering angle α and the anisotropy A from the eigenvalues and eigenvectors of the covariance matrix of the SAR image in order to identify the scattering mechanisms of different ground objects; the entropy H, scattering angle α and anisotropy A are calculated as follows:
H = −Σ_{i=1}^{2} p_i log_2 p_i,  with p_i = λ_i / (λ_1 + λ_2)        (2)

α = Σ_{i=1}^{2} p_i α_i        (3)

A = (λ_1 − λ_2) / (λ_1 + λ_2)        (4)
where λ_i is an eigenvalue of the polarimetric covariance matrix C_2, and α_i is the scattering angle corresponding to the i-th eigenvector of C_2;
a normalized alpha index NDαI is then calculated from the scattering angle α as follows:
NDαI = (α_post − α_pre) / (α_post + α_pre)        (5)
where α_pre denotes the scattering angle α before the fire and α_post denotes the scattering angle α after the fire;
a difference image is created from the parameters extracted from the pre- and post-fire polarimetric SAR images, calculated as follows:
p_c = (p_post − p_pre) / (p_post + p_pre)        (6)
where p_c denotes the change value of the parameter, and p_pre and p_post denote the parameter values before and after the fire, respectively;
step S23: the texture analysis method based on the gray-level co-occurrence matrix GLCM selects 8 features: mean, contrast, variance, dissimilarity, homogeneity, correlation, information entropy and second moment; the difference images of the 8 texture features are calculated with formula (6), respectively, to improve the accuracy of burned-area identification.
Preferably, the specific steps of step S3 are:
step S31: the random forest is a bagging method using CART decision trees as base learners, and its prediction is based on the mean regression of the decision trees, with the following mathematical principle:
h̄(x) = (1/T) Σ_{t=1}^{T} h(x, θ_t)        (7)
where x is the input variable; θ_t are independent, identically distributed random variables; T is the number of decision trees; and h(x, θ_t) is the output of a single tree for x and θ_t;
step S32: the feature set formed in step S2 is used as the input features of the random forest RF; training and test data set labels are selected from 10 m Sentinel-2 data and are composed of burned-area pixels and unburned-area pixels; the ratio of training samples to test samples is set to 70:30.
Preferably, the specific steps of step S4 are:
step S41: using the burned area extracted by the Sentinel-2 as reference data to evaluate the accuracy of SAR image burned area extraction;
step S42: count the burned area extracted in the experimental results and calculate the extraction accuracy Extraction Accuracy_#i, the wrong-extraction (commission) error Commission Errors_#i and the missed-extraction (omission) error Omission Errors_#i, using the following formulas:
Extraction Accuracy_#i = Agreement_#i / (Agreement_#i + S2 Only_#i) × 100%        (8)

Commission Errors_#i = S1 Only_#i / (Agreement_#i + S1 Only_#i) × 100%        (9)

Omission Errors_#i = S2 Only_#i / (Agreement_#i + S2 Only_#i) × 100%        (10)
where #i denotes a given SAR feature combination and #1 denotes the feature combination of the optical image; Agreement denotes the area where the burned area extracted from the SAR image agrees with that extracted from the optical image; S1 Only denotes the wrongly extracted (commission) area of the SAR image relative to the optically identified burned area; and S2 Only denotes the missed (omission) area of the SAR image relative to the optically identified burned area.
The beneficial effects of the invention are:
the SAR image which is not influenced by cloud and mist is taken as a data source. In order to enhance the variation of the scattering characteristics before and after the fire, using the difference parameters of a single scene obtained before and after the fire; and the scattering intensity, polarization decomposition and texture characteristic variation parameters are fully combined to be used as input characteristics of random forest RF, and finally, the extraction of the burned land is realized. The accuracy evaluation of SAR image extraction of the burned area is carried out by using the burned area extracted by Sentiel-2 as reference data, and the result shows that: the extraction precision is 87.12%, the wrong extraction error and the missed extraction error are 20.44% and 12.88% respectively, the method is basically consistent with the Sentinel-2A extraction result, and the method and the experimental result can provide references for accurate extraction of burned areas, post-disaster evaluation and other researches.
Drawings
FIG. 1 is a flow chart of the burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar in the embodiment;
FIG. 2 is a schematic diagram of the accuracy evaluation of the burned area extraction result in the example.
Detailed Description
For a further understanding of the invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings and examples. It is to be understood that the examples are illustrative of the invention and not limiting.
Examples
Fig. 1 shows the overall architecture of the present invention. The architecture is divided into four parts, S1, S2, S3 and S4, performed in sequence; each part has a separate output that serves as the input of the next part. The embodiment provides a burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar, which comprises the following steps:
step S1: to suppress the speckle noise inherent in the polarimetric SAR system, data preprocessing including radiometric calibration, multilooking, image registration, refined Lee filtering and geocoding is performed on the single-look complex (SLC) image data.
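The preprocessing steps above are normally run in dedicated SAR software; as a rough, assumed illustration of only the multilooking and dB-conversion stages, a minimal numpy sketch (function names and look factors are hypothetical, not taken from the patent):

```python
import numpy as np

def multilook(intensity, looks_az=4, looks_rg=1):
    """Boxcar multilooking: average non-overlapping blocks of a
    single-look intensity image to reduce speckle (illustrative
    look factors; real chains use dedicated SAR processors)."""
    rows = (intensity.shape[0] // looks_az) * looks_az
    cols = (intensity.shape[1] // looks_rg) * looks_rg
    blocks = intensity[:rows, :cols].reshape(
        rows // looks_az, looks_az, cols // looks_rg, looks_rg)
    return blocks.mean(axis=(1, 3))

def to_db(sigma0, floor=1e-10):
    """Convert calibrated linear backscatter to decibels."""
    return 10.0 * np.log10(np.maximum(sigma0, floor))
```

Refined Lee filtering, co-registration and geocoding are omitted here; they follow the same array-in/array-out pattern.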
Step S2: the backscattering coefficient ratio, the H-A-Alpha decomposition and the gray-level co-occurrence matrix are computed from the pre- and post-fire SLC data sets to obtain data sets reflecting changes in backscattering intensity, polarization decomposition and texture features, respectively. The specific steps of step S2 are as follows:
step S21: before a fire, the radar signal mainly comes from the vegetation canopy. After a forest fire burns away the canopy, the understory and the surface vegetation in succession, the radar signal reaches the ground, interacts with the soil surface and is scattered back; soil moisture then becomes the main factor affecting the radar backscatter measurement, with the backscatter intensity increasing when soil moisture rises after the fire and decreasing when it falls. To better identify the burned area, the invention uses the radar burn ratio (RBR), i.e. the ratio of the pre- and post-fire backscattering coefficients, to highlight the burned area; the calculation formula is as follows:
RBR_xy = γ_xy^post / γ_xy^pre        (1)
where xy denotes the polarization mode, vertical polarization VV or cross polarization VH; γ_pre denotes the Sigma-naught or Gamma-naught value of the normalized backscattering coefficient before the fire; and γ_post denotes the corresponding Sigma-naught or Gamma-naught value after the fire.
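A minimal sketch of the radar burn ratio of step S21, assuming linear-scale backscatter arrays; the ratio direction (post over pre) and the function name are assumptions, since the patent only states that RBR is the ratio of the pre- and post-fire backscattering coefficients:

```python
import numpy as np

def radar_burn_ratio(gamma_pre, gamma_post, eps=1e-10):
    """Per-pixel ratio of post- to pre-fire backscatter (linear units).
    Either ratio direction works in practice: fire-induced change
    shows up as a departure from unity (e.g. VH typically drops)."""
    pre = np.asarray(gamma_pre, dtype=float)
    post = np.asarray(gamma_post, dtype=float)
    return post / (pre + eps)
```

A pixel whose VH backscatter halves after the fire yields RBR ≈ 0.5, while an unchanged pixel stays near 1.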
Step S22: forest fires burn off the forest canopy and destroy the original vegetation structure, changing the backscattering mechanism of the burned area. Before burning, target scattering echoes mainly come from the vegetation canopy and appear as volume scattering; after burning, the volume scattering components from the canopy and understory foliage are greatly reduced while the scattering components from the soil surface increase, so surface scattering rises significantly. Polarimetric target decomposition is the main method for extracting polarimetric scattering characteristics and can identify different types of scattering mechanisms with actual physical meaning. The dual-polarization H-A-Alpha decomposition method identifies the scattering mechanisms of different ground objects from polarimetric parameters such as the eigenvalues and eigenvectors of the covariance matrix of the SAR image and the entropy H, scattering angle α and anisotropy A defined from them. The entropy H, scattering angle α and anisotropy A are calculated as follows:
H = −Σ_{i=1}^{2} p_i log_2 p_i,  with p_i = λ_i / (λ_1 + λ_2)        (2)

α = Σ_{i=1}^{2} p_i α_i        (3)

A = (λ_1 − λ_2) / (λ_1 + λ_2)        (4)
where λ_i is an eigenvalue of the polarimetric covariance matrix C_2; p_i is the pseudo-probability of the i-th eigenvalue, p_i = λ_i / Σ_k λ_k, with λ_k the k-th eigenvalue of C_2; and α_i is the scattering angle corresponding to the i-th eigenvector of C_2.
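The eigenvalue-based H-A-α computation of step S22 can be sketched as follows for a single 2×2 covariance matrix C_2; the per-pixel looping, the exact α convention (arccos of the leading eigenvector component) and the function name are assumptions not spelled out in the patent:

```python
import numpy as np

def h_a_alpha_dual_pol(c2):
    """Entropy H, mean scattering angle alpha (degrees) and anisotropy A
    from one 2x2 dual-pol covariance matrix via eigen-decomposition."""
    vals, vecs = np.linalg.eigh(c2)            # ascending eigenvalues
    vals = vals[::-1]                          # sort descending
    vecs = vecs[:, ::-1]
    p = vals / vals.sum()                      # pseudo-probabilities p_i
    h = -np.sum(p * np.log2(p + 1e-12))        # entropy, in [0, 1] for 2 eigenvalues
    alpha_i = np.degrees(np.arccos(np.abs(vecs[0, :])))  # angle per eigenvector
    alpha = np.sum(p * alpha_i)                # mean scattering angle
    a = (vals[0] - vals[1]) / (vals[0] + vals[1])        # anisotropy
    return h, alpha, a
```

For an image, this would be applied to the C_2 matrix estimated in a moving window around each pixel.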
To reduce the misidentification of bare areas (e.g., arable land) as burned areas, the scattering angle α is used to calculate a normalized α index (NDαI), computed as follows:
NDαI = (α_post − α_pre) / (α_post + α_pre)        (5)
where α_pre denotes the scattering angle α before the fire and α_post denotes the scattering angle α after the fire.
To reduce the influence of noise, a difference image is created from the parameters extracted from the pre- and post-fire polarimetric SAR images, calculated as follows:
p_c = (p_post − p_pre) / (p_post + p_pre)        (6)
where p_c denotes the change value of the parameter, and p_pre and p_post denote the parameter values before and after the fire, respectively.
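The NDαI of formula (5) and the generic change image of formula (6) share the same normalized-difference form, which can be sketched as (function name assumed):

```python
import numpy as np

def normalized_difference(p_pre, p_post, eps=1e-10):
    """Normalized pre/post-fire change of any per-pixel parameter,
    bounded in [-1, 1]; applied to the scattering angle it gives NDaI."""
    pre = np.asarray(p_pre, dtype=float)
    post = np.asarray(p_post, dtype=float)
    return (post - pre) / (post + pre + eps)
```

The bounded range makes the change maps of different parameters directly comparable as random-forest input features.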
Step S23: according to the imaging characteristics of the SAR image, its gray level reflects the backscattering behavior of ground objects towards radar waves. If different ground objects have the same or similar backscattering coefficients, they appear with the same or similar gray values in the SAR image, producing a rather complex gray-level distribution; satisfactory extraction accuracy therefore cannot be obtained from image gray-level features alone. To improve the accuracy, texture information of the target is used to compensate for this shortcoming, and a texture analysis method is adopted to study the spatial distribution relationship between pixels. The Gray-Level Co-occurrence Matrix (GLCM) is a method for counting the gray-level information contained in an image: it calculates the occurrence frequency of each gray-level pair and has the advantage of accurately reflecting comprehensive information on the direction, adjacent interval and change amplitude of the image gray levels. In this embodiment, the GLCM-based texture analysis selects 8 common features: mean, contrast, variance, dissimilarity, homogeneity, correlation, entropy and second moment. The difference images of the 8 texture features are calculated with formula (6), respectively, to improve the accuracy of burned-area identification.
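A minimal numpy sketch of a gray-level co-occurrence matrix and a few of the eight texture features of step S23; the quantization level, the single offset and the feature subset are illustrative choices, not taken from the patent:

```python
import numpy as np

def glcm(img, levels=8, dx=1, dy=0):
    """Symmetric, normalized gray-level co-occurrence matrix for one
    pixel offset (dx, dy), for an image with values in [0, 1)."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)  # quantize
    m = np.zeros((levels, levels))
    h, w = q.shape
    for i in range(h - dy):
        for j in range(w - dx):
            m[q[i, j], q[i + dy, j + dx]] += 1
    m = m + m.T                                # make symmetric
    return m / m.sum()

def glcm_features(p):
    """A few of the eight texture features used in the method."""
    idx = np.arange(p.shape[0])
    ii, jj = np.meshgrid(idx, idx, indexing="ij")
    return {
        "mean": np.sum(ii * p),
        "contrast": np.sum((ii - jj) ** 2 * p),
        "homogeneity": np.sum(p / (1.0 + (ii - jj) ** 2)),
        "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
        "asm": np.sum(p ** 2),                 # angular second moment
    }
```

A perfectly uniform window gives zero contrast and entropy and homogeneity of one, the texture signature expected of an unburned, homogeneous canopy patch.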
Step S3: a random forest is selected as the machine learning model, and the burned area is extracted by combining the various types of feature change parameters of S2. The specific steps of step S3 are as follows:
step S31: there are two typical ensemble ideas in traditional machine learning, bagging and boosting. Bagging obtains K new data sets by sampling the original data set with replacement K times, and the classification model is trained through this sampling process. The random forest is a bagging method that uses CART decision trees as base learners; it is a common classification model for classification prediction and also an ensemble learning model of regression decision trees. Its prediction is based on the mean regression of the decision trees, with the following mathematical principle:
h̄(x) = (1/T) Σ_{t=1}^{T} h(x, θ_t)        (7)
where x is the input variable; θ_t are independent, identically distributed random variables; T is the number of decision trees; and h(x, θ_t) is the output of a single tree for x and θ_t;
h̄(x) denotes the average of the outputs of all the decision trees. The bagging idea built into the random forest model allows the sub-sample sets to be drawn randomly and independently, and the optimal feature is selected from a random feature subset while each decision tree is constructed; this ensures the accuracy and reliability of the model's predictions to a certain extent and gives good robustness against noise and outliers. Each decision tree votes for the class of a sample, and the classification result is determined by the majority of the votes.
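The random-forest stage of steps S31-S32 can be sketched with scikit-learn on synthetic stand-in features; the feature values and class shift below are fabricated purely for illustration, while the balanced classes and the 70:30 train/test split follow the text:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the per-pixel feature set of step S2
# (RBR, ND-alpha index, GLCM difference features, ...): burned pixels
# are shifted in feature space; real labels come from Sentinel-2.
n = 600
x_unburned = rng.normal(0.0, 1.0, size=(n // 2, 5))
x_burned = rng.normal(1.5, 1.0, size=(n // 2, 5))
x = np.vstack([x_unburned, x_burned])
y = np.array([0] * (n // 2) + [1] * (n // 2))  # balanced classes

# 70:30 train/test split as in step S32
x_tr, x_te, y_tr, y_te = train_test_split(
    x, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(x_tr, y_tr)
score = clf.score(x_te, y_te)   # test-set accuracy
```

Majority voting over the trees, as described above, is what `predict` performs internally for classification.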
Step S32: the feature set formed in S2 is used as the input features of the random forest RF. Training and test data set labels are selected from 10 m Sentinel-2 data and are composed of burned-area pixels and unburned-area pixels; the data set is designed to be balanced to reduce the influence of class imbalance. The ratio of training samples to test samples is set to 70:30.
Step S4: the accuracy of the polarimetric SAR burned-area extraction is evaluated by taking the optically extracted burned area as the reference standard. The specific steps of step S4 are as follows:
step S41: and (3) using the burned plot extracted by the Sentinel-2 as reference data to evaluate the accuracy of the SAR image in extracting the burned plot.
Step S42: count the burned area extracted in the experimental results and calculate the extraction accuracy, the wrong-extraction (commission) error and the missed-extraction (omission) error, using the following formulas:
Extraction Accuracy_#i = Agreement_#i / (Agreement_#i + S2 Only_#i) × 100%        (8)

Commission Errors_#i = S1 Only_#i / (Agreement_#i + S1 Only_#i) × 100%        (9)

Omission Errors_#i = S2 Only_#i / (Agreement_#i + S2 Only_#i) × 100%        (10)
where #i denotes a given SAR feature combination and #1 denotes the feature combination of the optical image; Agreement denotes the area where the burned area extracted from the SAR image agrees with that extracted from the optical image; S1 Only denotes the wrongly extracted (commission) area of the SAR image relative to the optically identified burned area; and S2 Only denotes the missed (omission) area of the SAR image relative to the optically identified burned area.
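The agreement/commission/omission bookkeeping of formulas (8)-(10) can be sketched as (mask and function names assumed; both masks are taken to contain at least some burned pixels):

```python
import numpy as np

def burn_extraction_scores(sar_mask, opt_mask):
    """Extraction accuracy, commission error and omission error (in %)
    of a SAR-derived burned mask against the optical reference mask."""
    sar = np.asarray(sar_mask, dtype=bool)
    opt = np.asarray(opt_mask, dtype=bool)
    agreement = np.sum(sar & opt)        # burned in both
    s1_only = np.sum(sar & ~opt)         # SAR only  -> wrong extraction
    s2_only = np.sum(~sar & opt)         # optical only -> missed extraction
    accuracy = 100.0 * agreement / (agreement + s2_only)
    commission = 100.0 * s1_only / (agreement + s1_only)
    omission = 100.0 * s2_only / (agreement + s2_only)
    return accuracy, commission, omission
```

With these definitions, accuracy and omission sum to 100%, matching the reported 87.12% accuracy and 12.88% omission error.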
FIG. 2 shows the accuracy evaluation of the burned areas extracted with different SAR feature combinations, using the burned area extracted by Sentinel-2 as reference data. Because the texture of the study area is complex, it is difficult to distinguish burned from unburned areas using only the characteristic parameters obtained by polarization decomposition and the backscattering coefficients; after the texture features participate in the extraction, the extraction accuracy improves by about 30 percent. As shown in FIG. 2, the burned-area extraction result that comprehensively considers the backscattering intensity, polarization decomposition and texture features agrees best with the Sentinel-2A extraction result, with an extraction accuracy of 87.12% and commission and omission errors of 20.44% and 12.88% respectively; the method can serve as a reference for burned-area extraction in regions with continuous cloud cover or smoke.
The invention and its embodiments have been described above schematically and without limitation; the drawings show only one of the embodiments of the invention, and the actual structure is not limited thereto. Therefore, similar structures and embodiments designed by a person skilled in the art in light of this teaching, without departing from the spirit of the invention and without inventive effort, shall fall within the scope of protection of the invention.

Claims (4)

1. A burned area extraction method based on a remote sensing image of a dual-polarized synthetic aperture radar, characterized by comprising the following steps:
step S1: performing radiometric calibration, multilooking, image registration, refined Lee filtering and geocoding preprocessing on single-look complex (SLC) image data;
step S2: computing the backscattering coefficient ratio, H-A-Alpha decomposition and gray-level co-occurrence matrix from the pre- and post-fire SLC data sets to obtain data sets reflecting changes in backscattering intensity, polarization decomposition and texture features, respectively;
step S3: selecting a random forest as the machine learning model and extracting the burned area by combining the various types of feature change parameters of step S2;
step S4: evaluating the accuracy of the polarimetric SAR burned-area extraction by taking the optically extracted burned area as the reference standard.
2. The burned area extraction method based on the remote sensing image of the dual-polarized synthetic aperture radar according to claim 1, characterized in that: the specific steps of the step S2 are as follows:
step S21: highlighting the burned area using the radar burn ratio RBR, i.e. the ratio of the backscattering coefficients before and after the fire, calculated as follows:
RBR_xy = γ_xy^post / γ_xy^pre        (1)
where xy denotes the polarization mode, vertical polarization VV or cross polarization VH; γ_pre denotes the Sigma-naught or Gamma-naught value of the normalized backscattering coefficient before the fire; and γ_post denotes the corresponding Sigma-naught or Gamma-naught value after the fire;
step S22: the dual-polarization H-A-Alpha decomposition method identifies the scattering mechanisms of different ground objects by defining the entropy H, the scattering angle α and the anisotropy A from the eigenvalues and eigenvectors of the covariance matrix of the SAR image; the entropy H, scattering angle α and anisotropy A are calculated as follows:
H = −Σ_{i=1}^{2} p_i log_2 p_i,  with p_i = λ_i / (λ_1 + λ_2)        (2)

α = Σ_{i=1}^{2} p_i α_i        (3)

A = (λ_1 − λ_2) / (λ_1 + λ_2)        (4)
where λ_i is an eigenvalue of the polarimetric covariance matrix C_2; p_i is the pseudo-probability of the i-th eigenvalue, p_i = λ_i / Σ_k λ_k, with λ_k the k-th eigenvalue of C_2; and α_i is the scattering angle corresponding to the i-th eigenvector of C_2;
a normalized alpha index NDαI is then calculated from the scattering angle α as follows:
NDαI = (α_post − α_pre) / (α_post + α_pre)        (5)
where α_pre denotes the scattering angle α before the fire and α_post denotes the scattering angle α after the fire;
a difference image is created from the parameters extracted from the pre- and post-fire polarimetric SAR images, calculated as follows:
p_c = (p_post − p_pre) / (p_post + p_pre)        (6)
where p_c denotes the change value of the parameter, and p_pre and p_post denote the parameter values before and after the fire, respectively;
step S23: the texture analysis method based on the gray-level co-occurrence matrix GLCM selects 8 features: mean, contrast, variance, dissimilarity, homogeneity, correlation, entropy and second moment; the difference images of the 8 texture features are calculated with formula (6), respectively, to improve the accuracy of burned-area identification.
3. The burned area extraction method based on the remote sensing image of the dual-polarized synthetic aperture radar according to claim 2, characterized in that: the specific steps of the step S3 are as follows:
step S31: the random forest is a bagging method using CART decision trees as base learners, and its prediction is based on the mean regression of the decision trees, with the following mathematical principle:
h̄(x) = (1/T) Σ_{t=1}^{T} h(x, θ_t)        (7)
where x is the input variable; θ_t are independent, identically distributed random variables; T is the number of decision trees; and h(x, θ_t) is the output of a single tree for x and θ_t;
and h̄(x) denotes the average of the outputs of all the decision trees;
step S32: the feature set formed in step S2 is used as the input features of the random forest RF; training and test data set labels are selected from 10 m Sentinel-2 data and are composed of burned-area pixels and unburned-area pixels; the ratio of training samples to test samples is set to 70:30.
4. The burned area extraction method based on the remote sensing image of the dual-polarized synthetic aperture radar according to claim 3, characterized in that: the specific steps of the step S4 are as follows:
step S41: using the burned area extracted by the Sentinel-2 as reference data to evaluate the accuracy of SAR image burned area extraction;
step S42: counting the burned area extracted in the experimental results and calculating the extraction accuracy Extraction Accuracy_#i, the wrong-extraction (commission) error Commission Errors_#i and the missed-extraction (omission) error Omission Errors_#i, using the following formulas:
Extraction Accuracy_#i = Agreement_#i / (Agreement_#i + S2 Only_#i) × 100%        (8)

Commission Errors_#i = S1 Only_#i / (Agreement_#i + S1 Only_#i) × 100%        (9)

Omission Errors_#i = S2 Only_#i / (Agreement_#i + S2 Only_#i) × 100%        (10)
where #i denotes a given SAR feature combination and #1 denotes the feature combination of the optical image; Agreement denotes the area where the burned area extracted from the SAR image agrees with that extracted from the optical image; S1 Only denotes the wrongly extracted (commission) area of the SAR image relative to the optically identified burned area; and S2 Only denotes the missed (omission) area of the SAR image relative to the optically identified burned area.
CN202211339312.6A 2022-10-28 2022-10-28 Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar Pending CN115761480A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211339312.6A CN115761480A (en) 2022-10-28 2022-10-28 Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar


Publications (1)

Publication Number Publication Date
CN115761480A true CN115761480A (en) 2023-03-07

Family

ID=85354210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211339312.6A Pending CN115761480A (en) 2022-10-28 2022-10-28 Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar

Country Status (1)

Country Link
CN (1) CN115761480A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530568A (en) * 2016-10-14 2017-03-22 黑龙江科技大学 Forest fire remote sensing monitoring information intelligent service platform
CN111259784A (en) * 2020-01-14 2020-06-09 西安理工大学 SAR image change detection method based on transfer learning and active learning

Non-Patent Citations (2)

Title
SANJIWANA ARJASAKUSUMA et al.: "Monthly Burned-Area Mapping using Multi-Sensor Integration of Sentinel-1 and Sentinel-2 and machine learning: Case Study of 2019's fire events in South Sumatra Province, Indonesia", Elsevier, pages 1-9 *
QI Shuai: "Forest fire burned area extraction and burn severity estimation with polarimetric SAR", China Master's Theses Full-text Database, Agricultural Science and Technology, pages 10-15 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN117436003A (en) * 2023-12-15 2024-01-23 中国科学院、水利部成都山地灾害与环境研究所 Remote sensing dynamic monitoring method for erosion of soil of fire trace land by considering fire severity
CN117436003B (en) * 2023-12-15 2024-03-15 中国科学院、水利部成都山地灾害与环境研究所 Remote sensing dynamic monitoring method for erosion of soil of fire trace land by considering fire severity

Similar Documents

Publication Publication Date Title
Refice et al. SAR and InSAR for flood monitoring: Examples with COSMO-SkyMed data
Sun et al. Wavelength selection of the multispectral lidar system for estimating leaf chlorophyll and water contents through the PROSPECT model
Danson et al. Developing a dual-wavelength full-waveform terrestrial laser scanner to characterize forest canopy structure
Nie et al. Above-ground biomass estimation using airborne discrete-return and full-waveform LiDAR data in a coniferous forest
Stover et al. Effect of elevated CO2 on coarse‐root biomass in Florida scrub detected by ground‐penetrating radar
Wang et al. Characterizing the spatial dynamics of land surface temperature–impervious surface fraction relationship
Fang et al. Atmospheric effects on the performance and threshold extrapolation of multi-temporal Landsat derived dNBR for burn severity assessment
CN110109118B (en) Forest canopy biomass prediction method
Villard et al. Forest biomass from radar remote sensing
Liu et al. Determination of boundary layer top on the basis of the characteristics of atmospheric particles
Li et al. Extending the stochastic radiative transfer theory to simulate BRF over forests with heterogeneous distribution of damaged foliage inside of tree crowns
CN115761480A (en) Burned area extraction method based on remote sensing image of dual-polarized synthetic aperture radar
Tao et al. Soil moisture retrieval from SAR and optical data using a combined model
McNairn et al. Establishing crop productivity using RADARSAT-2
CN110516552B (en) Multi-polarization radar image classification method and system based on time sequence curve
Chen et al. Evaluation of Landsat TM vegetation indices for estimating vegetation cover on semi-arid rangelands: a case study from Australia
Challis et al. The role of lidar intensity data in interpreting environmental and cultural archaeological landscapes
Friedl et al. An overview of uncertainty in optical remotely sensed data for ecological applications
CN107945162A (en) Detection recognition method based on computer vision technique to flourishing ant nest
Traviglia Archaeological usability of hyperspectral images: Successes and failures of image processing techniques
Chaivaranont et al. Estimating grassland curing with remotely sensed data
Nolè et al. Using spatial autocorrelation techniques and multi-temporal satellite data for analyzing urban sprawl
Liu et al. Assessment of generalized allometric models for aboveground biomass estimation: A case study in Australia
Curtis Remote sensing systems for monitoring crops and vegetation
Kaasalainen Multispectral terrestrial lidar: State of the art and challenges

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230307