CN114708550A - Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge - Google Patents

Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge Download PDF

Info

Publication number
CN114708550A
CN114708550A (Application CN202210317398.6A)
Authority
CN
China
Prior art keywords
nbrswir
fire
matrix
index
remote sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210317398.6A
Other languages
Chinese (zh)
Inventor
朱祺琪
李子琪
郭希
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences filed Critical China University of Geosciences
Priority to CN202210317398.6A priority Critical patent/CN114708550A/en
Publication of CN114708550A publication Critical patent/CN114708550A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an unsupervised learning forest fire change detection method and device incorporating prior knowledge. The method comprises the following steps: acquiring and preprocessing double-time-phase remote sensing images of the fire area before and after a forest fire; calculating the NBRSWIR index of each image to obtain NBRSWIR index maps; performing uncertainty analysis to obtain training samples; training two symmetric deep network branches with the training samples; extracting initial features of the NBRSWIR index maps with the trained deep network branches; performing slow feature analysis on the initial features to obtain their feature differences; calculating the chi-square distance of each pixel to obtain a change intensity map; and performing K-means threshold segmentation to obtain the forest fire area. The invention provides a new forest fire change detection framework that incorporates prior geoscience knowledge to improve the spectral separability between the fire area and the background, suppresses changes in background information so that more complex fire-area features can be extracted, and maintains classification accuracy without manually labeled training samples.

Description

Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge
Technical Field
The invention relates to the technical field of deep learning and remote sensing image processing, and mainly addresses forest fire change detection based on remote sensing images; in particular, it relates to an unsupervised learning forest fire change detection method and device incorporating prior knowledge.
Background
Forest fires are among the most common disasters. They not only seriously affect the stability of forest ecosystems, but also cause environmental pollution and related secondary geological disasters, threatening human life and property. Acquiring information such as the location and burned area of a fire zone is therefore crucial for assessing and recovering economic and ecological losses. Traditional ground surveys are generally difficult and expensive because of complex terrain and severe weather, whereas satellite remote sensing can cover wide areas at high frequency and provides spectral information invisible to the eye, playing an important role in fire monitoring and burned area mapping. However, the characteristics of fire zones are often complex, and spectral similarity exists among different land cover types. Although scholars at home and abroad have conducted extensive research on these problems, no method with strong generality has yet emerged, so forest fire change detection in remote sensing images remains a hotspot and a difficulty of research in the international remote sensing community.
Among traditional forest fire change detection algorithms, the most widely applied are methods based on spectral indices and methods based on image classification. Spectral index methods mathematically combine bands according to the differences in spectral information of different land cover types over different wavelength ranges, so as to separate fire zones from other land cover types; two kinds of indices are mainly used for forest fire change detection: vegetation indices and fire indices. Because these methods are simple to implement and reasonably accurate, they are widely applied to fire zone detection. However, they usually rely on threshold segmentation or clustering for binarization, and because changes in background information are complex, this leads to unnecessary false detections or missed detections. Image classification methods separate fire zones from other regions by minimizing intra-class differences and maximizing inter-class differences using a set of attributes, for example Support Vector Machines (SVM), Random Forests (RF) and Principal Component Analysis (PCA). The key step of these algorithms, however, is the selection of attribute features, which is time-consuming.
In recent years, the rapid development of deep learning has advanced forest fire change detection; current approaches are mainly based on Convolutional Neural Networks (CNN), which can automatically extract richer features. However, because of the complex causes of forest fires and the uncertainty of wind direction, the boundary characteristics of fire zones are often complex. Meanwhile, the spectral similarity between fire zones and ground objects such as water, smoke and bare land means that background information has a large influence. Existing methods rarely consider these problems. Furthermore, few forest fire data sets are publicly available, whereas supervised deep learning methods require large amounts of data for training.
Disclosure of Invention
The main technical problem the invention aims to solve is to provide an unsupervised learning forest fire change detection method and device incorporating prior knowledge, so that the accuracy of forest fire change detection is ensured without manually labeled training samples.
In order to solve this technical problem, the technical scheme adopted by the invention is as follows:
According to one aspect of the invention, an unsupervised learning forest fire change detection method incorporating prior knowledge is provided, which comprises the following steps:
S1: acquiring double-time-phase remote sensing images of the fire area before and after a forest fire, and preprocessing the double-time-phase remote sensing images;
S2: calculating the NBRSWIR index of each preprocessed double-time-phase remote sensing image to incorporate prior knowledge, and obtaining the NBRSWIR index maps X, Y of the double-time-phase remote sensing images;
S3: performing uncertainty analysis on the NBRSWIR index maps X, Y to obtain training samples X_train, Y_train;
S4: training two symmetric deep network branches with the training samples, and obtaining the trained deep network branches after training is finished;
S5: extracting the initial features X_φ, Y_φ of the NBRSWIR index maps X, Y with the trained deep network branches;
S6: performing slow feature analysis on the initial features X_φ, Y_φ to obtain the feature differences of the initial features X_φ, Y_φ;
S7: calculating the chi-square distance of each pixel according to the feature differences to obtain a change intensity map;
S8: performing K-means threshold segmentation on the change intensity map to obtain the final forest fire area.
Further, in step S1, the preprocessing step includes:
S11: downloading Sentinel-2 Level-1C multispectral data covering the forest fire area before and after the disaster from the Copernicus Open Access Hub of the European Space Agency;
S12: performing radiometric calibration and atmospheric correction on the Level-1C multispectral data with the Sen2Cor tool to obtain a Level-2A product;
S13: performing super-resolution synthesis on the Level-2A product with SNAP software, resampling all bands to a spatial resolution of 10 m, and thereby obtaining pre-disaster and post-disaster remote sensing images with 10 m resolution;
S14: cropping the 10 m pre-disaster and post-disaster remote sensing images to the extent of the study area, thereby obtaining the preprocessed double-time-phase remote sensing images (a band-loading sketch follows).
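For illustration only, the following is a minimal sketch of reading the two short-wave infrared bands from a resampled Level-2A band stack with the rasterio library; the file names and the assumption that the GeoTIFF band order follows the Sentinel-2 band numbering used in the text are hypothetical, not specified by the patent.

```python
# Sketch: load SWIR1 (band 11) and SWIR2 (band 12) from a 10 m band stack (assumptions noted above).
import numpy as np
import rasterio

def load_swir_bands(path: str):
    with rasterio.open(path) as src:
        swir1 = src.read(11).astype(np.float32)  # band 11 (SWIR1), assumed band index
        swir2 = src.read(12).astype(np.float32)  # band 12 (SWIR2), assumed band index
    return swir1, swir2

# Usage (hypothetical file names):
# swir1_pre, swir2_pre = load_swir_bands("pre_fire_10m.tif")
# swir1_post, swir2_post = load_swir_bands("post_fire_10m.tif")
```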
Further, in step S2, a band operation is performed on the two short-wave infrared bands of the preprocessed double-time-phase remote sensing images to obtain the NBRSWIR index maps of the phases before and after the fire, with the specific calculation formula as follows:
[The NBRSWIR calculation formula is rendered as an image in the original publication and is not reproduced here.]
The NBRSWIR index is a new fire index, and SWIR1 and SWIR2 are the band 11 and band 12 data of the preprocessed double-time-phase remote sensing images, respectively.
Further, step S3 specifically includes:
S31: differencing the NBRSWIR index maps X, Y before and after the fire to obtain an NBRSWIR index difference map reflecting fire zone information, with the specific calculation formula:
d_NBRSWIR = NBRSWIR_post − NBRSWIR_pre
where NBRSWIR_post is the NBRSWIR index map of the post-fire remote sensing image, NBRSWIR_pre is the NBRSWIR index map of the pre-fire remote sensing image, and d_NBRSWIR is the NBRSWIR index difference map between the two dates;
S32: performing fuzzy C-means clustering on the NBRSWIR index difference map to realize threshold segmentation, dividing the study area into a determined burnt area, an uncertain area and a determined unburnt area;
S33: randomly selecting pixels of the determined unburnt area in the NBRSWIR index maps before and after the fire as the training samples X_train, Y_train.
Further, step S32 specifically includes:
S321: setting the objective function precision e, the fuzzy index m, the number of clusters c and the maximum number of iterations t of the algorithm, where the objective function J is:
J = Σ_{i=1}^{c} Σ_{j=1}^{n} (u_ij)^m · d(x_j, v_i)^2
The constraints of the objective function are:
Σ_{i=1}^{c} u_ij = 1, for every pixel j
u_ij ∈ [0, 1]
where c is the number of clusters, n is the total number of pixels of the NBRSWIR index difference map, m is the fuzzy index, u_ij is the membership degree of sample x_j to class i, j indexes the j-th pixel, x denotes the samples given by the NBRSWIR index difference map, i indexes the i-th cluster, v_i is the center of class i, and d(·) is a distance measure;
S322: randomly initializing the membership matrix u_ij and the cluster centers v_i;
S323: updating the membership matrix and the cluster centers, specifically:
u_ij = 1 / Σ_{k=1}^{c} ( d(x_j, v_i) / d(x_j, v_k) )^(2/(m−1))
v_i = ( Σ_{j=1}^{n} (u_ij)^m · x_j ) / ( Σ_{j=1}^{n} (u_ij)^m )
where k indexes the k-th cluster and v_k is the center of class k;
S324: if the objective function satisfies |J(t+1) − J(t)| < e, ending the iteration and entering step S325; otherwise, repeating step S323;
S325: according to the obtained membership matrix, taking the class with the maximum membership of each sample as its clustering result, finishing clustering, and thereby dividing the samples into a determined burnt area, an uncertain area and a determined unburnt area (a clustering sketch follows).
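As an illustration of steps S321–S325, the following is a minimal NumPy sketch of fuzzy C-means on the flattened NBRSWIR difference values; the parameter defaults mirror the values given later in the embodiment (e = 0.0001, m = 2, c = 3, t = 1000), while the Euclidean distance and all variable names are assumptions of the sketch.

```python
# Minimal fuzzy C-means sketch for 1-D d_NBRSWIR values (illustrative, not the patented code).
import numpy as np

def fuzzy_c_means(x, c=3, m=2.0, e=1e-4, max_iter=1000, seed=0):
    """x: (n,) array of d_NBRSWIR values. Returns hard labels and cluster centers."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    u = rng.random((c, n))
    u /= u.sum(axis=0, keepdims=True)                 # S322: random memberships summing to 1
    j_old = np.inf
    for _ in range(max_iter):
        um = u ** m
        v = (um @ x) / um.sum(axis=1)                 # S323: update cluster centers
        d = np.abs(x[None, :] - v[:, None]) + 1e-12   # distance of each pixel to each center
        u = 1.0 / ((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0))).sum(axis=1)
        j_new = (um * d ** 2).sum()                   # objective J (with previous memberships)
        if abs(j_old - j_new) < e:                    # S324: convergence test
            break
        j_old = j_new
    return u.argmax(axis=0), v                        # S325: hard labels by maximum membership

# Usage: labels, centers = fuzzy_c_means(d_nbrswir.ravel(), c=3)
# Sorting the centers separates the determined-burnt, uncertain and determined-unburnt clusters.
```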
Further, step S4 specifically includes:
S41: constructing two symmetric deep network branches, each a fully connected network consisting of an input layer, hidden layers and an output layer, with the same number of nodes in every hidden layer;
S42: initializing the parameters {θ_1, θ_2} of the two symmetric deep network branches;
S43: computing the projection features X_φ^train, Y_φ^train of the pre-fire and post-fire training samples X_train, Y_train after transformation by the deep network branches;
S44: calculating the loss function: according to slow feature analysis theory, invariant pixels correspond to the smallest eigenvalues, so the loss function L suppresses the variance of the invariant pixels by minimizing the generalized eigenvalues jointly; specifically:
L = tr(B^(−1) · A)
in the formula:
A = (1/n) · (X̂_φ^train − Ŷ_φ^train)(X̂_φ^train − Ŷ_φ^train)^T
B = (1/2n) · [ X̂_φ^train (X̂_φ^train)^T + Ŷ_φ^train (Ŷ_φ^train)^T ] + r·I
where X̂_φ^train and Ŷ_φ^train denote the centralized X_φ^train and Y_φ^train, e.g. X̂_φ^train = X_φ^train · (I − (1/n)·1), n is the number of pixels, 1 is a matrix whose elements are all 1, I is an identity matrix, r is a regularization constant, (·)^T denotes the transpose of a matrix, (·)^(−1) denotes the inverse of a matrix, and tr(·) is the trace of a matrix;
S45: calculating the gradients ∂L/∂θ_1 and ∂L/∂θ_2;
S46: updating the parameters {θ_1, θ_2} with a gradient descent algorithm;
S47: repeating steps S43–S46 until the maximum number of iterations is reached, obtaining the trained deep network branches (a training-loop sketch follows).
Further, step S43 specifically includes:
S431: for the pre-fire training sample X_train:
the output of the first hidden layer is expressed as:
H^1 = s(W^1 · X_train + b^1)
where X_train ∈ R^(m×n), m is the number of bands and n is the number of pixels; W^1 ∈ R^(h_1×m) is a weight matrix, b^1 ∈ R^(h_1) is a bias vector, h_1 is the number of nodes of the first hidden layer, and s(·) is the activation function;
the output of each subsequent hidden layer l is then expressed as:
H^l = s(W^l · H^(l−1) + b^l)
where W^l ∈ R^(h_l×h_(l−1)) and b^l ∈ R^(h_l);
finally, the projection feature after the network transformation is:
X_φ^train = s(W^o · H^l + b^o)
where o is the number of nodes of the output layer, W^o ∈ R^(o×h_l) is a weight matrix, h_l is the number of nodes of the last hidden layer, and b^o ∈ R^o is a bias vector;
S432: for the post-fire training sample Y_train, the projection feature Y_φ^train after transformation by the deep network branch is obtained in the same way as in step S431 (a forward-pass sketch follows).
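A minimal NumPy sketch of the S431 forward pass follows, assuming a sigmoid activation (the patent does not name the activation function), that the output layer uses the same activation as the hidden layers, and randomly initialized parameters purely for illustration.

```python
# Sketch of the S431 forward pass: X_train in R^{m x n} -> projection features in R^{o x n}.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x_train, weights, biases, s=sigmoid):
    """weights/biases: lists [W^1, ..., W^o] and [b^1, ..., b^o] with column-vector biases."""
    h = x_train                        # shape (m, n): m bands, n pixels
    for W, b in zip(weights, biases):
        h = s(W @ h + b)               # H^l = s(W^l H^{l-1} + b^l); last step gives X_phi
    return h

# Illustration with assumed sizes: m = 1 band, three hidden layers of 128 nodes, o = 16 outputs.
rng = np.random.default_rng(0)
sizes = [1, 128, 128, 128, 16]
Ws = [rng.standard_normal((sizes[i + 1], sizes[i])) * 0.1 for i in range(len(sizes) - 1)]
bs = [np.zeros((sizes[i + 1], 1)) for i in range(len(sizes) - 1)]
X_phi_train = forward(rng.standard_normal((1, 1000)), Ws, bs)   # dummy 1-band, 1000-pixel input
```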
Further, step S6 specifically includes:
S61: obtaining the means μ_(X_φ), μ_(Y_φ) and standard deviations σ_(X_φ), σ_(Y_φ) of the pre-fire and post-fire initial features X_φ, Y_φ;
S62: normalizing the features X_φ, Y_φ according to the means and standard deviations, specifically:
for the pre-fire initial feature X_φ, the initial feature obtained after the deep network transformation is normalized by the formula
X̂_φ^i = (X_φ^i − μ_(X_φ)) / σ_(X_φ)
where X̂_φ^i is the normalized value of the i-th pixel of X_φ, X_φ^i is the value of the i-th pixel of X_φ, μ_(X_φ) is the mean of the pixel values of X_φ, and σ_(X_φ) is the standard deviation of the pixel values of X_φ; the post-fire initial feature Y_φ is normalized in the same way;
S63: based on the normalized features X̂_φ, Ŷ_φ, obtaining the covariance matrix A of the feature difference between X_φ and Y_φ and the sum matrix B of the covariance matrices of X_φ and Y_φ, specifically:
the matrix A is:
A = (1/P) Σ_{i=1}^{P} (X̂_φ^i − Ŷ_φ^i)(X̂_φ^i − Ŷ_φ^i)^T
the matrix B is:
B = (1/2) [ (1/P) Σ_{i=1}^{P} X̂_φ^i (X̂_φ^i)^T + (1/P) Σ_{i=1}^{P} Ŷ_φ^i (Ŷ_φ^i)^T ]
where X̂_φ^i and Ŷ_φ^i are the normalized values of the i-th pixel of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps respectively, and P is the total number of pixels;
S64: solving the generalized eigenvalues of matrix A with respect to matrix B and the eigenvectors corresponding to the generalized eigenvalues, and sorting the corresponding eigenvectors by generalized eigenvalue from small to large to obtain an ordered eigenvector matrix;
S65: using the ordered eigenvector matrix to project the normalized features X̂_φ, Ŷ_φ into the feature space, and obtaining the feature differences of X_φ, Y_φ, specifically:
SFA = w^T · X̂_φ − w^T · Ŷ_φ
where SFA is the feature difference, w is the ordered eigenvector matrix, and X̂_φ, Ŷ_φ are the matrices of normalized pixel values of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps (a slow-feature-analysis sketch follows).
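A minimal NumPy/SciPy sketch of S61–S65 follows, assuming the features are stored as (o × P) arrays (o output features, P pixels); scipy.linalg.eigh solves the generalized symmetric eigenproblem A·w = λ·B·w and already returns eigenvalues in ascending order, which matches the small-to-large sorting of S64.

```python
# Sketch of slow feature analysis on the branch outputs X_phi, Y_phi of shape (o, P).
import numpy as np
from scipy.linalg import eigh

def slow_feature_analysis(x_phi, y_phi):
    # S61-S62: per-feature normalization (zero mean, unit variance over pixels)
    xn = (x_phi - x_phi.mean(axis=1, keepdims=True)) / x_phi.std(axis=1, keepdims=True)
    yn = (y_phi - y_phi.mean(axis=1, keepdims=True)) / y_phi.std(axis=1, keepdims=True)
    P = xn.shape[1]
    diff = xn - yn
    A = diff @ diff.T / P                      # S63: covariance of the feature difference
    B = 0.5 * (xn @ xn.T + yn @ yn.T) / P      # S63: mean of the two covariance matrices
    eigvals, w = eigh(A, B)                    # S64: generalized eigenpairs, ascending order
    sfa = w.T @ xn - w.T @ yn                  # S65: projected feature difference
    return sfa, eigvals

# Usage: sfa, lam = slow_feature_analysis(X_phi, Y_phi)  # features from the trained branches
```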
Further, in step S7, the step of calculating the chi-square distance of each pixel point includes:
taking the square root of the sum, over bands, of the squared feature difference of each pixel divided by the corresponding generalized eigenvalue, with the calculation formula:
chi_i = sqrt( Σ_b ( (SFA_(b,i))^2 / λ_b ) )
where chi_i is the chi-square distance of the i-th pixel, SFA_(b,i) is the feature difference of the i-th pixel in band b, and λ_b is the generalized eigenvalue of band b (a change-intensity sketch follows).
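Continuing the SFA sketch above, the change intensity map of step S7 can be computed as below; the placement of the square root follows the reconstruction in the text and is an assumption, since the original formula is shown only as an image.

```python
# Sketch: per-pixel chi-square change intensity from the SFA differences and eigenvalues.
import numpy as np

def change_intensity(sfa, eigvals, eps=1e-12):
    """sfa: (o, P) projected feature differences; eigvals: (o,) generalized eigenvalues."""
    return np.sqrt(((sfa ** 2) / (eigvals[:, None] + eps)).sum(axis=0))   # shape (P,)

# Usage: intensity = change_intensity(sfa, lam).reshape(height, width)
```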
According to another aspect of the invention, the invention provides an unsupervised learning forest fire change detection device with integrated prior knowledge, which comprises the following modules:
the image preprocessing module is used for acquiring double-time-phase remote sensing images of fire areas before and after a forest fire and preprocessing the double-time-phase remote sensing images;
the NBRSWIR index calculation module is used for calculating NBRSWIR indexes of the preprocessed double-time phase remote sensing images respectively so as to be integrated with priori knowledge, and an NBRSWIR index graph X, Y of the double-time phase remote sensing images is obtained;
a training sample acquisition module, for performing uncertainty analysis on the NBRSWIR index maps X, Y to acquire training samples X_train, Y_train;
The network training module is used for training the two symmetrical deep network branches through the training sample, and after the training is finished, the trained deep network branches are obtained;
an initial feature extraction module, configured to extract the initial features X_φ, Y_φ of the NBRSWIR index maps X, Y using the trained deep network branches;
The slow feature analysis module is used for carrying out slow feature analysis on the initial features to obtain feature difference values of the initial features;
the chi-square distance calculation module is used for calculating the chi-square distance of each pixel point according to the characteristic difference value to obtain a variation intensity graph;
and the threshold segmentation module is used for performing K-means threshold segmentation on the change intensity map to obtain a final forest fire area.
The technical scheme provided by the invention has the following beneficial effects:
a forest fire change detection framework is provided, and the spectrum separation capability of a fire area and a complex background is improved by integrating the NBRSWIR index into the prior knowledge; meanwhile, the powerful feature extraction capability of the deep network is combined, and a more complete fire area can be extracted. An unsupervised symmetrical deep network is provided, and the generalization capability is strong; generating more reliable pseudo samples by uncertainty analysis based on the fire index map; and (3) suppressing the change of complex background information by combining a deep network and Slow Feature Analysis (SFA) to extract the fire area. The method and the device can ensure the detection precision of forest fire change under the condition of not artificially marking training samples.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flow chart of a forest fire change detection method according to an embodiment of the present invention;
FIG. 2 is an overall block diagram of a forest fire change detection method according to an embodiment of the present invention;
FIG. 3 is a flow chart of a slow signature analysis process in accordance with an embodiment of the present invention;
FIG. 4 shows a comparison of burned area detection results on the Xichang fire data set in an embodiment of the present invention;
fig. 5 is a structural view of a forest fire change detection apparatus according to an embodiment of the present invention.
Detailed Description
For a more clear understanding of the technical features, objects and effects of the present invention, embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
The specific implementation mode discloses a priori knowledge-incorporated unsupervised learning forest fire change detection method and device.
Referring to fig. 1-2, fig. 1 is a flow chart of a forest fire change detection method according to an embodiment of the present invention, and fig. 2 is an overall frame diagram of a forest fire change detection method according to an embodiment of the present invention;
a method for detecting changes of unsupervised learning forest fires by integrating prior knowledge comprises the following specific operation steps:
S1: acquiring double-time-phase remote sensing images of the fire area before and after the forest fire, and preprocessing the double-time-phase remote sensing images.
Firstly, according to the location of the fire area, the Sentinel-2 Level-1C multispectral data covering the forest fire area before and after the fire are downloaded from the Copernicus Open Access Hub (https://scihub.copernicus.eu/dhus/#/home); radiometric calibration and atmospheric correction are carried out on the Level-1C multispectral data with the Sen2Cor tool to obtain a Level-2A product; the Level-2A product is then processed by super-resolution synthesis in SNAP software, resampling all bands to a spatial resolution of 10 m; finally, the pre-disaster and post-disaster remote sensing images are cropped to the same size according to the extent of the study area, giving the preprocessed double-time-phase remote sensing images. Three forest fire data sets are used in this embodiment in total, but only the Xichang fire data set is analyzed and discussed in this example. The Xichang fire broke out on 30 March 2020 in Xichang City, Liangshan Prefecture, Sichuan Province; the pre-fire image was acquired on 25 March 2020 and the post-fire image on 4 April 2020. The scene is dominated by forest, has a spatial resolution of 10 m, contains 12 bands, and has a size of 953 × 1501 pixels. The image includes land cover types such as the burned area, vegetation, buildings, bare land and smoke, and the burned area accounts for 17% of the whole study area.
S2: respectively calculating NBRSWIR indexes of the preprocessed double-time phase remote sensing images to integrate prior knowledge, and obtaining NBRSWIR index graphs X, Y of the double-time phase remote sensing images;
A band operation is carried out on the two short-wave infrared bands of the preprocessed double-time-phase remote sensing images to obtain the NBRSWIR index maps before and after the fire; each of the obtained NBRSWIR index maps has only one band. [The NBRSWIR calculation formula is rendered as an image in the original publication and is not reproduced here.] The NBRSWIR index (a normalized burn index) is a new fire index, and SWIR1 and SWIR2 are the band 11 and band 12 data of the preprocessed double-time-phase remote sensing images, respectively (an index-calculation sketch follows).
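The exact NBRSWIR formula appears only as an image in the source. The sketch below therefore assumes a normalized-difference form of the two SWIR bands; the offset constants (0.02 and 0.1) follow one published NBRSWIR definition and are an assumption here, so they may differ from the formula actually disclosed by the patent.

```python
# Sketch: NBRSWIR index map from the two SWIR bands (formula form is an assumption,
# modelled on a published NBRSWIR definition; the patented formula is shown only as an image).
import numpy as np

def nbrswir(swir1: np.ndarray, swir2: np.ndarray) -> np.ndarray:
    return (swir2 - swir1 - 0.02) / (swir2 + swir1 + 0.1)

# Usage (with the SWIR arrays from the band-loading sketch above):
# X = nbrswir(swir1_pre, swir2_pre)    # NBRSWIR index map before the fire
# Y = nbrswir(swir1_post, swir2_post)  # NBRSWIR index map after the fire
```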
S3: performing uncertainty analysis on the NBRSWIR index maps X, Y to obtain the training samples X_train, Y_train;
Step S3 specifically includes:
S31: differencing the NBRSWIR index maps X, Y before and after the fire to obtain an NBRSWIR index difference map reflecting fire zone information, with the specific calculation formula:
d_NBRSWIR = NBRSWIR_post − NBRSWIR_pre
where NBRSWIR_post is the NBRSWIR index map of the post-fire remote sensing image, NBRSWIR_pre is the NBRSWIR index map of the pre-fire remote sensing image, and d_NBRSWIR is the NBRSWIR index difference map between the two dates;
s32: carrying out fuzzy C-means clustering on the NBRSWIR index difference graph to realize threshold segmentation, and dividing a research area into a determined burning area, an uncertain area and a determined non-burning area;
further, S32 specifically includes:
S321: setting the objective function precision e = 0.0001, the fuzzy index m = 2, the number of clusters c = 3 and the maximum number of iterations t = 1000, where the objective function J is:
J = Σ_{i=1}^{c} Σ_{j=1}^{n} (u_ij)^m · d(x_j, v_i)^2
The constraints of the objective function are:
Σ_{i=1}^{c} u_ij = 1, for every pixel j
u_ij ∈ [0, 1]
where c is the number of clusters, n is the total number of pixels of the NBRSWIR index difference map described in S31, m is the fuzzy index, u_ij is the membership degree of sample x_j to class i, j indexes the j-th pixel, x denotes the samples given by the NBRSWIR index difference map described in S31, i indexes the i-th cluster, v_i is the center of class i, and d(·) is a distance measure;
S322: randomly initializing the membership matrix u_ij and the cluster centers v_i;
S323: updating the membership matrix and the cluster centers, specifically:
u_ij = 1 / Σ_{k=1}^{c} ( d(x_j, v_i) / d(x_j, v_k) )^(2/(m−1))
v_i = ( Σ_{j=1}^{n} (u_ij)^m · x_j ) / ( Σ_{j=1}^{n} (u_ij)^m )
where k indexes the k-th cluster and v_k is the center of class k;
S324: if the objective function satisfies |J(t+1) − J(t)| < e, ending the iteration and entering step S325; otherwise, repeating step S323;
S325: according to the obtained membership matrix, taking the class with the maximum membership of each sample as its clustering result, finishing clustering, and thereby dividing the samples into a determined burnt area, an uncertain area and a determined unburnt area.
S33: since the invention aims to suppress changes of background information and extract the fire area, using invariant pixels (the determined unburnt area) as training samples helps to improve the final accuracy; therefore 2.5% of the pixels of the determined unburnt area in the NBRSWIR index maps before and after the fire are randomly selected as the training samples X_train, Y_train (a sample-selection sketch follows).
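As an illustration of S32–S33, the sketch below reuses the fuzzy_c_means function from the earlier clustering sketch to label the difference map and then randomly draws 2.5% of the determined-unburnt pixels as pseudo training samples. The rule used to decide which cluster is the unburnt one (the cluster whose center is closest to zero change) is an assumption of the example, not stated by the patent.

```python
# Sketch: pick pseudo training samples from the determined-unburnt cluster (assumptions noted above).
import numpy as np

def select_training_pixels(X, Y, d_nbrswir, fraction=0.025, seed=0):
    """X, Y: pre/post NBRSWIR maps; d_nbrswir: their difference map. Returns (X_train, Y_train)."""
    labels, centers = fuzzy_c_means(d_nbrswir.ravel(), c=3)   # from the earlier FCM sketch
    unburnt_cluster = int(np.argmin(np.abs(centers)))          # assumed: least-changed cluster
    idx = np.flatnonzero(labels == unburnt_cluster)
    rng = np.random.default_rng(seed)
    picked = rng.choice(idx, size=max(1, int(fraction * idx.size)), replace=False)
    return X.ravel()[picked], Y.ravel()[picked]
```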
S4: training two symmetric deep network branches with the training samples X_train, Y_train described in S3; after training is finished, the two trained symmetric deep network branches are obtained;
S4 specifically includes:
S41: constructing two symmetric deep network branches, each a fully connected network consisting of an input layer, three hidden layers and an output layer, with the same number of nodes in every hidden layer;
S42: initializing the parameters {θ_1, θ_2} of the two-branch deep network, where θ_1 comprises the weight matrices and bias vectors of the branch applied to the pre-fire data and θ_2 comprises the weight matrices and bias vectors of the branch applied to the post-fire data;
S43: computing the projection features X_φ^train, Y_φ^train of the pre-fire and post-fire training samples X_train, Y_train after the deep network transformation, specifically:
Taking the pre-fire training sample as an example:
the output of the first hidden layer can be expressed as:
H^1 = s(W^1 · X_train + b^1)
where X_train ∈ R^(m×n), m is the number of bands and n is the number of pixels; W^1 ∈ R^(h_1×m) is a weight matrix, b^1 ∈ R^(h_1) is a bias vector, h_1 is the number of nodes of the first hidden layer, and s(·) is the activation function.
The output of each subsequent hidden layer l can then be expressed as:
H^l = s(W^l · H^(l−1) + b^l)
where W^l ∈ R^(h_l×h_(l−1)) and b^l ∈ R^(h_l).
Finally, the projection feature after the network transformation is:
X_φ^train = s(W^o · H^l + b^o)
where o is the number of nodes of the output layer, W^o ∈ R^(o×h_l) is a weight matrix, h_l is the number of nodes of the last hidden layer, and b^o ∈ R^o is a bias vector;
the post-fire training sample Y_train is processed in the same way as the pre-fire training sample.
S44: calculating the loss function: according to slow feature analysis theory, invariant pixels correspond to the smallest eigenvalues, so the variance of the invariant pixels can be suppressed by minimizing the generalized eigenvalues jointly; the loss function L is specifically:
L = tr(B^(−1) · A)
in the formula:
A = (1/n) · (X̂_φ^train − Ŷ_φ^train)(X̂_φ^train − Ŷ_φ^train)^T
B = (1/2n) · [ X̂_φ^train (X̂_φ^train)^T + Ŷ_φ^train (Ŷ_φ^train)^T ] + r·I
where X̂_φ^train and Ŷ_φ^train denote the centralized X_φ^train and Y_φ^train, e.g. X̂_φ^train = X_φ^train · (I − (1/n)·1), n is the number of pixels, 1 is a matrix whose elements are all 1, I is an identity matrix, r is a regularization constant, (·)^T denotes the transpose of a matrix, (·)^(−1) denotes the inverse of a matrix, and tr(·) is the trace of a matrix;
S45: calculating the gradients ∂L/∂θ_1 and ∂L/∂θ_2;
S46: updating the parameters with a gradient descent algorithm;
S47: to ensure convergence of the model, steps S43–S46 are repeated until the maximum number of iterations is reached.
S5: for the spectral index maps X, Y of the pre-fire and post-fire remote sensing images obtained in S2, extracting their initial features X_φ, Y_φ with the two symmetric deep network branches trained in S4.
S6: performing Slow Feature Analysis (SFA) on the initial features X_φ, Y_φ, projecting them into a feature space and obtaining the feature differences of the initial features X_φ, Y_φ; in this feature difference space the differences of the unchanged background pixels are suppressed, so the separability between fire-area pixels and non-fire-area pixels is enhanced. The specific steps of the slow feature analysis are shown in fig. 3 and include:
S61: obtaining the means μ_(X_φ), μ_(Y_φ) and standard deviations σ_(X_φ), σ_(Y_φ) of the pre-fire and post-fire initial features X_φ, Y_φ;
S62: normalizing the features X_φ, Y_φ according to the means and standard deviations, specifically:
taking the pre-fire initial feature X_φ as an example, the initial feature obtained after the deep network transformation is normalized by the formula
X̂_φ^i = (X_φ^i − μ_(X_φ)) / σ_(X_φ)
where X̂_φ^i is the normalized value of the i-th pixel of X_φ, X_φ^i is the value of the i-th pixel of X_φ, μ_(X_φ) is the mean of the pixel values of X_φ, and σ_(X_φ) is the standard deviation of the pixel values of X_φ; the post-fire initial feature Y_φ is processed in the same way;
S63: based on the normalized features X̂_φ, Ŷ_φ, obtaining the covariance matrix A of the feature difference between X_φ and Y_φ and the sum matrix B of the covariance matrices of X_φ and Y_φ, specifically:
the matrix A is:
A = (1/P) Σ_{i=1}^{P} (X̂_φ^i − Ŷ_φ^i)(X̂_φ^i − Ŷ_φ^i)^T
the matrix B is:
B = (1/2) [ (1/P) Σ_{i=1}^{P} X̂_φ^i (X̂_φ^i)^T + (1/P) Σ_{i=1}^{P} Ŷ_φ^i (Ŷ_φ^i)^T ]
where X̂_φ^i and Ŷ_φ^i are the normalized values of the i-th pixel of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps respectively, and P is the total number of pixels;
S64: solving the generalized eigenvalues of matrix A with respect to matrix B and the eigenvectors corresponding to the generalized eigenvalues, and sorting the corresponding eigenvectors by generalized eigenvalue from small to large to obtain an ordered eigenvector matrix;
S65: using the ordered eigenvector matrix to project the normalized features X̂_φ, Ŷ_φ into the feature space, and obtaining the feature differences of X_φ, Y_φ, specifically:
SFA = w^T · X̂_φ − w^T · Ŷ_φ
where SFA is the feature difference, w is the ordered eigenvector matrix, and X̂_φ, Ŷ_φ are the matrices of normalized pixel values of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps.
S7: calculating the chi-square distance of each pixel according to the feature differences obtained in S6 to obtain a change intensity map, specifically: taking the square root of the sum, over bands, of the squared feature difference of each pixel divided by the corresponding generalized eigenvalue, with the calculation formula:
chi_i = sqrt( Σ_b ( (SFA_(b,i))^2 / λ_b ) )
where chi_i is the chi-square distance of the i-th pixel, SFA_(b,i) is the feature difference of the i-th pixel in band b, and λ_b is the generalized eigenvalue of band b.
S8: performing K-means threshold segmentation on the change intensity map from S7 to obtain the final forest fire area (a segmentation sketch follows).
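A minimal sketch of the S8 binarization with scikit-learn's KMeans follows, clustering the change intensity values into two groups and taking the higher-intensity cluster as the burned area; the two-cluster choice and that assignment rule are assumptions of the sketch.

```python
# Sketch: K-means thresholding of the change intensity map into burned / unburned.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_threshold(intensity: np.ndarray) -> np.ndarray:
    """intensity: (H, W) change intensity map. Returns a boolean burned-area mask."""
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(intensity.reshape(-1, 1))
    burned_cluster = int(np.argmax(km.cluster_centers_.ravel()))   # assumed: stronger change = burned
    return (km.labels_ == burned_cluster).reshape(intensity.shape)
```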
In order to verify the unsupervised learning forest fire change detection method integrated with the priori knowledge, in the embodiment of the invention, the classification result is further analyzed and evaluated.
Fig. 4 shows the fire area detection results of different methods on the Xichang fire data set. Fig. 4(a) is the post-fire image composited from the red band (R), the near-infrared band (NIR) and a short-wave infrared band (SWIR1); Fig. 4(b) is the ground-truth surface change label used for reference (black areas are unburned and white areas are burned); Fig. 4(c) to Fig. 4(h) are the classification maps of the algorithms BAI (Burned Area Index), NBR (Normalized Burn Ratio), CVA (Change Vector Analysis), PCA (Principal Component Analysis), SFA (Slow Feature Analysis) and DSFA (Deep Slow Feature Analysis); and Fig. 4(i) is the classification map of the present method. The comparison shows that the result of the proposed method is closest to the ground-truth change label, effectively reducing misclassification between the fire area and ground objects such as smoke, water and vegetation while detecting a more complete burned area. Table 1 compares the accuracy evaluation results of the different methods; it can be seen that the method proposed in this experiment has the best overall detection accuracy, which illustrates the effectiveness of the method of the invention.
Table 1. Comparison of the accuracy evaluation results of the different classification methods.
[Table 1 is rendered as an image in the original publication and is not reproduced here.]
Referring to fig. 5, fig. 5 is a structural diagram of a forest fire change detection apparatus according to an embodiment of the present invention.
An unsupervised learning forest fire change detection device incorporating prior knowledge is used for realizing the steps of the above unsupervised learning forest fire change detection method, and specifically comprises the following modules:
the image preprocessing module 1 is used for acquiring double-time-phase remote sensing images of fire areas before and after a forest fire and preprocessing the double-time-phase remote sensing images;
the NBRSWIR index calculating module 2 is used for respectively calculating NBRSWIR indexes of the preprocessed double-time phase remote sensing images so as to integrate prior knowledge, and obtaining NBRSWIR index graphs X, Y of the double-time phase remote sensing images;
a training sample obtaining module 3, configured to perform uncertainty analysis on the NBRSWIR index maps X, Y to acquire training samples X_train, Y_train;
The network training module 4 is used for training the two symmetrical deep network branches through the training sample, and after the training is finished, the trained deep network branches are obtained;
an initial feature extraction module 5, configured to extract the initial features X_φ, Y_φ of the NBRSWIR index maps X, Y using the trained deep network branches;
a slow feature analysis module 6, configured to perform slow feature analysis on the initial features to obtain the feature differences of the initial features X_φ, Y_φ;
the chi-square distance calculation module 7 is used for calculating the chi-square distance of each pixel point according to the characteristic difference value to obtain a variation intensity map;
and the threshold segmentation module 8 is used for performing K-means threshold segmentation on the change intensity map to obtain a final forest fire area.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third and the like do not denote any order, but rather the words first, second and the like may be interpreted as indicating any order.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for detecting changes of unsupervised learning forest fires by integrating prior knowledge is characterized by comprising the following steps:
s1: acquiring double-time-phase remote sensing images of fire areas before and after a forest fire, and preprocessing the double-time-phase remote sensing images;
s2: respectively calculating NBRSWIR indexes of the preprocessed double-time phase remote sensing images to integrate prior knowledge, and obtaining NBRSWIR index maps X, Y of the double-time phase remote sensing images;
S3: performing uncertainty analysis on the NBRSWIR index maps X, Y to obtain training samples X_train, Y_train;
S4: training two symmetric deep network branches with the training samples, and obtaining the trained deep network branches after training is finished;
S5: extracting the initial features X_φ, Y_φ of the NBRSWIR index maps X, Y with the trained deep network branches;
S6: performing slow feature analysis on the initial features X_φ, Y_φ to obtain the feature differences of the initial features X_φ, Y_φ;
s7: calculating chi-square distance of each pixel point according to the characteristic difference value to obtain a variation intensity graph;
s8: and performing K-means threshold segmentation on the change intensity graph to obtain a final forest fire area.
2. A priori knowledge incorporated unsupervised learning forest fire change detection method as claimed in claim 1, wherein in step S1, the preprocessing step comprises:
S11: downloading Sentinel-2 Level-1C multispectral data covering the forest fire area before and after the disaster from the Copernicus Open Access Hub of the European Space Agency;
S12: performing radiometric calibration and atmospheric correction on the Level-1C multispectral data with the Sen2Cor tool to obtain a Level-2A product;
S13: performing super-resolution synthesis on the Level-2A product with SNAP software, resampling all bands to a spatial resolution of 10 m, and thereby obtaining pre-disaster and post-disaster remote sensing images with 10 m resolution;
S14: cropping the 10 m pre-disaster and post-disaster remote sensing images to the extent of the study area, thereby obtaining the preprocessed double-time-phase remote sensing images.
3. The method for detecting forest fire variation through unsupervised learning and incorporating the priori knowledge as claimed in claim 1, wherein in step S2, a band operation is performed on the two short-wave infrared bands of the preprocessed double-time-phase remote sensing images to obtain the NBRSWIR index maps of the phases before and after the fire, with the specific calculation formula as follows:
[The NBRSWIR calculation formula is rendered as an image in the original publication and is not reproduced here.]
The NBRSWIR index is a new fire index, and SWIR1 and SWIR2 are the band 11 and band 12 data of the preprocessed double-time-phase remote sensing images, respectively.
4. The method for detecting a change in a forest fire through unsupervised learning incorporating a priori knowledge as claimed in claim 1, wherein the step S3 specifically comprises:
S31: differencing the NBRSWIR index maps X, Y before and after the fire to obtain an NBRSWIR index difference map reflecting fire zone information, with the specific calculation formula:
d_NBRSWIR = NBRSWIR_post − NBRSWIR_pre
where NBRSWIR_post is the NBRSWIR index map of the post-fire remote sensing image, NBRSWIR_pre is the NBRSWIR index map of the pre-fire remote sensing image, and d_NBRSWIR is the NBRSWIR index difference map between the two dates;
S32: performing fuzzy C-means clustering on the NBRSWIR index difference map to realize threshold segmentation, dividing the study area into a determined burnt area, an uncertain area and a determined unburnt area;
S33: randomly selecting pixels of the determined unburnt area in the NBRSWIR index maps before and after the fire as the training samples X_train, Y_train.
5. The method for detecting a change in a forest fire through unsupervised learning incorporating a priori knowledge as claimed in claim 4, wherein the step S32 specifically comprises:
S321: setting the objective function precision e, the fuzzy index m, the number of clusters c and the maximum number of iterations t of the algorithm, where the objective function J is:
J = Σ_{i=1}^{c} Σ_{j=1}^{n} (u_ij)^m · d(x_j, v_i)^2
The constraints of the objective function are:
Σ_{i=1}^{c} u_ij = 1, for every pixel j
u_ij ∈ [0, 1]
where c is the number of clusters, n is the total number of pixels of the NBRSWIR index difference map, m is the fuzzy index, u_ij is the membership degree of sample x_j to class i, j indexes the j-th pixel, x denotes the samples given by the NBRSWIR index difference map, i indexes the i-th cluster, v_i is the center of class i, and d(·) is a distance measure;
S322: randomly initializing the membership matrix u_ij and the cluster centers v_i;
S323: updating the membership matrix and the cluster centers, specifically:
u_ij = 1 / Σ_{k=1}^{c} ( d(x_j, v_i) / d(x_j, v_k) )^(2/(m−1))
v_i = ( Σ_{j=1}^{n} (u_ij)^m · x_j ) / ( Σ_{j=1}^{n} (u_ij)^m )
where k indexes the k-th cluster and v_k is the center of class k;
S324: if the objective function satisfies |J(t+1) − J(t)| < e, ending the iteration and entering step S325; otherwise, repeating step S323;
S325: according to the obtained membership matrix, taking the class with the maximum membership of each sample as its clustering result, finishing clustering, and thereby dividing the samples into a determined burnt area, an uncertain area and a determined unburnt area.
6. The method for detecting a change in a forest fire through unsupervised learning incorporating a priori knowledge as claimed in claim 1, wherein the step S4 specifically comprises:
S41: constructing two symmetric deep network branches, each a fully connected network consisting of an input layer, hidden layers and an output layer, with the same number of nodes in every hidden layer;
S42: initializing the parameters {θ_1, θ_2} of the two symmetric deep network branches;
S43: computing the projection features X_φ^train, Y_φ^train of the pre-fire and post-fire training samples X_train, Y_train after transformation by the deep network branches;
S44: calculating the loss function: according to slow feature analysis theory, invariant pixels correspond to the smallest eigenvalues, so the loss function L suppresses the variance of the invariant pixels by minimizing the generalized eigenvalues jointly; specifically:
L = tr(B^(−1) · A)
in the formula:
A = (1/n) · (X̂_φ^train − Ŷ_φ^train)(X̂_φ^train − Ŷ_φ^train)^T
B = (1/2n) · [ X̂_φ^train (X̂_φ^train)^T + Ŷ_φ^train (Ŷ_φ^train)^T ] + r·I
where X̂_φ^train and Ŷ_φ^train denote the centralized X_φ^train and Y_φ^train, e.g. X̂_φ^train = X_φ^train · (I − (1/n)·1), n is the number of pixels, 1 is a matrix whose elements are all 1, I is an identity matrix, r is a regularization constant, (·)^T denotes the transpose of a matrix, (·)^(−1) denotes the inverse of a matrix, and tr(·) is the trace of a matrix;
S45: calculating the gradients ∂L/∂θ_1 and ∂L/∂θ_2;
S46: updating the parameters {θ_1, θ_2} with a gradient descent algorithm;
S47: repeating steps S43–S46 until the maximum number of iterations is reached, obtaining the trained deep network branches.
7. The method for detecting a change in a forest fire through unsupervised learning incorporating a priori knowledge as claimed in claim 6, wherein the step S43 specifically comprises:
S431: for the pre-fire training sample X_train:
the output of the first hidden layer is expressed as:
H^1 = s(W^1 · X_train + b^1)
where X_train ∈ R^(m×n), m is the number of bands and n is the number of pixels; W^1 ∈ R^(h_1×m) is a weight matrix, b^1 ∈ R^(h_1) is a bias vector, h_1 is the number of nodes of the first hidden layer, and s(·) is the activation function;
the output of each subsequent hidden layer l is then expressed as:
H^l = s(W^l · H^(l−1) + b^l)
where W^l ∈ R^(h_l×h_(l−1)) and b^l ∈ R^(h_l);
finally, the projection feature after the network transformation is:
X_φ^train = s(W^o · H^l + b^o)
where o is the number of nodes of the output layer, W^o ∈ R^(o×h_l) is a weight matrix, h_l is the number of nodes of the last hidden layer, and b^o ∈ R^o is a bias vector;
S432: for the post-fire training sample Y_train, the projection feature Y_φ^train after transformation by the deep network branch is obtained in the same way as in step S431.
8. The method for detecting changes in forest fires through unsupervised learning and incorporating a priori knowledge as claimed in claim 1, wherein the step S6 specifically comprises:
S61: obtaining the means μ_(X_φ), μ_(Y_φ) and standard deviations σ_(X_φ), σ_(Y_φ) of the pre-fire and post-fire initial features X_φ, Y_φ;
S62: normalizing the features X_φ, Y_φ according to the means and standard deviations, specifically:
for the pre-fire initial feature X_φ, the initial feature obtained after the deep network transformation is normalized by the formula
X̂_φ^i = (X_φ^i − μ_(X_φ)) / σ_(X_φ)
where X̂_φ^i is the normalized value of the i-th pixel of X_φ, X_φ^i is the value of the i-th pixel of X_φ, μ_(X_φ) is the mean of the pixel values of X_φ, and σ_(X_φ) is the standard deviation of the pixel values of X_φ; the post-fire initial feature Y_φ is normalized in the same way;
S63: based on the normalized features X̂_φ, Ŷ_φ, obtaining the covariance matrix A of the feature difference between X_φ and Y_φ and the sum matrix B of the covariance matrices of X_φ and Y_φ, specifically:
the matrix A is:
A = (1/P) Σ_{i=1}^{P} (X̂_φ^i − Ŷ_φ^i)(X̂_φ^i − Ŷ_φ^i)^T
the matrix B is:
B = (1/2) [ (1/P) Σ_{i=1}^{P} X̂_φ^i (X̂_φ^i)^T + (1/P) Σ_{i=1}^{P} Ŷ_φ^i (Ŷ_φ^i)^T ]
where X̂_φ^i and Ŷ_φ^i are the normalized values of the i-th pixel of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps respectively, and P is the total number of pixels;
S64: solving the generalized eigenvalues of matrix A with respect to matrix B and the eigenvectors corresponding to the generalized eigenvalues, and sorting the corresponding eigenvectors by generalized eigenvalue from small to large to obtain an ordered eigenvector matrix;
S65: using the ordered eigenvector matrix to project the normalized features X̂_φ, Ŷ_φ into the feature space, and obtaining the feature differences of X_φ, Y_φ, specifically:
SFA = w^T · X̂_φ − w^T · Ŷ_φ
where SFA is the feature difference, w is the ordered eigenvector matrix, and X̂_φ, Ŷ_φ are the matrices of normalized pixel values of the initial features, after the deep network transformation, of the pre-fire and post-fire NBRSWIR index maps.
9. The method for detecting forest fire changes through unsupervised learning and incorporating priori knowledge as claimed in claim 1, wherein in step S7, the step of calculating chi-square distance of each pixel point comprises:
taking the square root of the sum, over bands, of the squared feature difference of each pixel divided by the corresponding generalized eigenvalue, with the calculation formula:
chi_i = sqrt( Σ_b ( (SFA_(b,i))^2 / λ_b ) )
where chi_i is the chi-square distance of the i-th pixel, SFA_(b,i) is the feature difference of the i-th pixel in band b, and λ_b is the generalized eigenvalue of band b.
10. The unsupervised learning forest fire change detection device integrated with the priori knowledge is characterized by comprising the following modules:
the image preprocessing module is used for acquiring double-time-phase remote sensing images of fire areas before and after a forest fire and preprocessing the double-time-phase remote sensing images;
the NBRSWIR index calculation module is used for calculating NBRSWIR indexes of the preprocessed double-time phase remote sensing images respectively so as to be integrated with priori knowledge, and an NBRSWIR index graph X, Y of the double-time phase remote sensing images is obtained;
a training sample acquisition module, for performing uncertainty analysis on the NBRSWIR index maps X, Y to acquire training samples X_train, Y_train;
The network training module is used for training the two symmetrical deep network branches through the training sample, and after the training is finished, the trained deep network branches are obtained;
an initial feature extraction module, configured to extract the initial features X_φ, Y_φ of the NBRSWIR index maps X, Y using the trained deep network branches;
a slow feature analysis module, configured to perform slow feature analysis on the initial features to obtain the feature differences of the initial features X_φ, Y_φ;
the chi-square distance calculation module is used for calculating the chi-square distance of each pixel point according to the characteristic difference value to obtain a variation intensity graph;
and the threshold segmentation module is used for performing K-means threshold segmentation on the change intensity map to obtain a final forest fire area.
CN202210317398.6A 2022-03-29 2022-03-29 Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge Pending CN114708550A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210317398.6A CN114708550A (en) 2022-03-29 2022-03-29 Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210317398.6A CN114708550A (en) 2022-03-29 2022-03-29 Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge

Publications (1)

Publication Number Publication Date
CN114708550A true CN114708550A (en) 2022-07-05

Family

ID=82171531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210317398.6A Pending CN114708550A (en) 2022-03-29 2022-03-29 Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge

Country Status (1)

Country Link
CN (1) CN114708550A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115393660A (en) * 2022-10-28 2022-11-25 松立控股集团股份有限公司 Parking lot fire detection method based on weak supervision collaborative sparse relationship ranking mechanism
CN115393660B (en) * 2022-10-28 2023-02-24 松立控股集团股份有限公司 Parking lot fire detection method based on weak supervision collaborative sparse relationship ranking mechanism
CN117218535A (en) * 2023-09-12 2023-12-12 黑龙江省网络空间研究中心(黑龙江省信息安全测评中心、黑龙江省国防科学技术研究院) SFA-based long-term forest coverage change detection method
CN117218535B (en) * 2023-09-12 2024-05-14 黑龙江省网络空间研究中心(黑龙江省信息安全测评中心、黑龙江省国防科学技术研究院) SFA-based long-term forest coverage change detection method

Similar Documents

Publication Publication Date Title
Zhang et al. A feature difference convolutional neural network-based change detection method
Fu et al. Hyperspectral anomaly detection via deep plug-and-play denoising CNN regularization
Zhan et al. Log-based transformation feature learning for change detection in heterogeneous images
Khazai et al. An approach for subpixel anomaly detection in hyperspectral images
Hou et al. Hyperspectral change detection based on multiple morphological profiles
Peerbhay et al. Random forests unsupervised classification: The detection and mapping of solanum mauritianum infestations in plantation forestry using hyperspectral data
Zhang et al. Automatic radiometric normalization for multitemporal remote sensing imagery with iterative slow feature analysis
CN114708550A (en) Unsupervised learning forest fire change detection method and unsupervised learning forest fire change detection device integrated with priori knowledge
Xie et al. New hyperspectral difference water index for the extraction of urban water bodies by the use of airborne hyperspectral images
Zhan et al. Unsupervised scale-driven change detection with deep spatial–spectral features for VHR images
Cheng et al. Total variation and sparsity regularized decomposition model with union dictionary for hyperspectral anomaly detection
Zhang et al. A mangrove recognition index for remote sensing of mangrove forest from space
CN106844739B (en) Remote sensing image change information retrieval method based on neural network collaborative training
Janowski et al. Exploration of glacial landforms by object-based image analysis and spectral parameters of digital elevation model
CN111698258A (en) WiFi-based environmental intrusion detection method and system
CN112199983A (en) Multi-level screening long-time large-range pedestrian re-identification method
CN115170961A (en) Hyperspectral image classification method and system based on deep cross-domain few-sample learning
Tian et al. Improving change detection in forest areas based on stereo panchromatic imagery using kernel MNF
Feng et al. Detection of urban built-up area change from Sentinel-2 images using multiband temporal texture and one-class random forest
Jenerowicz et al. Multifractality in humanitarian applications: a case study of internally displaced persons/refugee camps
Murtagh et al. Decision boundaries using Bayes factors: the case of cloud masks
Veracini et al. Fully unsupervised learning of Gaussian mixtures for anomaly detection in hyperspectral imagery
CN114067188A (en) Infrared polarization image fusion method for camouflage target
Kar et al. Classification of multispectral satellite images
Yang et al. Evaluating airborne hyperspectral imagery for mapping saltcedar infestations in west Texas

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination