CN110991248B - High-resolution noctilucent remote sensing image automatic change detection method based on feature fusion - Google Patents

High-resolution noctilucent remote sensing image automatic change detection method based on feature fusion

Info

Publication number
CN110991248B
CN110991248B (application CN201911065814.2A)
Authority
CN
China
Prior art keywords
mad
remote sensing
resolution
change
change detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911065814.2A
Other languages
Chinese (zh)
Other versions
CN110991248A (en)
Inventor
Liu Sicong (柳思聪)
Feng Yi (冯毅)
Tong Xiaohua (童小华)
Du Qian (杜谦)
Xie Huan (谢欢)
Wang Chao (王超)
Feng Yongjiu (冯永玖)
Jin Yanmin (金雁敏)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN201911065814.2A priority Critical patent/CN110991248B/en
Publication of CN110991248A publication Critical patent/CN110991248A/en
Application granted granted Critical
Publication of CN110991248B publication Critical patent/CN110991248B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V10/464Salient features, e.g. scale invariant feature transforms [SIFT] using a plurality of salient features, e.g. bag-of-words [BoW] representations
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an automatic change detection method for high-resolution noctilucent remote sensing images based on feature fusion, comprising the following steps: 1) acquiring two-phase high-resolution noctilucent remote sensing data before and after a short-time major event in a study area, and preprocessing the two phase images; 2) extracting multiple derived texture feature images from the preprocessed high-resolution noctilucent remote sensing data, and stacking them to construct a multiband feature image fusing the texture features; 3) performing change detection on the multiband feature image of step 2) using the multivariate alteration detection algorithm MAD and the iteratively reweighted algorithm IR-MAD, obtaining change intensity maps T_MAD and T_IR-MAD that fuse multiple features; 4) segmenting the change intensity maps T_MAD and T_IR-MAD respectively to obtain the corresponding binary change detection result maps. Compared with the prior art, the invention is suitable for processing spaceborne high-resolution LJ1-01 noctilucent remote sensing images and offers a high degree of automation, high accuracy, long time series, and large-area monitoring.

Description

High-resolution noctilucent remote sensing image automatic change detection method based on feature fusion
Technical Field
The invention relates to the field of multi-time-phase high-resolution noctilucent remote sensing image change detection application, in particular to a high-resolution noctilucent remote sensing image automatic change detection method based on feature fusion.
Background
Land-surface short-time major events are often dangerous, sudden, frequent, and difficult to remedy; once they occur, emergency managers find them hard to predict and warn against. Early detection of short-time major events by remote sensing techniques is critical to properly allocating management resources. However, existing change detection methods can be expensive and time consuming when applied to the extraction and monitoring of short-time major events. Noctilucent (nighttime-light) remote sensing imagery is highly sensitive to light-source information caused by short-time major events (especially fires, volcanic eruptions, explosions, and the like) while also providing normal nighttime illumination information of human activity at low cost; it is a unique earth observation source and a powerful supplement to conventional optical remote sensing monitoring. The Luojia-1 01 satellite (LJ1-01), a new-generation domestic high-resolution nighttime-light remote sensing satellite independently developed in China, was launched on June 2, 2018. Compared with the existing foreign DMSP/OLS and NPP/VIIRS noctilucent remote sensing data, it has clear advantages in spatial resolution (130 m), radiometric quantization (14 bits), and spatial detail.
Current research shows that LJ1-01 data are mainly applied in fields such as urban monitoring, socioeconomic analysis, and light pollution. Because noctilucent data form a single-band gray-level image, the applicable change detection methods are limited and poorly automated. The change detection capability of multi-temporal LJ1-01 noctilucent remote sensing images was initially evaluated by Li Xi et al. in China. However, that work was limited to coarse, simple change detection by band operations with manually set empirical thresholds, and its results were only validated qualitatively against high-resolution images, lacking a quantitative accuracy evaluation basis. How to fully exploit the unique advantages of LJ1-01 high-resolution noctilucent remote sensing images for change detection, and to realize their application potential for the automatic extraction and monitoring of land-surface short-time major events, is an important and urgent problem. The invention aims to construct an automatic change detection method for high-resolution noctilucent remote sensing images based on feature fusion, extracting and fusing multiple kinds of derived feature information so as to achieve automatic, high-accuracy extraction and monitoring of short-time major events.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a high-resolution noctilucent remote sensing image automatic change detection method based on feature fusion.
The aim of the invention can be achieved by the following technical scheme:
a high-resolution noctilucent remote sensing image automatic change detection method based on feature fusion comprises the following steps:
1) Acquiring two-time-phase high-resolution noctilucent remote sensing data before and after a short-time major event of a research area, and preprocessing front and rear time-phase remote sensing images;
2) Based on the preprocessed high-resolution noctilucent remote sensing data, extracting various derivative texture feature images, and superposing and constructing a multiband feature image fused with the texture features;
3) Performing change detection on the multiband feature image of step 2) using the multivariate alteration detection algorithm MAD and the iteratively reweighted algorithm IR-MAD, obtaining change intensity maps T_MAD and T_IR-MAD that fuse multiple features;
4) Segmenting the change intensity maps T_MAD and T_IR-MAD respectively to obtain the corresponding binary change detection result maps.
In step 1), the obtained two-phase high-resolution noctilucent remote sensing data are preprocessed sequentially by image cropping and radiometric correction.
The high-resolution noctilucent remote sensing data are remote sensing data acquired by the Luojia-1 01 satellite (LJ1-01).
The step 2) is specifically as follows:
according to the preprocessed high-resolution noctilucent remote sensing data, five derived texture feature images based on probability statistics (data range, mean 1, variance, information entropy, and skewness) and eight derived texture feature images based on the second-order matrix (mean 2, variance, homogeneity, contrast, dissimilarity, information entropy, second moment, and correlation) are extracted; by constructing 2D scatter plots of the texture difference images against the original gray-level difference image, the four types of texture feature images with the highest correlation coefficients are obtained, namely data range, mean 1, mean 2, and dissimilarity; these four texture feature images are stacked with the original gray-level image to form, for each of the two phases, a multiband feature image with five bands.
In the step 3), var { a } is satisfied T X}=Var{b T Maximizing Var { a } = 1 T X-b T Under the constraint of Y, the intensity pattern T is changed MAD Expressed as:
Figure BDA0002259302670000031
Figure BDA0002259302670000032
wherein MAD is MAD variable composed of variables obtained by subtracting corresponding typical variables in reverse order, a and b are projection vectors of multi-band images X and Y of two phases respectively, P is image dimension,
Figure BDA0002259302670000033
variance for the ith MAD variable, +.>
Figure BDA0002259302670000034
For a chi-square distribution with degrees of freedom p, subscripts i and j are the number of MAD variables and the number of pixels, respectively, and if no change occurs at the jth pixel, the ith MAD variable MAD ij Is 0.
In step 3), the IR-MAD algorithm introduces a weight $w_j$ into the computation of the means and variances, iterating repeatedly until the canonical correlation coefficients converge; the change intensity map T_IR-MAD is expressed as:

$$T_{\mathrm{IR\text{-}MAD},j} = \sum_{i=1}^{p} \left( \frac{\mathrm{IRMAD}_{ij}}{\sigma_{\mathrm{IRMAD}_i}} \right)^{2} \sim \chi^{2}(p)$$

$$\mathrm{IRMAD}_i = a_i'^{T} X - b_i'^{T} Y$$

where a' and b' are the projection vectors of the two-phase multiband images X and Y obtained by the weighted iterative processing; $\sigma^2_{\mathrm{IRMAD}_i}$ is the variance of the i-th IR-MAD variable; $\chi^2(p)$ denotes a chi-square distribution with p degrees of freedom; the subscripts i and j index the IR-MAD variables and the pixels, respectively.
In step 4), the fuzzy C-means clustering algorithm FCM is used to segment the change intensity maps T_MAD and T_IR-MAD respectively, obtaining the corresponding binary change detection result maps.
In step 4), in the binary change detection result maps, the value 1 is white and represents a changed area, while the value 0 is black and represents an unchanged area, so that automatic extraction and monitoring of short-time major events is achieved through change detection.
Compared with the prior art, the invention has the following advantages:
1. Compared with traditional optical-image-based extraction and monitoring of short-time major events, the feature-fusion-based automatic change detection method is highly sensitive to light-source information; high-resolution noctilucent remote sensing images reflect finer spatial details, and the extracted change areas better match actual conditions.
2. By extracting different texture features of the noctilucent remote sensing image, the method mines different aspects of the image information, making the change detection result more accurate.
3. By applying the MAD and IR-MAD algorithms to the feature-fused noctilucent remote sensing image, the change detection process does not depend on any prior knowledge or samples, achieving automation of short-time major event change information extraction.
Drawings
FIG. 1 is a flow chart of the overall method of the present invention.
Fig. 2 is a verification reference sample of a fire area.
Fig. 3 is four types of texture feature images extracted from two-time-phase LJ1-01 noctilucent remote sensing images, wherein fig. 3a-d are respectively front-time-phase "data range, mean 1, mean 2, dissimilarity" texture images, and fig. 3e-h are respectively rear-time-phase "data range, mean 1, mean 2, dissimilarity" texture images.
Fig. 4 is a 2D scattergram of four texture difference images and gray difference images, wherein fig. 4a is a 2D scattergram of a "data range" difference image and gray difference image, fig. 4b is a 2D scattergram of a "mean 1" difference image and gray difference image, fig. 4c is a 2D scattergram of a "mean 2" difference image and gray difference image, and fig. 4D is a 2D scattergram of a "dissimilarity" difference image and gray difference image.
FIG. 5 shows the change intensity maps T_MAD and T_IR-MAD obtained using the MAD and IR-MAD algorithms on LJ1-01 data, where fig. 5a is the change intensity map T_MAD and fig. 5b is the change intensity map T_IR-MAD.
FIG. 6 shows the binary change detection result maps obtained from the change intensity maps T_MAD and T_IR-MAD by the FCM clustering algorithm, where fig. 6a is the MAD binary change detection result map and fig. 6b is the IR-MAD binary change detection result map.
Fig. 7 is a graph showing a change analysis of the map of the MAD and IR-MAD binary change detection result, wherein fig. 7a is a graph showing a change analysis of the MAD binary map, and fig. 7b is a graph showing a change analysis of the IR-MAD binary map.
Fig. 8 shows enlarged change details of local areas 9, 10, and 11 in the two LJ1-01 binary change detection maps, where fig. 8a is MAD result area 9, fig. 8b is IR-MAD result area 9, fig. 8c is MAD result area 10, fig. 8d is IR-MAD result area 10, fig. 8e is MAD result area 11, and fig. 8f is IR-MAD result area 11.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples.
As shown in fig. 1, the invention provides a high-resolution noctilucent remote sensing image automatic change detection method based on feature fusion, which mainly comprises the following four steps:
(1) Noctilucent remote sensing image preprocessing
The acquired LJ1-01 noctilucent remote sensing image is radiometrically corrected according to the absolute radiometric correction formula, converting gray values into radiance values:

$$L = DN^{3/2} \times 10^{-10}$$

where L is the radiance after absolute radiometric correction, in W/(m²·sr·μm), and DN is the image gray value. Finally, the radiance is converted to units of nW/(m²·sr·μm).
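The conversion above can be sketched in a few lines; the function name and NumPy implementation are illustrative — only the formula and units come from the text:

```python
import numpy as np

def dn_to_radiance(dn, to_nw=True):
    """Absolute radiometric correction for LJ1-01: L = DN^(3/2) * 1e-10,
    in W/(m^2 sr um); optionally rescale to nW/(m^2 sr um)."""
    dn = np.asarray(dn, dtype=np.float64)
    radiance = np.power(dn, 1.5) * 1e-10   # W/(m^2 sr um)
    if to_nw:
        radiance *= 1e9                    # -> nW/(m^2 sr um)
    return radiance
```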
(2) Extraction and superposition of multiple derived texture features
Based on the preprocessed LJ1-01 data, five derived texture feature images based on probability statistics (data range, mean 1, variance, information entropy, and skewness) and eight derived texture feature images based on the second-order matrix (mean 2, variance, homogeneity, contrast, dissimilarity, information entropy, second moment, and correlation) are extracted. By constructing 2D scatter plots of the texture difference images against the original gray-level difference image (fig. 4), the four types of texture feature images with the highest correlation coefficients are selected: data range, mean 1, mean 2, and dissimilarity. These are stacked with the original gray-level image to form, for each of the two phases, a multiband feature image with five bands.
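As one illustration, the five probability-statistics (first-order) textures can be computed with a sliding window; the 3×3 window and function names below are assumptions, since the text does not specify the window size, and the entropy helper assumes non-negative integer gray values:

```python
import numpy as np
from scipy import ndimage, stats

def first_order_textures(img, size=3):
    """Sliding-window first-order textures: data range, mean, variance,
    information entropy, and skewness (a sketch; window size illustrative)."""
    def entropy(w):
        # Empirical Shannon entropy of the window's gray-value histogram.
        p = np.bincount(w.astype(np.int64)) / w.size
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    imgf = img.astype(float)
    return {
        "data_range": ndimage.generic_filter(imgf, np.ptp, size=size),
        "mean":       ndimage.uniform_filter(imgf, size=size),
        "variance":   ndimage.generic_filter(imgf, np.var, size=size),
        "entropy":    ndimage.generic_filter(imgf, entropy, size=size),
        "skewness":   ndimage.generic_filter(imgf, stats.skew, size=size),
    }
```

The five resulting feature images have the same shape as the input and can be stacked band-wise with the original gray-level image.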
(3) Change information fusion and intensity map construction
Change information fusion and intensity map construction are performed on the texture-stacked multiband LJ1-01 noctilucent remote sensing images by means of the MAD and IR-MAD algorithms.
The mathematical essence of MAD is canonical correlation analysis (Canonical Correlation Analysis, CCA) combined with band differencing; it is a change detection algorithm based on the criterion of maximizing the variance of the projected feature difference.
Assume the two multiband images X and Y of different phases each contain n bands, with image dimensions p and q respectively, and let $a = [a_1, a_2, \dots, a_p]^T$ and $b = [b_1, b_2, \dots, b_q]^T$ (p ≤ q) denote the projection vectors of X and Y. MAD can then be expressed as the optimization problem:

$$\max_{a,b}\ \mathrm{Var}(a^{T}X - b^{T}Y) \quad \text{s.t.} \quad \mathrm{Var}(a^{T}X) = \mathrm{Var}(b^{T}Y) = 1$$
var (a) can be obtained from the constraint T X-b T Y)=2(1-ρ(a T X,b T Y)) based on the CCA solving method to obtain the corresponding eigenvalue ρ 2 And feature vectors a, b. And the eigenvalues are arranged in reverse order to meet the optimization objective Var (a T X-b T Y) is maximum. After solving for a, b, the final MAD variable can be calculated by:
Figure BDA0002259302670000063
since the MAD variable is a linear combination of X and Y, the MAD variable approximately satisfies a Gaussian distribution according to the central limit theorem, the sum of the square of the MAD variable divided by the variance is distributed from the chi-square with the degree of freedom p, thereby obtaining a variation intensity map T MAD
Figure BDA0002259302670000064
Wherein, the liquid crystal display device comprises a liquid crystal display device,
Figure BDA0002259302670000065
variance for the ith MAD variable, +.>
Figure BDA0002259302670000066
Is a degree of freedom->
Figure BDA0002259302670000067
If no change occurs at the jth pixel, the ith MAD variable (i.e., MAD ij ) Is 0.
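As an illustration of the construction above, the following NumPy/SciPy sketch solves the CCA eigenproblem and builds T_MAD; the (bands × pixels) data layout and function name are assumptions, and $\sigma^2_{\mathrm{MAD}_i} = 2(1 - \rho_i)$ follows from the unit-variance constraints:

```python
import numpy as np
from scipy.linalg import eigh

def mad_transform(X, Y):
    """MAD change detection for two co-registered images.
    X, Y: arrays of shape (p, N) -- p bands, N pixels (a sketch)."""
    n = X.shape[1]
    Xc = X - X.mean(axis=1, keepdims=True)
    Yc = Y - Y.mean(axis=1, keepdims=True)
    Sxx = Xc @ Xc.T / (n - 1)
    Syy = Yc @ Yc.T / (n - 1)
    Sxy = Xc @ Yc.T / (n - 1)
    # CCA as a generalized eigenproblem: Sxy Syy^-1 Syx a = rho^2 Sxx a.
    # eigh returns eigenvalues in ascending order, i.e. canonical
    # correlations in reverse order -- exactly the MAD ordering
    # (largest difference variance first). Eigenvectors satisfy
    # a^T Sxx a = 1, matching the unit-variance constraint.
    rho2, A = eigh(Sxy @ np.linalg.solve(Syy, Sxy.T), Sxx)
    rho = np.sqrt(np.clip(rho2, 0.0, 1.0))
    # b_i proportional to Syy^-1 Syx a_i, normalized to unit variance.
    B = np.linalg.solve(Syy, Sxy.T @ A)
    B /= np.sqrt(np.sum(B * (Syy @ B), axis=0))
    mad = A.T @ Xc - B.T @ Yc                  # (p, N) MAD variables
    sigma2 = 2.0 * (1.0 - rho)                 # Var(MAD_i) under the constraints
    T = np.sum((mad / np.sqrt(sigma2)[:, None]) ** 2, axis=0)
    return mad, T
```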
The iteratively reweighted MAD (Iteratively Reweighted MAD, IR-MAD) algorithm builds on the MAD algorithm by performing weighted iterations according to the chi-square distance of the difference image. Each pixel receives its no-change probability as a weight:

$$w_j = \Pr\{\chi^{2}(p) > T_j\}$$

The weights $w_j$ then enter the next computation of the means and variances, and the iteration repeats until the canonical correlation coefficients converge. After the weighted iteration, a set of eigenvectors a', b' is obtained, and the change intensity map T_IR-MAD follows the same construction as the MAD variables and intensity map:

$$\mathrm{IRMAD}_i = a_i'^{T} X - b_i'^{T} Y$$

$$T_{\mathrm{IR\text{-}MAD},j} = \sum_{i=1}^{p} \left( \frac{\mathrm{IRMAD}_{ij}}{\sigma_{\mathrm{IRMAD}_i}} \right)^{2} \sim \chi^{2}(p)$$
where $\sigma^2_{\mathrm{IRMAD}_i}$ is the variance of the i-th IR-MAD variable and $\chi^2(p)$ is the chi-square distribution with p degrees of freedom. If no change occurs at the j-th pixel, the i-th IR-MAD variable $\mathrm{IRMAD}_{ij}$ is 0.
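The weighted iteration can be sketched end-to-end; this is a minimal sketch, not the patented implementation — the no-change probability $w_j = \Pr\{\chi^2(p) > T_j\}$ is the standard IR-MAD weight, and the convergence test on the canonical correlations is illustrative:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.stats import chi2

def irmad(X, Y, max_iter=30, tol=1e-6):
    """IR-MAD: iterate MAD with chi-square no-change weights.
    X, Y: (p, N) band-by-pixel arrays (a sketch)."""
    p, n = X.shape
    w = np.ones(n)                       # initial weights: plain MAD
    rho_old = np.zeros(p)
    for _ in range(max_iter):
        sw = w.sum()
        mx = (X * w).sum(axis=1, keepdims=True) / sw
        my = (Y * w).sum(axis=1, keepdims=True) / sw
        Xc, Yc = X - mx, Y - my
        # Weighted covariance matrices.
        Sxx = (Xc * w) @ Xc.T / sw
        Syy = (Yc * w) @ Yc.T / sw
        Sxy = (Xc * w) @ Yc.T / sw
        rho2, A = eigh(Sxy @ np.linalg.solve(Syy, Sxy.T), Sxx)
        rho = np.sqrt(np.clip(rho2, 0.0, 1.0))
        B = np.linalg.solve(Syy, Sxy.T @ A)
        B /= np.sqrt(np.sum(B * (Syy @ B), axis=0))
        mad = A.T @ Xc - B.T @ Yc
        sigma2 = 2.0 * (1.0 - rho)
        T = np.sum((mad / np.sqrt(sigma2)[:, None]) ** 2, axis=0)
        w = chi2.sf(T, df=p)             # no-change probability as new weight
        if np.abs(rho - rho_old).max() < tol:
            break
        rho_old = rho
    return mad, T, w
```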
(4) Fuzzy C-means clustering to generate binary change detection result maps
The fuzzy C-means (FCM) clustering algorithm is applied to the change intensity maps T_MAD and T_IR-MAD respectively, segmenting each into its binary change detection result map. The value 1 is white and represents a changed area, while the value 0 is black and represents an unchanged area, so that automatic extraction and monitoring of short-time major events is achieved through change detection.
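A minimal two-class FCM segmentation of a change intensity map might look as follows (the fuzzifier m = 2 and min/max center initialization are conventional assumptions, not specified in the text):

```python
import numpy as np

def fcm_binarize(T, m=2.0, max_iter=100, tol=1e-6):
    """Two-class fuzzy C-means on a change-intensity map; returns a 0/1
    change mask (a sketch; m=2 and min/max init are conventional)."""
    x = np.asarray(T, dtype=float).ravel()
    centers = np.array([x.min(), x.max()])
    for _ in range(max_iter):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        # Membership u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        new_centers = (u ** m).T @ x / (u ** m).sum(axis=0)
        if np.abs(new_centers - centers).max() < tol:
            centers = new_centers
            break
        centers = new_centers
    change = int(np.argmax(centers))     # higher-intensity cluster = changed
    return (np.argmax(u, axis=1) == change).astype(np.uint8)
```

The mask can be reshaped back to the image grid, with 1 (white) marking changed pixels and 0 (black) unchanged ones.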
Examples:
1. experimental data
LJ1-01 noctilucent remote sensing images are used as the main data source (source: http://www.hbeos.org.cn/). The study target was the "Camp Fire" forest fire event that occurred on November 8, 2018 in the town of Paradise, Butte County, northern California.
TABLE 1. Details of the LJ1-01 data for the Camp Fire area (table reproduced as an image in the original)
2. Experimental results
(1) Single feature difference variation detection result analysis
Difference operations are carried out between the four types of texture feature images (fig. 3) extracted from the two-phase high-resolution LJ1-01 noctilucent remote sensing images and between the original gray-level images, yielding 4 texture difference images and 1 gray-level difference image. Adaptive threshold segmentation with the Otsu method (OTSU) is applied to the 5 single-feature difference images, yielding 5 single-feature binary change detection maps. Accuracy evaluation (Table 2) is obtained by constructing confusion matrices between the 5 single-feature binary change detection maps and the verification sample (fig. 2). The results show that all four single texture features achieve higher Kappa coefficients and overall accuracy than the original single-band image; that is, the texture features extract more reliable and accurate results than the original noctilucent gray-level feature.
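The Otsu step can be sketched directly from the histogram; this is an illustrative implementation of the classical between-class-variance criterion, not the authors' code:

```python
import numpy as np

def otsu_threshold(diff, bins=256):
    """Otsu adaptive threshold on a single-feature difference image;
    returns the threshold maximizing between-class variance (a sketch)."""
    hist, edges = np.histogram(np.asarray(diff).ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                    # class-0 (below threshold) probability
    mu = np.cumsum(p * centers)          # cumulative mean
    mu_t = mu[-1]                        # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return centers[int(np.argmax(sigma_b))]
```

Pixels above the returned threshold are labeled as changed, giving one binary change detection map per single-feature difference image.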
Table 2. Change detection accuracy evaluation based on the original noctilucent gray-level feature and the derived texture features (table reproduced as an image in the original)
(2) Analysis of change detection results using the MAD and IR-MAD algorithms
The change detection results for the noctilucent remote sensing images of the California fire area are shown in figs. 5 and 6. Both algorithms extract the change information to a certain extent, and the main change areas they detect are similar. Because the IR-MAD algorithm iteratively updates the weights, it suppresses background noise more effectively than the MAD algorithm. The binary change detection result maps obtained by the two algorithms are analyzed in detail. By comparison with the original gray-level difference image, areas with positive gray values on the difference image are regarded as the main fire areas of the study area (mainly forest, the main burning area), i.e., areas of increased brightness, marked on the binary change detection result maps with red circles (1-8); areas with negative gray values are regarded as auxiliary fire areas (mainly urban and residential areas where lighting decreased due to the destruction of power facilities and the like), i.e., areas of decreased brightness, marked with blue circles (9-14); the results are shown in fig. 7. Overall, both algorithms extract the main fire areas 1-8 (red) with little difference between them. The auxiliary fire areas 9-14 (blue) are also extracted, with some differences between the results. The enlarged details of auxiliary fire areas 9, 10, and 11 (fig. 8) show that the two algorithms extract approximately the same areas, but the MAD algorithm extracts more change areas than IR-MAD, being more sensitive to brightness changes in the background.
Confusion matrices are constructed between the binary change detection result maps of the two algorithms and the verification sample (fig. 2). The accuracy evaluation (Table 3) shows that after fusing multiple texture features, both algorithms clearly improve overall accuracy and Kappa coefficient compared with any single texture feature. The overall accuracy and Kappa coefficient of MAD are higher than those of IR-MAD, and while keeping commission errors in the change area under control, the omission error of the MAD method is lower than that of IR-MAD. Analysis shows that the omission errors mainly come from the auxiliary fire areas: their land-cover types are mainly buildings and residential areas with abundant fine detail, and a major cause of omission is that IR-MAD, while iteratively filtering background information, treats areas that actually carry nighttime-light change information as noise. The main fire area lies in forest, where the change extent is larger and more concentrated and there is no crisscrossing texture detail, so the detected change area is more accurate. In summary, for extracting change information in areas with prominent detail (i.e., a study area with complex terrain and small, scattered ground features), MAD is the more suitable detector; for areas with de-emphasized detail and a dense, concentrated change extent (i.e., flat terrain with concentrated, homogeneous ground features), IR-MAD effectively suppresses background noise and achieves effective detection.
Table 3. Change detection accuracy evaluation of the MAD and IR-MAD algorithms in the method (table reproduced as images in the original)
In summary, a series of qualitative and quantitative experimental analyses proves that the feature-fusion-based automatic change detection method for high-resolution noctilucent remote sensing images achieves high practical change detection performance, operability, and automation, does not depend on any prior knowledge or samples, and can be applied to the automatic extraction and monitoring of short-time major events in high-resolution noctilucent remote sensing images.

Claims (5)

1. The high-resolution noctilucent remote sensing image automatic change detection method based on feature fusion is characterized by comprising the following steps of:
1) Acquiring two-time-phase high-resolution noctilucent remote sensing data before and after a short-time major event of a research area, and preprocessing front and rear time-phase remote sensing images;
2) Based on the preprocessed high-resolution noctilucent remote sensing data, extracting various derivative texture feature images, and superposing and constructing a multiband feature image fused with the texture features;
3) performing change detection on the multiband feature image of step 2) using the multivariate alteration detection algorithm MAD and the iteratively reweighted algorithm IR-MAD, obtaining change intensity maps T_MAD and T_IR-MAD that fuse multiple features;
4) segmenting the change intensity maps T_MAD and T_IR-MAD respectively to obtain the corresponding binary change detection result maps;
the step 2) is specifically as follows:
according to the preprocessed high-resolution noctilucent remote sensing data, extracting five derived texture feature images based on probability statistics, comprising a data range, a mean 1, a variance, an information entropy and a skewness, and eight derived texture feature images based on a second-order matrix, comprising a mean 2, a variance, a homogeneity, a contrast, a dissimilarity, an information entropy, a second moment and a correlation; obtaining the four types of texture feature images with the highest correlation coefficients, namely the data range, mean 1, mean 2 and dissimilarity, and stacking them with the original gray-level images to form, for each of the two phases, a multiband feature image with five bands;
in the step 3), var { a } is satisfied T X}=Var{b T Maximizing Var { a } = 1 T X-b T Under the constraint of Y, the intensity pattern T is changed MAD Expressed as:
Figure FDA0004105796880000011
Figure FDA0004105796880000012
wherein MAD is MAD variable, a and b are projection vectors of multiband images X and Y of two phases respectively, P is image dimension,
Figure FDA0004105796880000025
variance for the ith MAD variable, +.>
Figure FDA0004105796880000023
Is a degree of freedom->
Figure FDA0004105796880000024
The subscripts i and j are the MAD variable number and the pixel number, respectively, and if no change occurs at the jth pixel, the ith MAD variable MAD ij Is 0;
in step 3), the IR-MAD algorithm introduces a weight $w_j$ into the computation of the means and variances, iterating repeatedly until the canonical correlation coefficients converge; the change intensity map T_IR-MAD is expressed as:

$$T_{\mathrm{IR\text{-}MAD},j} = \sum_{i=1}^{p} \left( \frac{\mathrm{IRMAD}_{ij}}{\sigma_{\mathrm{IRMAD}_i}} \right)^{2} \sim \chi^{2}(p)$$

$$\mathrm{IRMAD}_i = a_i'^{T} X - b_i'^{T} Y$$

where IRMAD denotes the IR-MAD variables; a' and b' are the projection vectors of the two-phase multiband images X and Y obtained by the weighted iterative processing; $\sigma^2_{\mathrm{IRMAD}_i}$ is the variance of the i-th IR-MAD variable; $\chi^2(p)$ denotes a chi-square distribution with p degrees of freedom; the subscripts i and j index the IR-MAD variables and the pixels, respectively.
2. The method for automatically detecting changes in high-resolution noctilucent remote sensing images based on feature fusion according to claim 1, wherein in step 1), the obtained two-phase high-resolution noctilucent remote sensing data are preprocessed sequentially by image cropping and radiometric correction.
3. The method for automatically detecting changes in high-resolution noctilucent remote sensing images based on feature fusion according to claim 2, wherein the high-resolution noctilucent remote sensing data are remote sensing data acquired by the Luojia-1 01 satellite (LJ1-01).
4. The method for automatically detecting changes in high-resolution noctilucent remote sensing images based on feature fusion according to claim 1, wherein in the step 4), a fuzzy C-means clustering algorithm (FCM) is adopted to segment the change intensity maps T_MAD and T_IR-MAD respectively, obtaining the corresponding binary change detection result maps.
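A minimal fuzzy C-means segmentation of a change-intensity map into two classes can be sketched in NumPy; the fuzzifier m = 2, the min/max centre initialization and the convergence settings are illustrative choices, not values stated in the patent:

```python
import numpy as np

def fcm_two_class(T, m=2.0, n_iter=100, tol=1e-6):
    """Fuzzy C-means with C = 2 on a change-intensity map.
    Returns a binary map: 1 = change (cluster with the larger centre)."""
    x = T.ravel().astype(np.float64)
    c = np.array([x.min(), x.max()])          # initial cluster centres
    for _ in range(n_iter):
        d = np.abs(x[None, :] - c[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1))             # membership update
        u /= u.sum(axis=0)
        um = u ** m
        c_new = (um @ x) / um.sum(axis=1)     # centre update
        if np.abs(c_new - c).max() < tol:
            c = c_new
            break
        c = c_new
    labels = np.argmax(u, axis=0)             # defuzzify: highest membership
    change = int(np.argmax(c))                # larger centre = change cluster
    return (labels == change).astype(np.uint8).reshape(T.shape)
```

Applying this to T_MAD and T_IR-MAD yields the two binary change detection result maps the claim refers to.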
5. The method for automatically detecting changes in high-resolution noctilucent remote sensing images based on feature fusion according to claim 1, wherein in the step 4), in the binary change detection result map, the value 1 is shown in white and represents a changed area, and the value 0 is shown in black and represents an unchanged area, so that automatic extraction and monitoring of short-term significant events are realized through change detection.
CN201911065814.2A 2019-11-04 2019-11-04 High-resolution noctilucent remote sensing image automatic change detection method based on feature fusion Active CN110991248B (en)


Publications (2)

Publication Number Publication Date
CN110991248A CN110991248A (en) 2020-04-10
CN110991248B true CN110991248B (en) 2023-05-05






Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant