CN103198482B - Remote sensing image change detection method based on difference map fuzzy membership fusion - Google Patents


Info

Publication number
CN103198482B
Authority
CN
China
Prior art keywords
value
difference
pixel
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310117627.0A
Other languages
Chinese (zh)
Other versions
CN103198482A (en
Inventor
王桂婷
焦李成
刘博伟
公茂果
钟桦
王爽
张小华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University
Priority to CN201310117627.0A
Publication of CN103198482A
Application granted
Publication of CN103198482B


Landscapes

  • Image Processing (AREA)

Abstract

The present invention discloses a remote sensing image change detection method based on difference map fuzzy membership fusion, which mainly solves the problem that existing change detection methods cannot both effectively remove pseudo-change information and retain edge information. The implementation process is: input two remote sensing images of different time phases and calculate the structural similarity coefficients of their corresponding pixel points to obtain a similarity difference map; subtract the two remote sensing images to obtain a difference map; assign category labels to the pixels of the difference map to obtain a category label map; filter the difference map according to the category label map to obtain a denoised difference map; and perform fuzzy membership fusion classification on the similarity difference map and the denoised difference map to obtain the change detection result. The invention has strong noise immunity, effectively removes pseudo-change information while retaining good edge information, and achieves high detection accuracy. It can be used for urban sprawl monitoring and for forest and vegetation change monitoring.

Description

Remote sensing image change detection method based on difference map fuzzy membership degree fusion
Technical Field
The invention belongs to the field of digital image processing, mainly relates to remote sensing image change detection, and particularly relates to a remote sensing image change detection method based on difference map fuzzy membership fusion, which can be used for remote sensing image analysis and processing.
Background
The change detection of the remote sensing image is a process for identifying the state change or phenomenon change of an object by analyzing and extracting the electromagnetic spectrum characteristic difference or the space structure characteristic difference existing between the remote sensing images in different time phases in the same region. The method is widely applied to various fields of national economy and national defense construction, such as agricultural investigation, forest and vegetation change monitoring, urban area expansion monitoring, military target monitoring and the like.
A common approach to remote sensing image change detection is to construct a difference map and then select a suitable threshold to divide it into a changed class and an unchanged class. The construction and subsequent processing of the difference map are therefore key steps of image change detection. The simplest construction method is differencing, which is easy to implement but leaves considerable noise in the resulting difference map, so an effective method is needed to handle that noise. Removing strong noise from the difference map while enhancing changed pixels with small gray values effectively improves the quality of the difference map and makes the detection result more accurate.
To construct better difference maps, some scholars measure the similarity of the gray values of corresponding pixels in the two time-phase images. Inglada and Mercier (2007), in the article "A New Statistical Similarity Measure for Change Detection in Multitemporal SAR Images and Its Extension to Multiscale Change Analysis, IEEE Transactions on Geoscience and Remote Sensing, 2007, 45(5):1432-1445", propose a SAR image change detection method based on statistical similarity: the Kullback-Leibler divergence measures the statistical similarity of the neighborhoods of corresponding pixels of the two time-phase images to construct a difference map, which is then thresholded to obtain the change result. Because the method models the histogram of a local region, and the few pixels of a local region are difficult to model effectively, its detection results are poor. He (2010), in the article "Application of Euclidean norm in Multi-temporal Remote Sensing Image Change Detection, International Congress on Image and Signal Processing (CISP'2010), 2010, 5:2111-2115", proposes a change detection method based on Euclidean distance, which constructs a difference map by computing the Euclidean distance between two time-phase multi-band remote sensing images and then thresholds the difference map to obtain the change result.
To combine the advantages of different difference maps, some scholars fuse them. Maturian et al. (2006), in the article "Remote sensing image change detection based on fusion and a generalized Gaussian model, Journal of Remote Sensing, 2006, 10(6):847-853", propose fusing a difference map and a ratio map with a product fusion strategy; the method suppresses the background and enhances the changed area to some extent, but it is unstable and sometimes suppresses the changed area instead. Wangmen and Zhang Xingyue (2010), in the article "Change detection of remote sensing images with multi-feature evidence fusion, Journal of Remote Sensing, 2010, 14(2):1-7", propose fusing multiple feature difference maps with an evidence-theory fusion method; this improves the detection precision of single-feature detection methods, but the structural similarity measure it adopts is unstable, which lowers the accuracy of the detection result. Du et al. (2012), in the article "Fusion of Difference Images for Change Detection over Urban Areas, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2012, 5(4):1076-1086", propose a change detection method with feature-level and decision-level fusion of multiple difference maps; it combines the advantages of several difference maps and improves detection accuracy, but because the fusion is not targeted to the strengths and weaknesses of each difference map, the improvement is limited and the accuracy sometimes even decreases.
Disclosure of Invention
The invention aims to overcome the defects of the above change detection techniques and provides a remote sensing image change detection method based on difference map fuzzy membership fusion, so as to reduce the influence of noise on the detection result, reduce pseudo-change information in the detection result, and improve its accuracy.
In order to achieve the above object, the detection method of the present invention comprises the steps of:
(1) Input two registered remote sensing images X1 and X2 of size I×J, acquired at different time phases over the same area; calculate the structural similarity coefficient SIM(m,n) of the corresponding pixel points of X1 and X2 to obtain a similarity coefficient matrix SIM:
SIM={SIM(m,n)|1≤m≤I,1≤n≤J}
wherein,

$$\mathrm{SIM}(m,n)=\lambda\frac{2\mu_1(m,n)\cdot\mu_2(m,n)+C}{\mu_1^2(m,n)+\mu_2^2(m,n)+C}+(1-\lambda)\frac{2\sigma_1(m,n)\cdot\sigma_2(m,n)+C}{\sigma_1^2(m,n)+\sigma_2^2(m,n)+C},$$
where m and n are the row and column indices of the image, m=1,2,…,I, n=1,2,…,J; μ1(m,n), σ1(m,n) and σ1²(m,n) are respectively the mean, standard deviation and variance of the pixel values of the local region of remote sensing image X1 centered at pixel point (m,n) with window size w×w; μ2(m,n), σ2(m,n) and σ2²(m,n) are the corresponding quantities for remote sensing image X2; the window size w ranges from 3 to 9; C is a constant with C>0 that avoids instability when the denominator is close to zero; λ is a weight coefficient; and the structural similarity coefficient satisfies 0≤SIM(m,n)≤1;
(2) Linearly map the structural similarity coefficient SIM(m,n) of the similarity coefficient matrix SIM at position (m,n) to the interval [0,255], obtaining the pixel gray value XS(m,n) at position (m,n) of the similarity difference map XS:
XS(m,n)=(1-SIM(m,n))×255,
(3) Compute the absolute difference of the gray values X1(m,n) and X2(m,n) of the pixel points at corresponding spatial positions (m,n) of remote sensing images X1 and X2, obtaining the difference Xd(m,n)=|X1(m,n)-X2(m,n)|; computing this difference sequentially from left to right and top to bottom yields the difference map Xd:
Xd={Xd(m,n)|1≤m≤I,1≤n≤J},
(4) Apply median filtering with a 3×3 window to the pixels of the difference map Xd to obtain a filtered difference map Xf; perform threshold segmentation on the statistical histogram of Xf to obtain an initial classification map Xm;
(5) According to the gray value range of the filtered difference map Xf and the initial classification map Xm, assign category labels to the pixel points of the difference map Xd to obtain a category label map Xb;
(6) According to the label at position (m,n) in the category label map Xb, filter the pixel at position (m,n) of the difference map Xd to obtain a denoised difference map XN;
(7) Calculate the changed-class membership Hc,XS(m,n) and unchanged-class membership Hu,XS(m,n) of the pixel at position (m,n) of the similarity difference map XS, and the changed-class membership Hc,XN(m,n) and unchanged-class membership Hu,XN(m,n) of the pixel at position (m,n) of the denoised difference map XN;
(8) Calculate the fused changed-class membership Hc(m,n) of the memberships Hc,XS(m,n) and Hc,XN(m,n) at corresponding pixel points (m,n) of the similarity difference map XS and the denoised difference map XN, and then the fused unchanged-class membership Hu(m,n) of Hu,XS(m,n) and Hu,XN(m,n);
(9) Create a fused image XI of the same size as the similarity difference map XS; if Hc(m,n)>Hu(m,n), set the value of the fused image XI at that pixel point to 1, otherwise to 0, thereby obtaining the change detection result image.
Compared with the prior art, the invention has the following advantages:
1. The structural similarity measure proposed by the invention stably and effectively measures the similarity between the pixels of the two time-phase images, and the resulting similarity difference map suppresses background noise and enhances the contrast between target and background.
2. The invention assigns category labels to the pixels of the difference map, which effectively suppresses severe noise in the difference map while preserving edge information.
3. The invention fuses the similarity difference map and the denoised difference map, effectively combining the advantages of the two difference maps; this reduces the influence of noise on the detection result, preserves the edges of changed areas, and improves the accuracy of the change detection result.
Drawings
FIG. 1 is a block diagram of an implementation flow of the present invention;
FIG. 2 is a first set of remote sensing images and corresponding change reference images for an experiment;
FIG. 3 is a second set of remote sensing images and corresponding change reference images for an experiment;
FIG. 4 shows the change detection results obtained by the invention and the comparison methods on the first set of remote sensing images;
FIG. 5 shows the change detection results obtained by the invention and the comparison methods on the second set of remote sensing images.
Detailed Description
Referring to fig. 1, the implementation steps of the invention are as follows:
Step 1, input two registered I×J remote sensing images X1 and X2 of different time phases over the same area, shown in FIG. 2(a) and FIG. 2(b); calculate the structural similarity coefficient SIM(m,n) of the corresponding pixel points of X1 and X2 to obtain the similarity coefficient matrix SIM.
1.1) For the time-phase-1 and time-phase-2 remote sensing images X1 and X2 respectively, calculate the mean values μ1(m,n) and μ2(m,n), standard deviations σ1(m,n) and σ2(m,n), and variances σ1²(m,n) and σ2²(m,n) of the pixel values of the local region centered at pixel point (m,n) with a window size of w×w pixels, where m and n are the row and column indices of the image, m=1,2,…,I, n=1,2,…,J. To balance localization accuracy and noise immunity, the window size w ranges from 3 to 9; w=5 in this embodiment of the invention;
1.2) Calculate the structural similarity coefficient SIM(m,n) of the time-phase-1 and time-phase-2 remote sensing images X1 and X2 at the corresponding pixel point (m,n):
$$\mathrm{SIM}(m,n)=\lambda\frac{2\mu_1(m,n)\cdot\mu_2(m,n)+C}{\mu_1^2(m,n)+\mu_2^2(m,n)+C}+(1-\lambda)\frac{2\sigma_1(m,n)\cdot\sigma_2(m,n)+C}{\sigma_1^2(m,n)+\sigma_2^2(m,n)+C}$$
where C is a constant with C>0 that avoids instability when the denominator is close to zero; C=1 in this embodiment of the invention. λ is a weight coefficient; λ=0.95 in this embodiment. The computed structural similarity coefficient satisfies 0≤SIM(m,n)≤1; the closer SIM(m,n) is to 0, the more dissimilar the remote sensing images X1 and X2 are at pixel point (m,n), and the greater the likelihood that the pixel belongs to the changed class;
1.3) Calculate the structural similarity coefficients of all pixels in the image according to steps 1.1) and 1.2) to obtain the similarity coefficient matrix SIM={SIM(m,n)|1≤m≤I, 1≤n≤J}.
Step 2, linearly map the structural similarity coefficient SIM(m,n) of the similarity coefficient matrix SIM at position (m,n) to the interval [0,255], obtaining the pixel gray value XS(m,n) at position (m,n) of the similarity difference map XS:
XS(m,n)=(1-SIM(m,n))×255.
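As an illustrative sketch of Steps 1-2 (the function names and the box-filter implementation below are the editor's assumptions, not part of the patent), the local statistics, the structural similarity coefficient and the similarity difference map can be computed with NumPy:

```python
import numpy as np

def local_stats(img, w=5):
    """Mean, standard deviation and variance over a w x w window
    centered at each pixel (edge-padded; simple O(w^2)-memory approach)."""
    pad = w // 2
    p = np.pad(img.astype(np.float64), pad, mode="edge")
    wins = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                     for i in range(w) for j in range(w)])
    mu = wins.mean(axis=0)
    var = wins.var(axis=0)
    return mu, np.sqrt(var), var

def similarity_difference_map(x1, x2, w=5, lam=0.95, C=1.0):
    """Structural similarity coefficient SIM(m,n) of the two time-phase
    images and its linear mapping to the gray-level map X_S."""
    mu1, s1, _ = local_stats(x1, w)
    mu2, s2, _ = local_stats(x2, w)
    sim = (lam * (2 * mu1 * mu2 + C) / (mu1**2 + mu2**2 + C)
           + (1 - lam) * (2 * s1 * s2 + C) / (s1**2 + s2**2 + C))
    xs = (1.0 - sim) * 255.0   # Step 2: map [0,1] similarity to [0,255]
    return sim, xs
```

The defaults w=5, λ=0.95 and C=1 follow the embodiment's stated parameter choices; for identical input images SIM(m,n)=1 everywhere and XS is all zeros.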
Step 3, compute the absolute difference of the gray values X1(m,n) and X2(m,n) of the pixel points at corresponding spatial positions (m,n) of the time-phase-1 and time-phase-2 remote sensing images X1 and X2, obtaining the difference Xd(m,n)=|X1(m,n)-X2(m,n)|; computing this difference sequentially from left to right and top to bottom yields the difference map Xd:
Xd={Xd(m,n)|1≤m≤I,1≤n≤J}.
Step 4, apply median filtering with a 3×3 window to the difference map Xd to obtain a filtered difference map Xf; perform threshold segmentation on the statistical histogram of Xf to obtain an initial classification map Xm.
4.1) Apply median filtering with a 3×3 window to the difference map Xd to obtain the filtered difference map Xf;
4.2) For the statistical histogram of the filtered difference map Xf, accumulate the number of pixels per gray level and set the gray level satisfying the accumulation condition as a low threshold Tl, where L represents the maximum pixel gray level, ranging from 0 to 255, Nx represents the total number of pixels with gray level x, and λ is a proportionality constant ranging from 0.5 to 0.6; λ=0.5 in this embodiment of the invention;
4.3) For the filtered difference map Xf, set the smallest gray level that is greater than the low threshold Tl and satisfies the condition Nx<(1-λ)×I×J/(255-Tl) as the statistical histogram threshold T̄;
4.4) Segment the filtered difference map Xf with the statistical histogram threshold T̄ to obtain the initial classification map Xm:
$$X_m(m,n)=\begin{cases}1, & X_f(m,n)\ge\bar{T}\\ 0, & X_f(m,n)<\bar{T}\end{cases},$$
where Xm(m,n) is the pixel value at (m,n) in the initial classification map Xm and Xf(m,n) is the gray value at (m,n) in the filtered difference map Xf; the label 1 indicates the changed class and the label 0 the unchanged class.
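A sketch of the histogram-based segmentation of steps 4.2)-4.4). The cumulative condition defining Tl is not fully legible in the source; the code below assumes Tl is the smallest gray level whose cumulative pixel count reaches λ·I·J, which matches the structure of the condition in step 4.3):

```python
import numpy as np

def initial_classification(xf, lam=0.5):
    """Threshold the filtered difference map X_f via its statistical histogram.

    Assumption: T_l is the smallest gray level whose cumulative pixel count
    reaches lam * I * J; T_bar is then the smallest level above T_l with
    N_x < (1 - lam) * I * J / (255 - T_l), per step 4.3).
    """
    I, J = xf.shape
    hist = np.bincount(xf.astype(np.int64).ravel(), minlength=256)[:256]
    cum = np.cumsum(hist)
    Tl = min(int(np.argmax(cum >= lam * I * J)), 254)   # low threshold T_l
    limit = (1 - lam) * I * J / (255 - Tl)
    above = np.nonzero(hist[Tl + 1:] < limit)[0]
    Tbar = Tl + 1 + int(above[0]) if above.size else 255  # histogram threshold
    xm = (xf >= Tbar).astype(np.uint8)  # 1 = changed class, 0 = unchanged class
    return xm, Tbar
```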
Step 5, according to the gray value range of the filtered difference map Xf and the initial classification map Xm, assign category labels to the pixel points of the difference map Xd to obtain the category label map Xb.
5.1) Compute the area of each region formed by changed-class pixels in the initial classification map Xm; if the area of a region is less than 70, reassign the pixel labels in that region of Xm to 2; otherwise leave them unchanged;
5.2) Compute the maximum gray value Nmax of the filtered difference map Xf; if Nmax is greater than the threshold T, take the initial classification map Xm as the category label map Xb; otherwise go to step 5.3). The gray threshold T is a constant ranging from 100 to 150; T=100 in this embodiment of the invention;
5.3) Apply mathematical morphological dilation with a 3×3 square structuring element to the region formed by the pixels of the initial classification map Xm labeled 1, obtaining a small dilated image Xm1; denote the pixel value of Xm1 at point (m,n) by Xm1(m,n);
5.4) Apply mathematical morphological dilation with a 7×7 square structuring element to the region formed by the pixels of the initial classification map Xm labeled 1, obtaining a large dilated image Xm2; denote the pixel value of Xm2 at point (m,n) by Xm2(m,n);
5.5) Subtract the gray values of the pixel points at corresponding spatial positions (m,n) of the large dilated image Xm2 and the small dilated image Xm1, obtaining the difference Xm3(m,n)=Xm2(m,n)-Xm1(m,n) and thus the dilation difference image Xm3={Xm3(m,n)}, where Xm3(m,n) is the pixel value of Xm3 at point (m,n);
5.6) Create a category label map Xb of the same size as the initial classification map and assign its pixel value Xb(m,n) at point (m,n) according to the following five cases:
for pixel points (m,n) satisfying Xm(m,n)=2, the value Xb(m,n) of the category label map Xb at that pixel is marked 2;
for pixel points (m,n) satisfying Xm(m,n)≠2 and Xm3(m,n)=1, the value Xb(m,n) of the category label map Xb at that pixel is marked 3;
for pixel points (m,n) satisfying Xm(m,n)≠2 and Xm1(m,n)=1, the value Xb(m,n) of the category label map Xb at that pixel is marked 1;
for pixel points (m,n) satisfying Xm(m,n)=0, the value Xb(m,n) of the category label map Xb at that pixel is marked 0;
for pixel points (m,n) not satisfying any of the above four conditions, the value Xb(m,n) of the category label map Xb at that pixel is marked 0.
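Steps 5.3)-5.6) can be sketched as follows (illustrative names; the small-region relabelling of step 5.1) is assumed to have been applied already, so the input holds labels {0, 1, 2}):

```python
import numpy as np

def dilate(mask, k):
    """Binary dilation with a k x k square structuring element (zero padding)."""
    pad = k // 2
    p = np.pad(mask, pad, mode="constant")
    out = np.zeros_like(mask)
    for i in range(k):
        for j in range(k):
            out |= p[i:i + mask.shape[0], j:j + mask.shape[1]]
    return out

def category_label_map(xm):
    """Steps 5.3)-5.6): label each pixel of the difference map by context."""
    ones = (xm == 1)
    xm1 = dilate(ones, 3)          # small dilated image X_m1
    xm2 = dilate(ones, 7)          # large dilated image X_m2
    ring = xm2 & ~xm1              # dilation difference X_m3 = X_m2 - X_m1
    xb = np.zeros_like(xm)
    xb[(xm != 2) & ring] = 3       # edge band around changed regions
    xb[(xm != 2) & xm1] = 1        # changed-region core
    xb[xm == 2] = 2                # small/noisy regions
    return xb
```

For a single changed pixel, the 3×3 dilation marks a 9-pixel core (label 1) and the 7×7 dilation contributes the surrounding 40-pixel band (label 3).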
Step 6, according to the pixel value at position (m,n) in the category label map Xb, filter the pixel at position (m,n) of the difference map Xd to obtain the denoised difference map XN.
The filtering operation is performed according to the following rules:
for satisfaction of condition Xb(m, n) =0 pixel point, calculate difference map XdThe median value of all pixels in a window with the size of 9 multiplied by 9 of the pixel point is assigned to a de-noising difference value image XNPixel value X at midpoint (m, n)N(m,n);
For satisfaction of condition Xb(m, n) =1 pixel points, difference map XdValue X of the pixel pointd(m, n) is given to the denoised difference map XNPixel value X at midpoint (m, n)N(m,n);
For satisfaction of condition Xb(m, n) =2 pixel points, and calculating difference graph XdThe median value of all pixels in a window of 11X 11 size of the pixel point is assigned to the denoising difference value image XNPixel value X at midpoint (m, n)N(m,n);
If there is a condition X satisfiedb(m, n) =3 pixel points, then a difference graph X is calculateddThe Gaussian kernel function value with the Gaussian scale of 3 multiplied by 3 of the pixel point is assigned to the de-noising difference image XNPixel value X at midpoint (m, n)N(m,n);
The values XN(m,n) of all pixel points (m,n) form the denoised difference map XN={XN(m,n)}.
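A sketch of the label-driven filtering of Step 6. The patent's "Gaussian kernel function value with a Gaussian scale of 3×3" is read here as a 3×3 Gaussian-weighted average, which is an assumption; windows are clipped at the image border in this sketch:

```python
import numpy as np

def denoise_difference_map(xd, xb):
    """Filter the difference map X_d according to the label map X_b.

    Label 0: 9x9 median; label 1: value kept; label 2: 11x11 median;
    label 3: 3x3 Gaussian-weighted average (assumed reading of the patent).
    """
    I, J = xd.shape
    xn = xd.astype(np.float64).copy()
    g = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=np.float64)
    g /= g.sum()

    def window(img, m, n, k):
        h = k // 2
        return img[max(0, m - h):m + h + 1, max(0, n - h):n + h + 1]

    for m in range(I):
        for n in range(J):
            if xb[m, n] == 0:
                xn[m, n] = np.median(window(xd, m, n, 9))
            elif xb[m, n] == 2:
                xn[m, n] = np.median(window(xd, m, n, 11))
            elif xb[m, n] == 3 and 1 <= m < I - 1 and 1 <= n < J - 1:
                xn[m, n] = np.sum(g * xd[m - 1:m + 2, n - 1:n + 2])
    return xn
```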
Step 7, calculate the changed-class membership Hc,XS(m,n) and unchanged-class membership Hu,XS(m,n) of the pixel point at position (m,n) of the similarity difference map XS, and the changed-class membership Hc,XN(m,n) and unchanged-class membership Hu,XN(m,n) of the pixel point at position (m,n) of the denoised difference map XN.
7.1) Calculate the changed-class membership Hc,XS(m,n) and unchanged-class membership Hu,XS(m,n) of the pixel point at position (m,n) of the similarity difference map XS:
$$H_{c,X_S}(m,n)=\begin{cases}0, & X_S(m,n)<T_1\\[2pt] \dfrac{1}{2}\left(\dfrac{X_S(m,n)-T_1}{T_2-T_1}\right)^2, & T_1\le X_S(m,n)<T_2\\[2pt] 1-\dfrac{1}{2}\left(\dfrac{X_S(m,n)-T_3}{T_2-T_3}\right)^2, & T_2\le X_S(m,n)<T_3\\[2pt] 1, & X_S(m,n)\ge T_3\end{cases},$$

$$H_{u,X_S}(m,n)=1-H_{c,X_S}(m,n),$$
where T1, T2 and T3 are three different thresholds with T1<T2<T3 and T2=(T1+T3)/2, derived from the optimal threshold obtained by segmenting the similarity difference map XS with the maximum entropy threshold segmentation method;
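The "maximum entropy threshold segmentation method" referenced above can be sketched with a Kapur-style entropy criterion (a standard reading; the patent does not spell out the variant):

```python
import numpy as np

def max_entropy_threshold(img):
    """Kapur-style maximum-entropy threshold over a 256-level histogram:
    pick the split that maximizes the sum of the two class entropies."""
    hist = np.bincount(img.astype(np.int64).ravel(), minlength=256)[:256]
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue  # skip splits that leave one class empty
        q0, q1 = p[:t] / p0, p[t:] / p1
        h = (-np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
             - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0])))
        if h > best_h:
            best_t, best_h = t, h
    return best_t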
7.2) Calculate the changed-class membership Hc,XN(m,n) and unchanged-class membership Hu,XN(m,n) of the pixel point at position (m,n) of the denoised difference map XN:
$$H_{c,X_N}(m,n)=\begin{cases}0, & X_N(m,n)<T_1\\[2pt] \dfrac{1}{2}\left(\dfrac{X_N(m,n)-T_1}{T_2-T_1}\right)^2, & T_1\le X_N(m,n)<T_2\\[2pt] 1-\dfrac{1}{2}\left(\dfrac{X_N(m,n)-T_3}{T_2-T_3}\right)^2, & T_2\le X_N(m,n)<T_3\\[2pt] 1, & X_N(m,n)\ge T_3\end{cases},$$

$$H_{u,X_N}(m,n)=1-H_{c,X_N}(m,n),$$
where XN(m,n) is the pixel value of the denoised difference map XN at (m,n).
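The piecewise-quadratic membership function of steps 7.1)-7.2) can be sketched as follows (the function name is illustrative; it applies equally to XS and XN):

```python
import numpy as np

def change_membership(x, T1, T2, T3):
    """Piecewise-quadratic changed-class membership H_c of Step 7;
    the unchanged-class membership is H_u = 1 - H_c. Requires T1 < T2 < T3."""
    x = np.asarray(x, dtype=np.float64)
    h = np.empty_like(x)
    h[x < T1] = 0.0
    m1 = (x >= T1) & (x < T2)
    h[m1] = 0.5 * ((x[m1] - T1) / (T2 - T1)) ** 2
    m2 = (x >= T2) & (x < T3)
    h[m2] = 1.0 - 0.5 * ((x[m2] - T3) / (T2 - T3)) ** 2
    h[x >= T3] = 1.0
    return h
```

With T2=(T1+T3)/2 the two quadratic pieces meet at H=0.5 at x=T2, giving a continuous S-shaped curve from 0 at T1 to 1 at T3.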
Step 8, calculate the fused changed-class membership Hc(m,n) of the memberships Hc,XS(m,n) and Hc,XN(m,n) at corresponding pixel points (m,n) of the similarity difference map XS and the denoised difference map XN, and then the fused unchanged-class membership Hu(m,n) of Hu,XS(m,n) and Hu,XN(m,n).
8.1) Calculate the fused changed-class membership Hc(m,n) of the changed-class memberships Hc,XS(m,n) and Hc,XN(m,n) at corresponding pixel points (m,n) of the similarity difference map XS and the denoised difference map XN:
$$H_c(m,n)=\beta\times H_{c,X_S}(m,n)+(1-\beta)\times H_{c,X_N}(m,n),$$
where β is the fusion weight assigned to the memberships H_{c,X_S}(m, n) and H_{u,X_S}(m, n) of the similarity difference map; β = 0.4 in this example.
8.2) Compute the fused unchanged-class membership H_u(m, n) from the unchanged-class membership H_{u,X_S}(m, n) of the similarity difference map X_S and the unchanged-class membership H_{u,X_N}(m, n) of the denoised difference map X_N at pixel (m, n):
$$H_u(m,n)=\beta\times H_{u,X_S}(m,n)+(1-\beta)\times H_{u,X_N}(m,n).$$
Step 9: create a fused image X_I of the same size as the similarity difference map X_S. If H_c(m, n) > H_u(m, n), assign the value X_I(m, n) of pixel (m, n) to 1; otherwise assign it to 0, thereby obtaining the change detection result image X_I = {X_I(m, n)}.
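The fusion of steps 8.1) and 8.2) together with the decision rule of step 9 reduce to a weighted average of memberships followed by a per-pixel comparison. A minimal sketch, with β = 0.4 as in the example above:

```python
import numpy as np

def fuse_and_decide(hc_s, hc_n, hu_s, hu_n, beta=0.4):
    """Weighted fuzzy-membership fusion of the two difference maps,
    followed by the per-pixel max-membership decision of Step 9."""
    hc = beta * hc_s + (1.0 - beta) * hc_n   # fused changed-class membership
    hu = beta * hu_s + (1.0 - beta) * hu_n   # fused unchanged-class membership
    return (hc > hu).astype(np.uint8)        # 1 = changed, 0 = unchanged
```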
The effects of the present invention can be further illustrated by the following experimental results and analyses:
1. experimental data
Figs. 2(a) and 2(b) show the first set of remote sensing data: two real Landsat-7 ETM+ (Enhanced Thematic Mapper Plus) band-4 images of a suburban area of Mexico acquired in April and May 2002. Both images are 512 × 512 pixels with 256 gray levels; the changed area was mainly caused by a fire that destroyed large tracts of local vegetation. Fig. 2(c) is the corresponding change reference map, in which the white region marks the changed area; it contains 25599 changed pixels and 236545 unchanged pixels.
Figs. 3(a) and 3(b) are the second set of remote sensing data: two-phase Landsat-5 TM band-4 spectral images of the western part of Elba Island, Italy, acquired in August and September 1994. Both images are 326 × 414 pixels with 256 gray levels; the changed region was caused by a forest fire. Fig. 3(c) is the corresponding change reference map, in which the white region marks the changed area; it contains 2415 changed pixels and 99985 unchanged pixels.
2. Comparative test
To illustrate the effectiveness of the present invention, the present invention was compared with the following three comparative methods.
Comparison method 1 fuses the fuzzy memberships of the difference map and the similarity difference map directly. Comparing the present invention with comparison method 1 verifies the effectiveness of processing the difference map on the basis of the category label map.
Comparison method 2 constructs the difference map by adapting the approach of He (2010) in "Application of Euclidean Norm in Multi-temporal Remote Sensing Image Change Detection": the Euclidean distance originally computed over corresponding pixels of multiple bands of the two-phase images is instead computed over the neighborhoods of corresponding pixels of a single band. Comparing the present invention with comparison method 2 verifies the validity of the similarity difference map constructed here.
Comparison method 3 is the change detection method of Du et al. (2012) in "Fusion of Difference Images for Change Detection over Urban Areas", which performs feature-level fusion of multiple difference maps, likewise by fuzzy membership fusion. Comparing the present invention with comparison method 3 verifies the effectiveness of fusing the similarity difference map with the denoised difference map for change detection.
3. Content and analysis of experiments
Experiment 1: change detection was performed on the first set of remote sensing images using the present invention and comparison methods 1, 2 and 3; the results are shown in Fig. 4, where:
FIG. 4(a) is a graph showing the results of change detection in comparative method 1;
FIG. 4(b) is a graph showing the results of change detection in comparative method 2;
FIG. 4(c) is a graph showing the results of change detection by comparative method 3;
FIG. 4(d) is a graph showing the results of change detection according to the present invention.
As can be seen from Fig. 4, the result maps of the three comparison methods are scattered with small spurious spots throughout the image, beyond the true change region. Compared with these methods, the present invention preserves the shape of the change region well, retains more complete edge information, and produces less pseudo-change information such as isolated spots.
Experiment 2: change detection was performed on the second set of remote sensing images using the present invention and comparison methods 1, 2 and 3; the results are shown in Fig. 5, where:
FIG. 5(a) is a graph showing the results of change detection in comparative method 1;
FIG. 5(b) is a graph showing the results of change detection in comparative method 2;
FIG. 5(c) is a graph showing the results of change detection by comparative method 3;
FIG. 5(d) is a graph showing the results of change detection according to the present invention.
As can be seen from Fig. 5, comparison method 1 detects only part of the change region and misses many changed pixels; comparison method 2 preserves the overall shape of the change region better, but its result contains some small-area pseudo-change information; the change region detected by comparison method 3 (the white area) covers almost the entire image, so its result cannot preserve even the basic shape of the change region and is very poor. Compared with the three methods, the present invention keeps the shape of the change region well and avoids false change information such as isolated spots.
Table 1 lists the quantitative results of the present invention and the three comparison methods on the first and second sets of remote sensing images.
TABLE 1
As can be seen from Table 1, on the first set of remote sensing images the present invention has fewer missed detections than comparison method 1 and slightly more false detections, with a lower total error count; it has more missed detections than comparison method 2 but fewer false detections, again with a lower total error count; and it has slightly more missed detections than comparison method 3 but far fewer false detections, with a lower total error count. On the second set of remote sensing images the present invention has the fewest total errors and the highest accuracy.
Overall, therefore, the method obtains more accurate change information, has stronger noise immunity, effectively removes spurious spots, and preserves the edge information of the change region well; it performs excellently in both visual effect and quantitative indices.

Claims (4)

1. A remote sensing image change detection method based on difference map fuzzy membership fusion, characterized by comprising the following steps:
(1) Input two registered remote sensing images X_1 and X_2 of size I × J, acquired over the same region at different times; compute the structural similarity coefficient SIM(m, n) of each pair of corresponding pixels of X_1 and X_2 to obtain a similarity coefficient matrix SIM:
SIM={SIM(m,n)|1≤m≤I,1≤n≤J}
wherein $$SIM(m,n)=\lambda\,\frac{2\mu_1(m,n)\cdot\mu_2(m,n)+C}{\mu_1^2(m,n)+\mu_2^2(m,n)+C}+(1-\lambda)\,\frac{2\sigma_1(m,n)\cdot\sigma_2(m,n)+C}{\sigma_1^2(m,n)+\sigma_2^2(m,n)+C},$$
where m and n are the row and column indices of the image, m = 1, 2, ..., I and n = 1, 2, ..., J; μ_1(m, n), σ_1(m, n) and σ_1²(m, n) are respectively the mean, standard deviation and variance of the pixel values of the local region of image X_1 centered at pixel (m, n) with window size w × w, and μ_2(m, n), σ_2(m, n) and σ_2²(m, n) are the corresponding quantities for image X_2; the window size w ranges from 3 to 9; C is a constant that avoids instability when the denominator approaches zero, with C > 0; λ is a weight coefficient; and the structural similarity coefficient satisfies 0 ≤ SIM(m, n) ≤ 1;
(2) Linearly map the structural similarity coefficient SIM(m, n) at position (m, n) of the similarity coefficient matrix SIM to the interval [0, 255], obtaining the pixel gray value X_S(m, n) of the similarity difference map X_S at position (m, n):
XS(m,n)=(1-SIM(m,n))×255
(3) Compute the absolute difference X_d(m, n) = |X_1(m, n) − X_2(m, n)| of the gray values X_1(m, n) and X_2(m, n) of the pixels at each spatially corresponding position (m, n) of images X_1 and X_2, scanning from left to right and top to bottom, to obtain the difference map X_d:
Xd={Xd(m,n)|1≤m≤I,1≤n≤J}
(4) Apply median filtering with a 3 × 3 window to the pixels of the difference map X_d to obtain a filtered difference map X_f; threshold the statistical histogram of X_f to obtain an initial classification map X_m;
(5) According to the gray value range of the filtered difference map X_f and the initial classification map X_m, assign category labels to the pixels of the difference map X_d to obtain a category label map X_b;
(6) According to the label at position (m, n) of the category label map X_b, filter the pixel at position (m, n) of the difference map X_d to obtain a denoised difference map X_N;
(7) Compute the changed-class membership H_{c,X_S}(m, n) and unchanged-class membership H_{u,X_S}(m, n) of each pixel (m, n) of the similarity difference map X_S, and the changed-class membership H_{c,X_N}(m, n) and unchanged-class membership H_{u,X_N}(m, n) of each pixel (m, n) of the denoised difference map X_N;
(8) For each pair of corresponding pixels (m, n) of the similarity difference map X_S and the denoised difference map X_N, compute the fused changed-class membership H_c(m, n) from H_{c,X_S}(m, n) and H_{c,X_N}(m, n), and then the fused unchanged-class membership H_u(m, n) from H_{u,X_S}(m, n) and H_{u,X_N}(m, n);
(9) Create a fused image X_I of the same size as the similarity difference map X_S; if H_c(m, n) > H_u(m, n), mark the value of X_I at that pixel as 1, otherwise mark it as 0, thereby obtaining the change detection result image.
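Steps (1)-(3) of the claim can be sketched as follows. This is a hedged illustration rather than the patented implementation: the local moments are approximated with a uniform box filter, and the window size w, weight λ and stabilizing constant C are assumed example values.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def similarity_and_difference(x1, x2, w=3, lam=0.5, c=1e-4):
    """Steps (1)-(3): similarity difference map X_S and absolute
    difference map X_d of two co-registered images."""
    x1 = x1.astype(float)
    x2 = x2.astype(float)
    # local mean, variance and standard deviation over a w-by-w window
    mu1, mu2 = uniform_filter(x1, w), uniform_filter(x2, w)
    v1 = np.maximum(uniform_filter(x1 ** 2, w) - mu1 ** 2, 0.0)
    v2 = np.maximum(uniform_filter(x2 ** 2, w) - mu2 ** 2, 0.0)
    s1, s2 = np.sqrt(v1), np.sqrt(v2)
    # step (1): structural similarity coefficient SIM(m, n)
    sim = (lam * (2 * mu1 * mu2 + c) / (mu1 ** 2 + mu2 ** 2 + c)
           + (1 - lam) * (2 * s1 * s2 + c) / (v1 + v2 + c))
    x_s = (1.0 - sim) * 255.0          # step (2): similarity difference map
    x_d = np.abs(x1 - x2)              # step (3): absolute difference map
    return x_s, x_d
```

For identical inputs both terms of SIM equal 1, so X_S is (numerically) zero everywhere.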
2. The remote sensing image change detection method according to claim 1, characterized in that thresholding the statistical histogram of the filtered difference map X_f in step (4) to obtain the initial classification map X_m comprises the following steps:
(4a) Apply median filtering with a window size of 3 × 3 to the difference map X_d to obtain the filtered difference map X_f;
(4b) For the filtered difference map X_f, set the gray level L satisfying the cumulative pixel-count condition as the low threshold T_l, where L denotes the maximum gray level up to which the per-level pixel counts are accumulated, ranging from 0 to 255; N_x denotes the total number of pixels with gray level x; λ is a proportionality constant ranging from 0.5 to 0.6; and I and J denote the numbers of rows and columns of the image, respectively;
(4c) For the filtered difference map X_f, set as the statistical histogram threshold T̄ the gray level that is greater than the low threshold T_l and satisfies the condition N_x < (1 − λ) × I × J / (255 − T_l);
(4d) Segment the filtered difference map X_f with the statistical histogram threshold T̄ to obtain the initial classification map X_m:
$$X_m(m,n)=\begin{cases}1, & X_f(m,n)\ge\bar{T}\\ 0, & X_f(m,n)<\bar{T}\end{cases}$$
where X_m(m, n) is the pixel value of the initial classification map X_m at (m, n) and X_f(m, n) is the gray value of the filtered difference map X_f at (m, n); the label 1 denotes the changed class and the label 0 the unchanged class.
3. The remote sensing image change detection method according to claim 1, characterized in that in step (5), assigning category labels to the pixels of the difference map X_d according to the gray value range of the filtered difference map X_f and the initial classification map X_m to obtain the category label map X_b comprises the following steps:
(5a) Compute the area of each connected region formed by changed-class pixels of the initial classification map X_m; if the area of a region is less than 70, reassign the labels of the pixels in that region of X_m to 2, otherwise leave them unchanged;
(5b) Compute the maximum gray value N_max of the filtered difference map X_f; if N_max is greater than the gray threshold T, take the initial classification map X_m as the category label map X_b and stop; otherwise go to (5c). The gray threshold T is a constant ranging from 100 to 150;
(5c) Apply mathematical morphological dilation with a 3 × 3 square structuring element to the region formed by the pixels of the initial classification map X_m labeled 1, obtaining a small dilated image X_m1, whose pixel value at point (m, n) is denoted X_m1(m, n);
(5d) Apply mathematical morphological dilation with a 7 × 7 square structuring element to the region formed by the pixels of X_m labeled 1, obtaining a large dilated image X_m2, whose pixel value at point (m, n) is denoted X_m2(m, n);
(5e) Subtract the gray value of the small dilated image X_m1 from that of the large dilated image X_m2 at each spatially corresponding position (m, n) to obtain the difference X_m3(m, n) = X_m2(m, n) − X_m1(m, n), yielding the dilation difference image X_m3 = {X_m3(m, n)}, where X_m3(m, n) is the pixel value of X_m3 at point (m, n);
(5f) Create a category label map X_b of the same size as the initial classification map and assign its pixel value X_b(m, n) at each point (m, n) according to the following five cases:
For pixels (m, n) satisfying X_m(m, n) = 2, mark the value X_b(m, n) of the category label map as 2;
For pixels (m, n) satisfying X_m(m, n) ≠ 2 and X_m3(m, n) = 1, mark X_b(m, n) as 3;
For pixels (m, n) satisfying X_m(m, n) ≠ 2 and X_m1(m, n) = 1, mark X_b(m, n) as 1;
For pixels (m, n) satisfying X_m(m, n) = 0, mark X_b(m, n) as 0;
For pixels (m, n) satisfying none of the above four conditions, mark X_b(m, n) as 0.
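The labeling of claim 3 can be sketched with connected-component analysis and binary dilation. The area threshold 70 and the 3 × 3 / 7 × 7 structuring elements come from the claim; the gray threshold T = 120 is an assumed value inside the claimed 100-150 range:

```python
import numpy as np
from scipy import ndimage

def category_label_map(x_m, x_f, area_min=70, gray_t=120):
    """Build the category label map X_b of claim 3 (a hedged sketch)."""
    x_m = x_m.copy()
    # (5a) reassign small changed regions (area < 70) to label 2
    lbl, n = ndimage.label(x_m == 1)
    for r in range(1, n + 1):
        if (lbl == r).sum() < area_min:
            x_m[lbl == r] = 2
    # (5b) if the filtered map is bright enough, X_m itself serves as X_b
    if x_f.max() > gray_t:
        return x_m
    # (5c)-(5e) small/large dilations of the remaining changed pixels
    core = x_m == 1
    small = ndimage.binary_dilation(core, np.ones((3, 3)))   # X_m1
    large = ndimage.binary_dilation(core, np.ones((7, 7)))   # X_m2
    ring = large & ~small                                    # X_m3 = X_m2 - X_m1
    # (5f) assemble X_b: 3 = dilation ring, 1 = dilated core, 2 = small region
    x_b = np.zeros_like(x_m)
    x_b[ring] = 3
    x_b[small] = 1
    x_b[x_m == 2] = 2
    return x_b
```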
4. The remote sensing image change detection method according to claim 1, characterized in that in step (6), filtering the pixel at position (m, n) of the difference map X_d according to the pixel value at position (m, n) of the category label map X_b to obtain the denoised difference map X_N comprises the following steps:
For pixels satisfying X_b(m, n) = 0, compute the median of all pixels of the difference map X_d in a 9 × 9 window centered at the pixel and assign it to the pixel value X_N(m, n) of the denoised difference map X_N at point (m, n);
For pixels satisfying X_b(m, n) = 1, assign the value X_d(m, n) of the difference map X_d at the pixel to the pixel value X_N(m, n);
For pixels satisfying X_b(m, n) = 2, compute the median of all pixels of X_d in an 11 × 11 window centered at the pixel and assign it to X_N(m, n);
For pixels satisfying X_b(m, n) = 3, compute the Gaussian-kernel-weighted value of X_d with a 3 × 3 Gaussian scale at the pixel and assign it to X_N(m, n);
The values X_N(m, n) of all pixels (m, n) form the denoised difference map X_N = {X_N(m, n)}.
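The label-guided filtering of claim 4 can be sketched as below. The 9 × 9 and 11 × 11 medians and the pass-through for label 1 follow the claim; the Gaussian used for label-3 (edge) pixels is an assumption, implemented here as Gaussian smoothing with a 3 × 3 support since the claim's exact kernel parameters are ambiguous:

```python
import numpy as np
from scipy import ndimage

def denoise_by_label(x_d, x_b):
    """Label-guided filtering of the difference map X_d (claim 4 sketch)."""
    med9 = ndimage.median_filter(x_d, size=9)     # label 0: background
    med11 = ndimage.median_filter(x_d, size=11)   # label 2: small regions
    gauss = ndimage.gaussian_filter(x_d.astype(float), sigma=1,
                                    truncate=1.0)  # label 3: 3x3 support (assumed)
    x_n = np.where(x_b == 0, med9, x_d).astype(float)
    x_n = np.where(x_b == 2, med11, x_n)
    x_n = np.where(x_b == 3, gauss, x_n)
    return x_n   # label-1 pixels keep their original X_d value
```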
CN201310117627.0A 2013-04-07 2013-04-07 Remote sensing image change detection method based on difference map fuzzy membership fusion Expired - Fee Related CN103198482B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310117627.0A CN103198482B (en) Remote sensing image change detection method based on difference map fuzzy membership fusion

Publications (2)

Publication Number Publication Date
CN103198482A CN103198482A (en) 2013-07-10
CN103198482B true CN103198482B (en) 2015-10-28

Family

ID=48720988

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310117627.0A Expired - Fee Related CN103198482B (en) Remote sensing image change detection method based on difference map fuzzy membership fusion

Country Status (1)

Country Link
CN (1) CN103198482B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103632155B (en) * 2013-12-16 2016-08-17 武汉大学 Remote sensing image variation detection method based on slow feature analysis
CN103871039B (en) * 2014-03-07 2017-02-22 西安电子科技大学 Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection
CN104700374A (en) * 2015-03-26 2015-06-10 东莞职业技术学院 Scene image de-noising method based on Type-2 fuzzy logic system
CN109003230A (en) * 2018-06-07 2018-12-14 西安电子科技大学 A kind of Cherenkov's fluorescent image impact noise minimizing technology and system
CN112104878B (en) * 2020-08-21 2024-09-13 西安万像电子科技有限公司 Image coding method, device, coding end equipment and storage medium
CN112995518A (en) * 2021-03-12 2021-06-18 北京奇艺世纪科技有限公司 Image generation method and device
CN113609990A (en) * 2021-08-06 2021-11-05 中国工商银行股份有限公司 Method and device for determining construction progress of target building and server
CN115647696B (en) * 2022-12-14 2023-03-21 中国华西企业股份有限公司 Automatic machining device, machining method and machining terminal for large steel structure

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101950364A (en) * 2010-08-30 2011-01-19 西安电子科技大学 Remote sensing image change detection method based on neighbourhood similarity and threshold segmentation
CN102169584A (en) * 2011-05-28 2011-08-31 西安电子科技大学 Remote sensing image change detection method based on watershed and treelet algorithms
US8265356B2 (en) * 2008-01-30 2012-09-11 Computerized Medical Systems, Inc. Method and apparatus for efficient automated re-contouring of four-dimensional medical imagery using surface displacement fields
CN102968790A (en) * 2012-10-25 2013-03-13 西安电子科技大学 Remote sensing image change detection method based on image fusion

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2780595A1 (en) * 2011-06-22 2012-12-22 Roman Palenychka Method and multi-scale attention system for spatiotemporal change determination and object detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SAR image change detection based on Memetic algorithm; Xin Fangfang; Journal of Infrared and Millimeter Waves; Feb. 2012; Vol. 31, No. 1; pp. 67-72 *

Also Published As

Publication number Publication date
CN103198482A (en) 2013-07-10

Similar Documents

Publication Publication Date Title
CN103198482B (en) Remote sensing image change detection method based on difference map fuzzy membership fusion
Sahebjalal et al. Analysis of land use-land covers changes using normalized difference vegetation index (NDVI) differencing and classification methods
Xie et al. Multiscale densely-connected fusion networks for hyperspectral images classification
CN106529508B (en) Based on local and non local multiple features semanteme hyperspectral image classification method
CN108053419A (en) Inhibited and the jamproof multiscale target tracking of prospect based on background
Turker et al. Field-based sub-boundary extraction from remote sensing imagery using perceptual grouping
Watkins et al. Automating field boundary delineation with multi-temporal Sentinel-2 imagery
CN109816012A (en) A kind of multiscale target detection method of integrating context information
CN105389817B (en) A kind of two phase remote sensing image variation detection methods
CN112001928B (en) Retina blood vessel segmentation method and system
CN103456020B (en) Based on the method for detecting change of remote sensing image of treelet Fusion Features
CN104732546B (en) The non-rigid SAR image registration method of region similitude and local space constraint
CN101853509A (en) SAR (Synthetic Aperture Radar) image segmentation method based on Treelets and fuzzy C-means clustering
CN105160355A (en) Remote sensing image change detection method based on region correlation and visual words
CN104408463A (en) High-resolution construction land graph spot identification method based on PanTex and linear characteristic
CN104794729A (en) SAR image change detection method based on significance guidance
CN113963261A (en) Method and system for extracting full convolution neural network cultivated land based on multi-scale fusion
CN104217436A (en) SAR image segmentation method based on multiple feature united sparse graph
Helmholz et al. Semi-automatic verification of cropland and grassland using very high resolution mono-temporal satellite images
CN105512622A (en) Visible remote-sensing image sea-land segmentation method based on image segmentation and supervised learning
CN114202539A (en) Hyperspectral image anomaly detection method based on end-to-end RX
CN103218823B (en) Based on the method for detecting change of remote sensing image that core is propagated
CN105354845B (en) A kind of semi-supervised change detecting method of remote sensing image
Gaucherel et al. The comparison map profile method: A strategy for multiscale comparison of quantitative and qualitative images
Abujayyab et al. Integrating object-based and pixel-based segmentation for building footprint extraction from satellite images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151028

Termination date: 20200407