CN103198482B - Remote sensing image change detection method based on fuzzy membership fusion of difference maps - Google Patents
Remote sensing image change detection method based on fuzzy membership fusion of difference maps
- Publication number
- CN103198482B CN103198482B CN201310117627.0A CN201310117627A CN103198482B CN 103198482 B CN103198482 B CN 103198482B CN 201310117627 A CN201310117627 A CN 201310117627A CN 103198482 B CN103198482 B CN 103198482B
- Authority
- CN
- China
- Prior art keywords
- difference
- map
- pixel
- value
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Processing (AREA)
Abstract
The invention discloses a remote sensing image change detection method based on fuzzy membership fusion of difference maps, which mainly solves the problem that existing change detection methods cannot both effectively remove false change information and preserve edge information. The implementation process is: input two remote sensing images of the same area acquired at different times and compute the structural similarity coefficient of each pair of corresponding pixels to obtain a similarity difference map; subtract the two remote sensing images to obtain a subtraction difference map; assign class labels to the pixels of the subtraction difference map to obtain a class label map; filter the subtraction difference map according to the class label map to obtain a denoised difference map; perform fuzzy membership fusion of the similarity difference map and the denoised difference map and classify the result to obtain the change detection result. The invention has strong noise resistance, can effectively remove false change information while retaining good edge information, achieves high detection accuracy, and can be used for monitoring urban expansion and changes in forests and vegetation.
Description
Technical Field
The invention belongs to the field of digital image processing and mainly relates to remote sensing image change detection, in particular to a remote sensing image change detection method based on fuzzy membership fusion of difference maps, which can be used for remote sensing image analysis and processing.
Background Art
Change detection in remote sensing images is the process of identifying changes in the state of objects or in phenomena by analyzing and extracting differences in electromagnetic spectral characteristics or spatial structural characteristics between remote sensing images of the same area acquired at different times. It has been widely applied in many fields of the national economy and national defense, such as agricultural surveys, forest and vegetation change monitoring, urban expansion monitoring, and military target monitoring.
Common remote sensing image change detection methods generally first construct a difference map and then select an appropriate threshold to divide the difference map into a changed class and an unchanged class. Constructing the difference map and processing it are important steps in image change detection. A relatively simple way to construct a difference image is the subtraction method; it is easy to implement, but the resulting difference image contains considerable noise, so an effective method is needed to suppress that noise. Removing the more severe noise from the difference image and enhancing changed pixels whose gray-value changes are small can effectively improve the quality of the difference map and make the detection results more accurate.
To construct better difference maps, some researchers measure the similarity of the gray values of corresponding pixels in the two temporal images. Inglada and Mercier (2007), in "A New Statistical Similarity Measure for Change Detection in Multitemporal SAR Images and Its Extension to Multiscale Change Analysis," IEEE Transactions on Geoscience and Remote Sensing, 2007, 45(5): 1432-1445, proposed a SAR image change detection method based on statistical similarity. The method uses the KL divergence to measure the statistical similarity between the neighborhoods of corresponding pixels in the two temporal images to construct a difference map, which is then thresholded to obtain the change result. Because the method models local histograms and a local region contains very few pixels, the histograms are difficult to model effectively, so its detection results are poor. He (2010), in "Application of Euclidean Norm in Multi-temporal Remote Sensing Image Change Detection," International Congress on Image and Signal Processing (CISP'2010), 2010, 5: 2111-2115, proposed a change detection method based on the Euclidean distance. The method constructs a difference map by computing the Euclidean distance between multi-band two-temporal remote sensing images and then thresholds it to obtain the change result; it can effectively reduce the influence of part of the noise, but the detection result still contains considerable false change information.
To combine the advantages of different difference maps, some researchers have fused them. Ma Guorui et al. (2006), in "Remote Sensing Image Change Detection Based on Fusion and the Generalized Gaussian Model," Journal of Remote Sensing, 2006, 10(6): 847-853, proposed fusing the subtraction difference map and the ratio difference map with a product fusion strategy. The method can suppress the background and enhance the changed regions to a certain extent, but it is unstable and sometimes also suppresses the changed regions. Wang Min and Zhang Xingyue (2010), in "Remote Sensing Image Change Detection Based on Multi-Feature Evidence Fusion," Journal of Remote Sensing, 2010, 14(2): 1-7, proposed fusing multiple feature difference maps with an evidence-theory fusion method. The method improves the detection accuracy of single-feature detection methods, but the structural similarity measure it adopts is unstable, which lowers the accuracy of the detection results. Du et al. (2012), in "Fusion of Difference Images for Change Detection over Urban Areas," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2012, 5(4): 1076-1086, proposed a change detection method that fuses multiple difference maps at the feature level and the decision level. The method combines the advantages of several difference maps and improves the accuracy of change detection, but it does not fuse the difference maps in a targeted way according to their respective strengths and weaknesses, so the accuracy improves only slightly and may sometimes even decrease.
Summary of the Invention
The purpose of the present invention is to overcome the deficiencies of the above change detection techniques and to propose a remote sensing image change detection method based on fuzzy membership fusion of difference maps, so as to reduce the influence of noise on the detection results, reduce the false change information in the detection results, and improve the accuracy of the detection results.
To achieve the above purpose, the detection method of the present invention comprises the following steps:
(1) Input two registered remote sensing images X1 and X2 of the same area acquired at different times, both of size I×J, and compute the structural similarity coefficient SIM(m,n) of each pair of corresponding pixels of X1 and X2 to obtain a similarity coefficient matrix SIM:
SIM = {SIM(m,n) | 1 ≤ m ≤ I, 1 ≤ n ≤ J}
In the formula, m and n are the row and column indices of the image, m = 1, 2, ..., I, n = 1, 2, ..., J; μ1(m,n), σ1(m,n) and σ1²(m,n) are the mean, standard deviation and variance of the pixel values in the local region of remote sensing image X1 centered at pixel (m,n) with window size w×w, and μ2(m,n), σ2(m,n) and σ2²(m,n) are the mean, standard deviation and variance of the pixel values in the local region of remote sensing image X2 centered at pixel (m,n) with window size w×w; the window size w ranges from 3 to 9; C is a constant used to avoid instability when the denominator approaches zero, with C > 0; λ is a weight coefficient; and the structural similarity coefficient satisfies 0 ≤ SIM(m,n) ≤ 1;
(2) Linearly map the structural similarity coefficient SIM(m,n) of the similarity coefficient matrix SIM at position (m,n) to the interval [0,255] to obtain the pixel gray value XS(m,n) of the similarity difference map XS at position (m,n):
XS(m,n) = (1 - SIM(m,n)) × 255,
(3) Compute the absolute difference between the gray values X1(m,n) and X2(m,n) of the pixels at corresponding position (m,n) of remote sensing images X1 and X2, obtaining Xd(m,n) = |X1(m,n) - X2(m,n)|; computing this difference for all corresponding pixel positions of X1 and X2 in order from left to right and top to bottom yields a subtraction difference map Xd:
Xd = {Xd(m,n) | 1 ≤ m ≤ I, 1 ≤ n ≤ J},
(4) Apply a median filter with a 3×3 window to the pixels of the subtraction difference map Xd to obtain the filtered difference map Xf, and perform statistical histogram threshold segmentation on Xf to obtain the initial classification map Xm;
(5) According to the gray-value range of the filtered difference map Xf and the initial classification map Xm, assign class labels to the pixels of the subtraction difference map Xd to obtain a class label map Xb;
(6) According to the label at position (m,n) in the class label map Xb, filter the pixel at position (m,n) in the subtraction difference map Xd to obtain the denoised difference map XN;
(7) Compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the similarity difference map XS, and compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the denoised difference map XN;
(8) Compute the fused membership value Hc(m,n) of the changed-class membership degrees of the corresponding pixel (m,n) in the similarity difference map XS and the denoised difference map XN, and then compute the fused membership value Hu(m,n) of the unchanged-class membership degrees of that pixel;
(9) Create a fused image XI of the same size as the similarity difference map XS; if Hc(m,n) > Hu(m,n), set the value of the fused image XI at that pixel to 1, otherwise set it to 0, thereby obtaining a change detection result image.
Compared with the prior art, the present invention has the following advantages:
1. The structural similarity measure proposed by the present invention can stably and effectively measure the similarity between pixels of the two temporal images, and the constructed similarity difference map can suppress background noise and enhance the contrast between targets and background.
2. The present invention assigns class labels to the pixels of the subtraction difference map, which can effectively suppress the more severe noise in the difference map while preserving edge information.
3. The present invention fuses the similarity difference map and the denoised difference map, effectively combining the advantages of the two difference maps; this not only reduces the influence of noise on the detection results but also preserves the edges of the changed regions, improving the accuracy of the change detection results.
Brief Description of the Drawings
Fig. 1 is a flow chart of the implementation of the present invention;
Fig. 2 shows the first group of remote sensing images used in the experiments and the corresponding change reference image;
Fig. 3 shows the second group of remote sensing images used in the experiments and the corresponding change reference image;
Fig. 4 shows the change detection results obtained by the present invention and the comparison methods on the first group of remote sensing images;
Fig. 5 shows the change detection results obtained by the present invention and the comparison methods on the second group of remote sensing images.
Detailed Description
Referring to Fig. 1, the implementation steps of the present invention are as follows:
Step 1. Input two registered remote sensing images X1 and X2 of the same area acquired at different times, both of size I×J, as shown in Fig. 2(a) and Fig. 2(b), and compute the structural similarity coefficient SIM(m,n) of each pair of corresponding pixels of X1 and X2 to obtain a similarity coefficient matrix SIM.
1.1) For the remote sensing images X1 and X2 of time phase 1 and time phase 2 respectively, compute the mean values μ1(m,n) and μ2(m,n), the standard deviations σ1(m,n) and σ2(m,n), and the variances σ1²(m,n) and σ2²(m,n) of the local regions centered at pixel (m,n) with a window size of w×w pixels, where m and n are the row and column indices of the image, m = 1, 2, ..., I, n = 1, 2, ..., J; as a compromise between localization accuracy and noise robustness, the window size w ranges from 3 to 9, and w = 5 in this example;
1.2) Compute the structural similarity coefficient SIM(m,n) at the corresponding pixel (m,n) of the time-phase-1 and time-phase-2 remote sensing images X1 and X2 from the above local statistics, where C is a constant used to avoid instability when the denominator approaches zero, with C > 0 (C = 1 in this example), and λ is a weight coefficient (λ = 0.95 in this example); the resulting structural similarity coefficient satisfies 0 ≤ SIM(m,n) ≤ 1, and the closer SIM(m,n) is to 0, the less similar X1 and X2 are at pixel (m,n) and the more likely that pixel belongs to the changed class;
1.3) Compute the structural similarity coefficients of all pixels in the image according to steps 1.1) and 1.2) to obtain a similarity coefficient matrix SIM = {SIM(m,n) | 1 ≤ m ≤ I, 1 ≤ n ≤ J}.
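The text lists the local statistics (means, standard deviations and variances over a w×w window) and the parameters C and λ that enter SIM(m,n), but the combining formula is not spelled out here. The sketch below therefore assumes an SSIM-like combination of a mean term and a standard-deviation term weighted by λ; the function name and the use of scipy.ndimage.uniform_filter are illustrative choices, not part of the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def structural_similarity_map(x1, x2, w=5, C=1.0, lam=0.95):
    """Per-pixel structural similarity of two co-registered images (step 1)."""
    x1 = x1.astype(np.float64)
    x2 = x2.astype(np.float64)
    # Local means and variances over a w x w window
    mu1 = uniform_filter(x1, size=w)
    mu2 = uniform_filter(x2, size=w)
    var1 = uniform_filter(x1 * x1, size=w) - mu1 * mu1
    var2 = uniform_filter(x2 * x2, size=w) - mu2 * mu2
    sigma1 = np.sqrt(np.maximum(var1, 0.0))
    sigma2 = np.sqrt(np.maximum(var2, 0.0))
    # Assumed SSIM-like combination of a mean term and a standard-deviation term;
    # C keeps the denominators away from zero, lam weights the two terms.
    mean_term = (2 * mu1 * mu2 + C) / (mu1 ** 2 + mu2 ** 2 + C)
    std_term = (2 * sigma1 * sigma2 + C) / (var1 + var2 + C)
    sim = lam * mean_term + (1 - lam) * std_term
    return np.clip(sim, 0.0, 1.0)  # 0 <= SIM(m,n) <= 1
```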
Step 2. Linearly map the structural similarity coefficient SIM(m,n) of the similarity coefficient matrix SIM at position (m,n) to the interval [0,255] to obtain the pixel gray value XS(m,n) of the similarity difference map XS at position (m,n):
XS(m,n) = (1 - SIM(m,n)) × 255.
Step 3. Compute the absolute difference between the gray values X1(m,n) and X2(m,n) of the pixels at corresponding position (m,n) of the time-phase-1 and time-phase-2 remote sensing images X1 and X2, obtaining Xd(m,n) = |X1(m,n) - X2(m,n)|; computing this difference for all corresponding pixel positions of X1 and X2 in order from left to right and top to bottom yields a subtraction difference map Xd:
Xd = {Xd(m,n) | 1 ≤ m ≤ I, 1 ≤ n ≤ J}.
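Steps 2 and 3 translate directly into array operations; a minimal sketch, reusing the structural_similarity_map helper assumed above:

```python
def similarity_difference_map(x1, x2, w=5, C=1.0, lam=0.95):
    """Step 2: map SIM(m,n) linearly into [0, 255] to form XS."""
    return (1.0 - structural_similarity_map(x1, x2, w=w, C=C, lam=lam)) * 255.0

def subtraction_difference_map(x1, x2):
    """Step 3: absolute gray-value difference Xd(m,n) = |X1(m,n) - X2(m,n)|."""
    return np.abs(x1.astype(np.float64) - x2.astype(np.float64))
```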
Step 4. Apply a median filter with a 3×3 window to the subtraction difference map Xd to obtain the filtered difference map Xf, and perform statistical histogram threshold segmentation on Xf to obtain the initial classification map Xm.
4.1) Apply a median filter with a 3×3 window to the subtraction difference map Xd to obtain the filtered difference map Xf;
4.2) For the filtered difference map Xf, set the low threshold Tl to the smallest gray level that satisfies the cumulative pixel-count condition, where L denotes the maximum gray level up to which the pixel counts are accumulated, ranging from 0 to 255, Nx denotes the total number of pixels with gray level x, and λ is a proportional constant in the range 0.5 to 0.6 (λ = 0.5 in this example);
4.3) For the filtered difference map Xf, set the statistical histogram threshold to the smallest gray level that is greater than the low threshold Tl and satisfies the condition Nx < (1-λ)×I×J/(255-Tl);
4.4) Segment the filtered difference map Xf with the statistical histogram threshold to obtain the initial classification map Xm,
where Xm(m,n) is the pixel value at (m,n) in the initial classification map Xm and Xf(m,n) is the gray value at (m,n) in the filtered difference map Xf; the label 1 denotes the changed class and the label 0 denotes the unchanged class.
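A sketch of step 4 follows. The exact cumulative-count condition in 4.2) is not reproduced in the text, so the low threshold below is taken as the smallest gray level at which the accumulated histogram reaches a fraction λ of all pixels; that reading, and the helper name, are assumptions.

```python
from scipy.ndimage import median_filter

def initial_classification(xd, lam=0.5):
    """Step 4: median filtering followed by statistical histogram thresholding."""
    xf = median_filter(xd, size=3)                        # 4.1) 3x3 median filter
    hist, _ = np.histogram(xf, bins=256, range=(0, 256))  # N_x for x = 0..255
    total = xf.size                                       # I * J pixels

    # 4.2) low threshold T_l: assumed to be the smallest gray level at which
    # the accumulated pixel count reaches lam * I * J
    cumulative = np.cumsum(hist)
    t_low = int(np.argmax(cumulative >= lam * total))

    # 4.3) histogram threshold: smallest gray level above T_l whose count
    # falls below (1 - lam) * I * J / (255 - T_l)
    limit = (1 - lam) * total / (255 - t_low)
    t_hist = t_low + 1
    for x in range(t_low + 1, 256):
        if hist[x] < limit:
            t_hist = x
            break

    # 4.4) pixels above the histogram threshold are labeled 1 (changed class)
    xm = (xf > t_hist).astype(np.uint8)
    return xf, xm
```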
Step 5. According to the gray-value range of the filtered difference map Xf and the initial classification map Xm, assign class labels to the pixels of the subtraction difference map Xd to obtain a class label map Xb.
5.1) Compute the area of each region formed by changed-class pixels in the initial classification map Xm; if the area is smaller than 70, set the labels of the pixels in that region of Xm to 2, otherwise leave the labels unchanged;
5.2) Compute the maximum gray value Nmax of the filtered difference map Xf; if Nmax is greater than the threshold T, the initial classification map Xm is taken as the class label map Xb; otherwise, go to step 5.3), where the gray-level threshold T is a constant ranging from 100 to 150 (T = 100 in this example);
5.3) Apply mathematical morphological dilation with a 3×3 square structuring element to the region formed by the pixels labeled 1 in the initial classification map Xm to obtain the small expanded image Xm1, whose pixel value at point (m,n) is denoted Xm1(m,n);
5.4) Apply mathematical morphological dilation with a 7×7 square structuring element to the region formed by the pixels labeled 1 in the initial classification map Xm to obtain the large expanded image Xm2, whose pixel value at point (m,n) is denoted Xm2(m,n);
5.5) Compute the difference between the pixel values of the large expanded image Xm2 and the small expanded image Xm1 at each corresponding position (m,n), obtaining Xm3(m,n) = Xm2(m,n) - Xm1(m,n), and thereby an expanded difference image Xm3 = {Xm3(m,n)}, where Xm3(m,n) is the pixel value of Xm3 at point (m,n);
5.6) Create a class label map Xb of the same size as the initial classification image and assign the pixel value Xb(m,n) at point (m,n) according to the following five cases (a code sketch of this step is given after the list):
For pixels (m,n) satisfying Xm(m,n) = 2, set the value Xb(m,n) of that pixel in the class label map Xb to 2;
For pixels (m,n) satisfying Xm(m,n) ≠ 2 and Xm3(m,n) = 1, set the value Xb(m,n) of that pixel in the class label map Xb to 3;
For pixels (m,n) satisfying Xm(m,n) ≠ 2 and Xm1(m,n) = 1, set the value Xb(m,n) of that pixel in the class label map Xb to 1;
For pixels (m,n) satisfying Xm(m,n) = 0, set the value Xb(m,n) of that pixel in the class label map Xb to 0;
For pixels (m,n) that satisfy none of the above four conditions, set the value Xb(m,n) of that pixel in the class label map Xb to 0.
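A sketch of step 5, assuming SciPy connected-component labeling and binary dilation; the function name and the use of np.select to apply the five cases in their listed order are illustrative:

```python
from scipy import ndimage

def class_label_map(xm, xf, t_gray=100, min_area=70):
    """Step 5: build the class label map Xb from the initial classification Xm."""
    xm = xm.copy()

    # 5.1) mark small changed regions (area < min_area pixels) with label 2
    regions, n_regions = ndimage.label(xm == 1)
    areas = np.bincount(regions.ravel(), minlength=n_regions + 1)
    small_labels = np.nonzero(areas < min_area)[0]
    small_labels = small_labels[small_labels != 0]        # skip background label 0
    xm[np.isin(regions, small_labels)] = 2

    # 5.2) if the filtered map already exceeds the gray-level threshold, Xm is Xb
    if xf.max() > t_gray:
        return xm

    changed = (xm == 1)
    xm1 = ndimage.binary_dilation(changed, structure=np.ones((3, 3), bool))  # 5.3)
    xm2 = ndimage.binary_dilation(changed, structure=np.ones((7, 7), bool))  # 5.4)
    xm3 = xm2 & ~xm1                                                          # 5.5)

    # 5.6) the five labeling cases, checked in the order they are listed
    xb = np.select(
        [xm == 2, (xm != 2) & xm3, (xm != 2) & xm1, xm == 0],
        [2, 3, 1, 0],
        default=0,
    ).astype(np.uint8)
    return xb
```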
Step 6. According to the pixel value at position (m,n) in the class label map Xb, filter the pixel at position (m,n) in the subtraction difference map Xd to obtain a denoised difference map XN.
The filtering operation is performed according to the following rules:
For pixels satisfying Xb(m,n) = 0, compute the median of all pixels within a 9×9 window centered at that pixel in the subtraction difference map Xd and assign that median to the pixel value XN(m,n) at point (m,n) of the denoised difference map XN;
For pixels satisfying Xb(m,n) = 1, assign the value Xd(m,n) of that pixel in the subtraction difference map Xd to the pixel value XN(m,n) at point (m,n) of the denoised difference map XN;
For pixels satisfying Xb(m,n) = 2, compute the median of all pixels within an 11×11 window centered at that pixel in the subtraction difference map Xd and assign that median to the pixel value XN(m,n) at point (m,n) of the denoised difference map XN;
For pixels satisfying Xb(m,n) = 3, compute the value obtained by filtering that pixel of the subtraction difference map Xd with a Gaussian kernel of scale 3×3 and assign that value to the pixel value XN(m,n) at point (m,n) of the denoised difference map XN;
The values XN(m,n) of all pixels (m,n) constitute the denoised difference map XN = {XN(m,n)}.
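A sketch of the label-guided filtering in step 6. Computing each filtered version over the whole image and then selecting per pixel, and reading the "3×3 Gaussian scale" as a Gaussian kernel with 3×3 support, are assumptions of this sketch:

```python
from scipy.ndimage import median_filter, gaussian_filter

def denoised_difference_map(xd, xb):
    """Step 6: filter Xd pixel-by-pixel according to the class label map Xb."""
    median9 = median_filter(xd, size=9)      # used where Xb = 0
    median11 = median_filter(xd, size=11)    # used where Xb = 2
    gauss3 = gaussian_filter(xd, sigma=1.0, truncate=1.0)  # ~3x3 support, Xb = 3

    xn = np.where(xb == 0, median9, xd)      # Xb = 1 keeps the original value
    xn = np.where(xb == 2, median11, xn)
    xn = np.where(xb == 3, gauss3, xn)
    return xn
```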
Step 7. Compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the similarity difference map XS, and then compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the denoised difference map XN.
7.1) Compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the similarity difference map XS, where T1, T2 and T3 are three different thresholds with T1 < T2 < T3 and T2 = (T1 + T3)/2, the thresholds being determined from the optimal threshold obtained by maximum-entropy threshold segmentation of the similarity difference map XS;
7.2) Compute the changed-class membership degree and the unchanged-class membership degree of the pixel at position (m,n) in the denoised difference map XN,
where XN(m,n) is the pixel value at (m,n) in the denoised difference map XN.
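The membership formulas themselves are not reproduced in the text; the sketch below assumes a common piecewise-linear form driven by the thresholds T1 and T3 (with T2 = (T1 + T3)/2 as their midpoint), the unchanged-class membership being the complement of the changed-class membership, and applies the same form to both XS and XN. All of this is an assumption, not the patented definition.

```python
def fuzzy_memberships(img, t1, t3):
    """Step 7 (assumed form): piecewise-linear changed / unchanged memberships.

    Gray values at or below t1 are treated as fully unchanged, values at or
    above t3 as fully changed, with a linear transition through the midpoint
    t2 = (t1 + t3) / 2.
    """
    img = img.astype(np.float64)
    changed = np.clip((img - t1) / float(t3 - t1), 0.0, 1.0)
    return changed, 1.0 - changed
```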
Step 8. Compute the fused membership value Hc(m,n) of the changed-class membership degrees of the corresponding pixel (m,n) in the similarity difference map XS and the denoised difference map XN, and then compute the fused membership value Hu(m,n) of the unchanged-class membership degrees of that pixel.
8.1) Compute the fused changed-class membership value Hc(m,n) from the changed-class membership degrees of the corresponding pixel (m,n) in the similarity difference map XS and the denoised difference map XN,
where β is the fusion weight of the changed-class and unchanged-class membership degrees of the similarity difference map (β = 0.4 in this example).
8.2) Compute the fused unchanged-class membership value Hu(m,n) from the unchanged-class membership degree of the similarity difference map XS and the unchanged-class membership degree of the denoised difference map XN at pixel (m,n).
Step 9. Create a fused image XI of the same size as the similarity difference map XS; if Hc(m,n) > Hu(m,n), assign the value XI(m,n) of the fused image XI at pixel (m,n) as 1, otherwise as 0, thereby obtaining a change detection result image XI = {XI(m,n)}.
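The fusion formulas of step 8 are also not reproduced in the text; the sketch below assumes a convex combination weighted by β, one natural reading of a "fusion weight", and wires steps 7-9 together. A call such as fuse_and_classify(similarity_difference_map(x1, x2), denoised_difference_map(xd, xb), t1, t3) would then yield the binary change map of step 9; all names are illustrative.

```python
def fuse_and_classify(xs, xn, t1, t3, beta=0.4):
    """Steps 8-9 (assumed fusion): beta-weighted combination of the two
    membership maps, then a per-pixel comparison of the fused memberships."""
    c_s, u_s = fuzzy_memberships(xs, t1, t3)   # similarity difference map XS
    c_n, u_n = fuzzy_memberships(xn, t1, t3)   # denoised difference map XN

    h_c = beta * c_s + (1 - beta) * c_n        # fused changed-class membership
    h_u = beta * u_s + (1 - beta) * u_n        # fused unchanged-class membership

    # Step 9: label a pixel as changed (1) where Hc(m,n) > Hu(m,n)
    return (h_c > h_u).astype(np.uint8)
```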
The effect of the present invention can be further illustrated by the following experimental results and analysis:
1. Experimental Data
Fig. 2(a) and Fig. 2(b) are the first group of remote sensing data. This group of real remote sensing data consists of two Landsat-7 ETM+ (Enhanced Thematic Mapper Plus) band-4 remote sensing images of the suburbs of Mexico acquired in April 2000 and May 2002. Both Fig. 2(a) and Fig. 2(b) are 512×512 in size with 256 gray levels; the changed regions are mainly due to a fire that destroyed a large area of local vegetation. Fig. 2(c) is the corresponding change reference map, in which the white regions indicate changed areas; the number of changed pixels is 25599 and the number of unchanged pixels is 236545.
Fig. 3(a) and Fig. 3(b) are the second group of remote sensing data. This group of real remote sensing data consists of two-temporal Landsat-5 TM band-4 spectral images of the western part of Elba Island, Italy, acquired in August 1994 and September 1994. Both Fig. 3(a) and Fig. 3(b) are 326×414 in size with 256 gray levels; the changed regions were caused by a forest fire. Fig. 3(c) is the corresponding change reference map, in which the white regions indicate changed areas; the number of changed pixels is 2415 and the number of unchanged pixels is 99985.
2. Comparative Experiments
To demonstrate the effectiveness of the present invention, it is compared with the following three comparison methods.
Comparison method 1 fuses the fuzzy memberships of the subtraction difference map and the similarity difference map. The comparison between the present invention and comparison method 1 verifies the effectiveness of processing the subtraction difference map according to the label map in the present invention.
Comparison method 2 replaces the approach of He (2010) in "Application of Euclidean Norm in Multi-temporal Remote Sensing Image Change Detection," which computes the Euclidean distance between corresponding pixels of multiple bands of the two-temporal remote sensing images, with computing the Euclidean distance between single-band pixel neighborhoods of the two temporal images to construct the difference map. The comparison between the present invention and comparison method 2 verifies the effectiveness of the similarity difference map constructed in the present invention.
Comparison method 3 is the change detection method proposed by Du et al. (2012) in "Fusion of Difference Images for Change Detection over Urban Areas," which performs feature-level fusion of multiple difference maps, the feature-level fusion also using fuzzy membership fusion. The comparison between the present invention and comparison method 3 verifies the effectiveness of fusing the similarity difference map and the denoised difference map for change detection in the present invention.
3. Experimental Content and Analysis
Experiment 1: change detection was performed on the first group of remote sensing images with the present invention and comparison methods 1, 2 and 3; the results are shown in Fig. 4, where:
Fig. 4(a) is the change detection result of comparison method 1;
Fig. 4(b) is the change detection result of comparison method 2;
Fig. 4(c) is the change detection result of comparison method 3;
Fig. 4(d) is the change detection result of the present invention.
As can be seen from Fig. 4, in the detection results of the three comparison methods, besides the changed regions, very small spurious spots are scattered over the whole image; compared with the three comparison methods, the result of the present invention preserves the shape of the changed regions well, retains relatively complete edge information, and contains very little false change information such as spurious spots.
Experiment 2: change detection was performed on the second group of remote sensing images with the present invention and comparison methods 1, 2 and 3; the results are shown in Fig. 5, where:
Fig. 5(a) is the change detection result of comparison method 1;
Fig. 5(b) is the change detection result of comparison method 2;
Fig. 5(c) is the change detection result of comparison method 3;
Fig. 5(d) is the change detection result of the present invention.
As can be seen from Fig. 5, the result of comparison method 1 detects only part of the changed region, with many missed detections; the result of comparison method 2 preserves the overall shape of the changed region fairly well, but contains some small patches of false change information; the changed region detected by comparison method 3 is a white area covering almost the entire image, so the detection result cannot preserve the basic shape of the changed region and is very poor; compared with the three comparison methods, the result of the present invention preserves the shape of the changed region well and contains no false change information such as spurious spots.
Table 1 lists the quantitative results of the present invention and the three comparison methods on the first and second groups of remote sensing images.
Table 1
As can be seen from Table 1, for the first group of remote sensing images, the present invention has fewer missed detections than comparison method 1, slightly more false detections, and fewer total errors; it has more missed detections than comparison method 2 but far fewer false detections, and fewer total errors; it has slightly more missed detections than comparison method 3 but far fewer false detections, and fewer total errors. For the second group of remote sensing images, the present invention has very few total errors and the highest accuracy.
Overall, therefore, the present invention can obtain relatively accurate change information, has strong noise resistance, can effectively remove spurious spots while preserving good edge information of the changed regions, and is superior both in visual effect and in performance indices.
Claims (4)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201310117627.0A CN103198482B (en) | 2013-04-07 | 2013-04-07 | Remote sensing image change detection method based on fuzzy membership fusion of difference maps |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201310117627.0A CN103198482B (en) | 2013-04-07 | 2013-04-07 | Remote sensing image change detection method based on fuzzy membership fusion of difference maps |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN103198482A | 2013-07-10 |
| CN103198482B | 2015-10-28 |
Family

ID=48720988

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201310117627.0A (Expired - Fee Related) | CN103198482B (en) Remote sensing image change detection method based on fuzzy membership fusion of difference maps | 2013-04-07 | 2013-04-07 |

Country Status (1)

| Country | Link |
|---|---|
| CN (1) | CN103198482B (en) |
Families Citing this family (10)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103632155B * | 2013-12-16 | 2016-08-17 | Wuhan University | Remote sensing image variation detection method based on slow feature analysis |
| CN103871039B * | 2014-03-07 | 2017-02-22 | Xidian University | Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection |
| CN104700374A * | 2015-03-26 | 2015-06-10 | Dongguan Polytechnic | Scene image de-noising method based on Type-2 fuzzy logic system |
| CN109003230A * | 2018-06-07 | 2018-12-14 | Xidian University | Cherenkov fluorescence image impulse noise removal method and system |
| CN112104878B * | 2020-08-21 | 2024-09-13 | Xi'an Wanxiang Electronics Technology Co., Ltd. | Image coding method, device, coding end equipment and storage medium |
| CN114255174A * | 2020-09-25 | 2022-03-29 | PetroChina Company Limited | Three-dimensional fault information identification method and device |
| CN112995518A * | 2021-03-12 | 2021-06-18 | Beijing QIYI Century Science & Technology Co., Ltd. | Image generation method and device |
| CN113609990A * | 2021-08-06 | 2021-11-05 | Industrial and Commercial Bank of China | Method and device for determining construction progress of target building and server |
| CN114359693B * | 2021-12-10 | 2024-10-22 | China Three Gorges University | High-resolution remote sensing image change detection method based on super-pixel fuzzy clustering |
| CN115647696B * | 2022-12-14 | 2023-03-21 | China Huaxi Enterprises Co., Ltd. | Automatic machining device, machining method and machining terminal for large steel structure |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101950364A * | 2010-08-30 | 2011-01-19 | | Remote sensing image change detection method based on neighbourhood similarity and threshold segmentation |
| CN102169584A * | 2011-05-28 | 2011-08-31 | | Remote sensing image change detection method based on watershed and treelet algorithms |
| US8265356B2 * | 2008-01-30 | 2012-09-11 | Computerized Medical Systems, Inc. | Method and apparatus for efficient automated re-contouring of four-dimensional medical imagery using surface displacement fields |
| CN102968790A * | 2012-10-25 | 2013-03-13 | | Remote sensing image change detection method based on image fusion |

Family Cites Families (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120328161A1 * | 2011-06-22 | 2012-12-27 | Palenychka Roman | Method and multi-scale attention system for spatiotemporal change determination and object detection |

2013-04-07: Application CN201310117627.0A filed; granted as patent CN103198482B (en); status not active, Expired - Fee Related.
Non-Patent Citations (1)

| Title |
|---|
| Xin Fangfang, "SAR image change detection based on the Memetic algorithm," Journal of Infrared and Millimeter Waves, vol. 31, no. 1, pp. 67-72, February 2012. * |
Also Published As

| Publication number | Publication date |
|---|---|
| CN103198482A | 2013-07-10 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | C06 | Publication | |
| | PB01 | Publication | |
| | C10 | Entry into substantive examination | |
| | SE01 | Entry into force of request for substantive examination | |
| | C14 | Grant of patent or utility model | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20151028; Termination date: 20200407 |