CN104123561B - Fuzzy C-mean algorithm remote sensing image automatic classification method based on spatial attraction model
- Publication number: CN104123561B (application CN201410325747.4A, filed 2014-07-10; published as CN104123561A on 2014-10-29; granted as CN104123561B on 2018-04-13)
- Authority: CN (China)
- Legal status: Active
Abstract
An automatic fuzzy C-means classification method for remote sensing images based on a spatial attraction model, suitable for the automatic segmentation and classification of remote sensing images, medical images and other images. The steps are: determine the number of pixels of the remote sensing digital image and cluster the image with the standard FCM model to obtain an initialization; then compute, in turn, the spatial attraction between each pixel and the other pixels in its neighborhood window and the spatial constraint penalty factor, and from these obtain the fuzzy factor; add the fuzzy factor to the standard FCM model to obtain a new clustering objective function, and iteratively update the fuzzy membership matrix and the cluster centers until the cluster centers no longer change or the maximum number of iterations is reached; finally, using the resulting fuzzy membership matrix U = {u_ki}c×N and the maximum-membership criterion, assign a class label to every pixel of the remote sensing image to form a thematic classification map, thereby realizing automatic classification of remote sensing digital images. The method is simple, highly automated, insensitive to image noise, and yields high segmentation and classification accuracy.
Description
Technical Field
The invention relates to an automatic classification method for remote sensing images, and in particular to a fuzzy C-means automatic classification method for remote sensing images based on a spatial attraction model, for use in the automatic segmentation and classification of remote sensing images, medical images and other images.
Background Art
Classification of remote sensing data is an important technique for extracting thematic category information from remote sensing data and provides a rich data source for information extraction in many industries. Current classification methods are mainly divided into supervised and unsupervised methods. Compared with supervised methods, unsupervised methods can extract information from remote sensing data without prior knowledge and therefore occupy a very important position in remote sensing data classification.
Existing unsupervised classification methods mainly include ISODATA, KNN, K-means, Markov random fields (MRF) and the fuzzy C-means clustering algorithm (FCM). Among them, FCM is a prototype-based clustering algorithm that is simple, efficient and adaptable to data. Compared with hard classification methods, FCM yields the membership degree of every pixel to every class, preserves as much image information as possible, and is therefore better suited to representing remote sensing images containing many mixed pixels. The standard FCM algorithm, proposed by Dunn and later generalized by Bezdek, is an iterative optimization method. However, standard FCM clustering ignores the interaction between adjacent pixels and produces large errors when segmenting images with a low signal-to-noise ratio. To overcome the high noise sensitivity of standard FCM, many researchers have added spatial constraints on the image to the standard FCM objective function, leading to improved FCM clustering algorithms such as FCM_S, FCM_S1, FCM_S2, BCFCM, GG-FCM, EnFCM and FGFCM. These improved algorithms raise the performance of standard FCM to a certain extent, but their effectiveness depends heavily on parameters such as the window size and scale factor, whose selection is highly uncertain, so the generality of these algorithms remains to be verified. More recently, the fuzzy local information C-means clustering algorithm (FLICM) was proposed; its local constraint integrates local spatial information and pixel features simultaneously and requires no additional parameters. However, when the image noise is heavy, the segmentation accuracy of FLICM is still low, mainly because FLICM considers only the spatial distance between the central pixel and its neighboring pixels and the membership values of the neighboring pixels, while ignoring the membership value of the central pixel, and the proposed fuzzy factor has no physical meaning. In FLICM segmentation and classification results, region edges are over-smoothed and a large amount of detail is lost. Therefore, an efficient FCM algorithm that preserves edge detail while exploiting local spatial information is urgently needed.
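For reference, the standard FCM formulation referred to throughout this section can be written as follows (well-known textbook forms, reproduced here for readability rather than quoted from the patent):

```latex
% Standard FCM objective (Dunn/Bezdek) and its alternating updates.
J_m(U,V)=\sum_{k=1}^{c}\sum_{i=1}^{N}u_{ki}^{m}\,\lVert x_i-v_k\rVert^{2},
\qquad \text{s.t.}\ \sum_{k=1}^{c}u_{ki}=1,\ \ u_{ki}\in[0,1],

% Updates obtained by alternating minimization:
v_k=\frac{\sum_{i=1}^{N}u_{ki}^{m}\,x_i}{\sum_{i=1}^{N}u_{ki}^{m}},
\qquad
u_{ki}=\left[\sum_{l=1}^{c}\left(\frac{\lVert x_i-v_k\rVert^{2}}{\lVert x_i-v_l\rVert^{2}}\right)^{\frac{1}{m-1}}\right]^{-1}.
```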
Summary of the Invention
Technical problem: the main purpose of the present invention is to overcome the deficiencies of the prior art and to provide a fuzzy C-means automatic classification method for remote sensing images based on a spatial attraction model that is simple, highly automated, insensitive to image noise, and achieves high segmentation and classification accuracy.
Technical solution: to achieve the above purpose, the fuzzy C-means remote sensing image automatic classification method based on a spatial attraction model of the present invention comprises the following steps:
Step 1: Obtain the remote sensing digital image to be classified, determine the number of pixels of the image from its size and number of bands, determine the number of cluster classes c and the fuzzy index m according to the actual classification needs, and cluster the image with the standard FCM model to obtain the initial fuzzy membership matrix U0 = {u_ki}c×N and the initial cluster centers V0 = {v_k}c, where N is the number of pixels of the image to be classified, u_ki is the membership degree of the i-th pixel to the k-th class, and v_k is the center of the k-th class;
Step 2: Set the neighborhood window size and, taking each pixel of the remote sensing digital image in turn as the central pixel, determine the pixels that fall inside its neighborhood window; then compute, by the spatial attraction formula, the spatial attraction NA_ij between each pixel and every other pixel in its neighborhood window,
where G is a constant that adjusts the contribution of the spatial constraint to the clustering objective function and is generally set to G = 1, u_ki is the membership degree of the central pixel x_i of the neighborhood window to the k-th class, u_kj is the membership degree of the j-th pixel x_j in the neighborhood window to the k-th class, and R_ij is the Euclidean distance between pixel x_i and pixel x_j;
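Consistent with the gravitational analogy and the variable definitions above, the attraction presumably takes the following form for the class k under consideration (an assumed reconstruction for readability, not a verbatim quotation of the patent's equation):

```latex
% Assumed form of the spatial attraction between the central pixel x_i and a
% neighboring pixel x_j: membership degrees play the role of masses and the
% Euclidean pixel distance the role of the separation.
NA_{ij} = G\,\frac{u_{ki}\,u_{kj}}{R_{ij}^{2}}, \qquad j \in N_i,\ j \neq i .
```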
Step 3: Using the spatial attraction NA_ij between each pixel of the remote sensing digital image and the other pixels in its neighborhood window, obtain by the corresponding formula the spatial constraint penalty factor w_ij between pixels,
where w_ij is the influence weight of the j-th pixel x_j in the neighborhood window on the central pixel x_i, N_i denotes the pixels in the neighborhood window of the central pixel x_i, NA_ij is the spatial attraction computed in Step 2 between each pixel x_i and the remaining pixels x_j in the neighborhood window centered on x_i, and |x_i − x_j| is the difference between the gray values of pixel x_i and pixel x_j in that window;
Step 4: Obtain the fuzzy factor of every pixel with respect to every class from the penalty factors w_ij by the corresponding formula;
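A minimal NumPy sketch of Steps 2 to 4 for one central pixel and one class is given below. The exact attraction, weight and fuzzy-factor expressions are not reproduced here, so the forms used (gravity-style attraction, attraction damped by the gray-value difference, and an FLICM-style penalty term) are illustrative assumptions, and the function and variable names are not taken from the patent.

```python
import numpy as np

def fuzzy_factor_for_pixel(img, u, i_row, i_col, k, v_k, m=2.0, win=3, G=1.0):
    """Illustrative sketch of Steps 2-4 for the central pixel (i_row, i_col) and class k.

    img : 2-D array of gray values (one band)
    u   : membership array of shape (c, H, W)
    v_k : current center value of class k
    The attraction / weight / penalty forms below are assumptions, not the
    patent's verbatim equations.
    """
    H, W = img.shape
    r = win // 2
    factor = 0.0
    for dr in range(-r, r + 1):
        for dc in range(-r, r + 1):
            if dr == 0 and dc == 0:
                continue
            jr, jc = i_row + dr, i_col + dc
            if not (0 <= jr < H and 0 <= jc < W):
                continue
            R_ij = np.hypot(dr, dc)                       # Euclidean pixel distance
            # Step 2 (assumed gravity-style form): NA_ij = G * u_ki * u_kj / R_ij^2
            na = G * u[k, i_row, i_col] * u[k, jr, jc] / R_ij**2
            # Step 3 (assumed): attraction damped by the gray-value difference |x_i - x_j|
            w_ij = na / (abs(img[i_row, i_col] - img[jr, jc]) + 1.0)
            # Step 4 (assumed FLICM-style penalty): w_ij * (1 - u_kj)^m * (x_j - v_k)^2
            factor += w_ij * (1.0 - u[k, jr, jc]) ** m * (img[jr, jc] - v_k) ** 2
    return factor
```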
Step 5: Add the fuzzy factor to the standard FCM model by the corresponding formula to obtain the new clustering objective function,
where N is the number of pixels of the remote sensing digital image to be classified, c is the number of cluster classes, u_ki is the membership degree of the i-th pixel to the k-th class, m is the fuzzy index, x_i is the i-th pixel of the image, and v_k is the center of the k-th class;
combining the initial fuzzy matrix U0 and the initial cluster centers V0 obtained in Step 1, compute by the corresponding formula the new cluster centers of the image to be classified, denoted V(t), the cluster centers obtained in the current iteration;
then compute by the corresponding formula the new membership matrix of the image to be classified, denoted U(t), the fuzzy matrix obtained in the current iteration; in the first iteration V(t) is computed from the initial cluster centers V0 and U(t) from the initial fuzzy matrix U0, while V(t−1) and U(t−1) denote the cluster centers and fuzzy matrix obtained in the previous iteration;
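Written out under the assumption that the new objective and its alternating updates follow the FLICM pattern, with an attraction-based fuzzy factor G'_ki built from the weights w_ij of Step 3 (an assumed reconstruction consistent with the variable definitions above, not a quotation of the patent's equations):

```latex
% Assumed objective with the attraction-based fuzzy factor G'_{ki}
J_m=\sum_{k=1}^{c}\sum_{i=1}^{N}\Bigl[\,u_{ki}^{m}\,\lVert x_i-v_k\rVert^{2}+G'_{ki}\Bigr],
\qquad
G'_{ki}=\sum_{j\in N_i,\ j\neq i} w_{ij}\,(1-u_{kj})^{m}\,\lVert x_j-v_k\rVert^{2},

% Assumed alternating updates (same pattern as FLICM):
v_k^{(t)}=\frac{\sum_{i=1}^{N}\bigl(u_{ki}^{(t-1)}\bigr)^{m}x_i}{\sum_{i=1}^{N}\bigl(u_{ki}^{(t-1)}\bigr)^{m}},
\qquad
u_{ki}^{(t)}=\left[\sum_{l=1}^{c}\left(\frac{\lVert x_i-v_k^{(t)}\rVert^{2}+G'_{ki}}{\lVert x_i-v_l^{(t)}\rVert^{2}+G'_{li}}\right)^{\frac{1}{m-1}}\right]^{-1}.
```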
Step 6: Determine whether the cluster centers are still changing and whether the maximum number of iterations has been reached. Set the iteration stop threshold to the small positive number ε = 1e−5 and compare the cluster centers V(t) of the t-th iteration with the cluster centers V(t−1) of the (t−1)-th iteration. If ‖V(t) − V(t−1)‖ < ε or b > T, where the iteration counter b is initialized to 0 and T = 100, the iteration ends; otherwise set b = b + 1, replace the initial fuzzy matrix U0 and cluster centers V0 with the currently obtained U(t) and V(t), return to Step 2, and repeat Steps 2 to 6 until ‖V(t) − V(t−1)‖ < ε or b > T;
Step 7: Using the finally obtained fuzzy membership matrix U = {u_ki}c×N, determine the class of each pixel x_i according to the following formula, i.e. the class c_i of pixel x_i is the class with the largest membership degree u_ki:
c_i = arg max_k {u_ki}, k = 1, 2, …, c
Step 8: Assign a different color to each cluster class according to the class c_i of every pixel x_i, forming a thematic classification map of the remote sensing image and thereby realizing automatic classification of the remote sensing digital image.
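A short sketch of Steps 7 and 8 (maximum-membership labeling and rendering of the thematic map) is given below; the random color palette is an arbitrary illustrative choice, not one prescribed by the patent.

```python
import numpy as np

def label_and_render(u, height, width, palette=None):
    """u: fuzzy membership matrix of shape (c, N). Returns per-pixel labels
    and an RGB thematic map of shape (height, width, 3)."""
    c, _ = u.shape
    labels = np.argmax(u, axis=0)                          # Step 7: c_i = argmax_k u_ki
    if palette is None:                                    # arbitrary illustrative colors
        palette = np.random.default_rng(0).integers(0, 256, size=(c, 3), dtype=np.uint8)
    thematic = palette[labels].reshape(height, width, 3)   # Step 8: thematic map
    return labels.reshape(height, width), thematic
```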
Beneficial effects: by computing the spatial constraint penalty factor w_ij between pixels of the remote sensing digital image and taking the gray-level relationship of local pixels into account, the present invention constructs a spatial-attraction-based fuzzy factor that reflects the spatial context of every pixel and is used to suppress noise. Besides the spatial distance between the central pixel and its neighbors and the membership values of the neighboring pixels, the membership value of the central pixel is also considered, and the introduced fuzzy factor has a physical meaning, so edge detail is preserved while local spatial information is exploited. The invention uses the spatial attraction NA_ij between the central pixel and the other pixels in its neighborhood window to adaptively determine the degree of influence of the neighboring pixels on the central pixel, without the need to learn empirical parameter values; introducing the attraction-based fuzzy factor into the objective function makes the classification robust to noisy remote sensing digital images while handling image detail well. The method is simple, requires the same parameters as standard FCM and no others, is little affected by image noise, achieves high segmentation and classification accuracy, and has wide practical applicability.
Description of the Drawings:
Fig. 1 is a flow chart of the present invention;
Fig. 2 is a schematic diagram of the pixel neighborhood window structure used by the present invention;
Fig. 3 compares the segmentation results of the present invention and of the prior art on the remote sensing image of Embodiment 1;
Fig. 4 compares the segmentation results of the present invention and of the prior art on the remote sensing image of Embodiment 2.
Detailed Description of the Embodiments:
Embodiments of the present invention are further described below with reference to the accompanying drawings:
As shown in Fig. 1, the fuzzy C-means remote sensing image automatic classification method based on a spatial attraction model of the present invention comprises the following steps:
Step 1: Obtain the remote sensing digital image (P, Q, S) to be classified, where P is the number of rows, Q the number of columns and S the number of bands of the image. According to the image size and the number of bands, convert the image (P, Q, S) into a 2-D matrix of the form (S, P×Q), i.e. consider only the number of pixels and the number of bands of the image to be classified, and determine the number of cluster classes c and the fuzzy index m of the image (S, P×Q) according to the actual classification needs. Cluster the remote sensing digital image (S, P×Q) with the standard FCM model to obtain the initial fuzzy membership matrix U0 = {u_ki}c×N and the initial cluster centers V0 = {v_k}c, where N is the number of pixels of the image to be classified, u_ki is the membership degree of the i-th pixel to the k-th class, and v_k is the center of the k-th class;
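A small sketch of this reshaping, assuming the image is held as a NumPy array of shape (P, Q, S); the function name is illustrative only:

```python
import numpy as np

def to_feature_matrix(image):
    """Reshape an image of shape (P, Q, S) into the 2-D matrix (S, P*Q) used for
    clustering: one row per band, one column per pixel."""
    P, Q, S = image.shape
    return image.reshape(P * Q, S).T          # shape (S, P*Q)
```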
Step 2: Set the neighborhood window size; Fig. 2 shows a window of size 5, in which the numbers 1, 2, 4, … mark the neighborhood positions around the central pixel x. According to the set window size, determine, for each pixel of the remote sensing digital image (S, P×Q) taken as the central pixel, the pixels that fall inside its neighborhood window, and compute, by the spatial attraction formula, the spatial attraction NA_ij between each pixel and the other pixels in its neighborhood window,
where G is a constant that adjusts the contribution of the spatial constraint to the clustering objective function and is generally set to G = 1, u_ki is the membership degree of the central pixel x_i of the neighborhood window to the k-th class, u_kj is the membership degree of the j-th pixel x_j in the neighborhood window to the k-th class, and R_ij is the Euclidean distance between pixel x_i and pixel x_j;
Step 3: Using the spatial attraction NA_ij between each pixel of the remote sensing digital image and the other pixels in its neighborhood window, obtain by the corresponding formula the spatial constraint penalty factor w_ij between pixels,
where w_ij is the influence weight of the j-th pixel x_j in the neighborhood window on the central pixel x_i, N_i denotes the pixels in the neighborhood window of the central pixel x_i, NA_ij is the spatial attraction computed in Step 2 between each pixel x_i and the remaining pixels x_j in the neighborhood window centered on x_i, and |x_i − x_j| is the difference between the gray values of pixel x_i and pixel x_j in that window;
Step 4: Obtain the fuzzy factor of every pixel with respect to every class from the penalty factors w_ij by the corresponding formula;
Step 5: Add the fuzzy factor to the standard FCM model by the corresponding formula to obtain the clustering objective function that includes the fuzzy factor,
where N is the number of pixels of the remote sensing digital image to be classified, i.e. N = P×Q; c is the number of cluster classes, u_ki is the membership degree of the i-th pixel to the k-th class, m is the fuzzy index, x_i is the i-th pixel of the image, and v_k is the center of the k-th class;
combining the initial fuzzy matrix U0 and the initial cluster centers V0 obtained in Step 1, compute by the corresponding formula the new cluster centers of the image to be classified, denoted V(t), the cluster centers obtained in the current iteration;
then compute by the corresponding formula the new membership matrix of the image to be classified, denoted U(t), the fuzzy matrix obtained in the current iteration; in the first iteration V(t) is computed from the initial cluster centers V0 and U(t) from the initial fuzzy matrix U0, while V(t−1) and U(t−1) denote the cluster centers and fuzzy matrix obtained in the previous iteration;
Step 6: Determine whether the cluster centers are still changing and whether the maximum number of iterations has been reached. Set the iteration stop threshold to the small positive number ε = 1e−5 and compare the cluster centers V(t) of the t-th iteration with the cluster centers V(t−1) of the (t−1)-th iteration. If ‖V(t) − V(t−1)‖ < ε or b > T, where the iteration counter b is initialized to 0 and T = 100, the iteration ends; otherwise set b = b + 1, replace the initial fuzzy matrix U0 and cluster centers V0 with the currently obtained U(t) and V(t), return to Step 2, and repeat Steps 2 to 6 until ‖V(t) − V(t−1)‖ < ε or b > T;
Step 7: Using the finally obtained fuzzy membership matrix U = {u_ki}c×N, determine the class of each pixel x_i according to the following formula, i.e. the class c_i of pixel x_i is the class with the largest membership degree u_ki:
c_i = arg max_k {u_ki}, k = 1, 2, …, c
Step 8: Assign a different color to each cluster class according to the class c_i of every pixel x_i, forming a thematic classification map of the remote sensing image and thereby realizing automatic classification of the remote sensing digital image.
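Putting the steps together, a compact end-to-end driver might look as follows. It uses the same assumed formulas as the sketches above (gravity-style attraction, gray-value-damped weights, FLICM-style penalty and updates) on a single-band image, so it is an illustrative reconstruction rather than the patent's reference implementation (which, as noted below, was written in MATLAB 7.8).

```python
import numpy as np

def flnaicm_classify(img, c=4, m=2.0, win=3, G=1.0, eps=1e-5, T=100, seed=0):
    """Illustrative end-to-end sketch on a single-band image of shape (H, W);
    the attraction / weight / penalty forms are assumptions (see above)."""
    H, W = img.shape
    x = img.reshape(-1).astype(float)                # Step 1: N pixels as a vector
    N = x.size
    rng = np.random.default_rng(seed)
    u = rng.random((c, N))
    u /= u.sum(axis=0)                               # random initial memberships
    v = (u**m @ x) / (u**m).sum(axis=1)              # initial centers (standard FCM form)
    r = win // 2
    for b in range(T):                               # Step 6: at most T iterations
        gki = np.zeros((c, N))                       # Steps 2-4: fuzzy factors
        for i in range(N):
            ir, ic = divmod(i, W)
            for dr in range(-r, r + 1):
                for dc in range(-r, r + 1):
                    if dr == 0 and dc == 0:
                        continue
                    jr, jc = ir + dr, ic + dc
                    if not (0 <= jr < H and 0 <= jc < W):
                        continue
                    j = jr * W + jc
                    na = G * u[:, i] * u[:, j] / (dr * dr + dc * dc)  # assumed attraction
                    w_ij = na / (abs(x[i] - x[j]) + 1.0)              # assumed weight
                    gki[:, i] += w_ij * (1 - u[:, j])**m * (x[j] - v)**2
        # Step 5: FLICM-style updates of memberships and centers
        d2 = (x[None, :] - v[:, None])**2 + gki + 1e-12
        u = 1.0 / ((d2[:, None, :] / d2[None, :, :])**(1.0 / (m - 1))).sum(axis=1)
        v_new = (u**m @ x) / (u**m).sum(axis=1)
        if np.linalg.norm(v_new - v) < eps:          # Step 6: centers have settled
            v = v_new
            break
        v = v_new
    return u.argmax(axis=0).reshape(H, W)            # Steps 7-8: maximum-membership labels
```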
All the algorithms of the present invention were implemented in MATLAB 7.8. Remote sensing data from two different areas were selected as validation data. The validation samples for classification were obtained after the original remote sensing images had been strictly geometrically corrected, with the correction error kept below 0.5 pixel, and the required validation samples were then derived from the corrected images by visual interpretation. Finally, the producer's accuracy, the overall accuracy and the Kappa coefficient were used to evaluate the accuracy of the classification results, and the FLNAICM of the present invention was compared with the standard FCM and FLICM algorithms.
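For reference, the three accuracy measures named above can be computed from a classification confusion matrix as in the following sketch (standard definitions, not code from the patent):

```python
import numpy as np

def accuracy_metrics(confusion):
    """confusion[i, j]: number of validation samples of true class i assigned to class j."""
    cm = np.asarray(confusion, dtype=float)
    total = cm.sum()
    overall = np.trace(cm) / total                       # overall accuracy
    producers = np.diag(cm) / cm.sum(axis=1)             # producer's accuracy per class
    expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total**2
    kappa = (overall - expected) / (1.0 - expected)      # Cohen's Kappa coefficient
    return overall, producers, kappa
```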
Embodiment 1: a QuickBird remote sensing image of 200×200 pixels with three bands (red, green and blue) and a spatial resolution of 0.61 m was used. The data cover an urban area of Xuzhou, Jiangsu, China, and were acquired in August 2005. Figs. 3(a) and 3(b) show the original image to be classified and the reference data image, respectively. The image was divided into 4 classes: structures, bare land, water and vegetation; the fuzzy index was set to 2.
Figs. 3(c)-(e) show the classification results of FCM, FLICM and FLNAICM, respectively. In Fig. 3(c), because of spectral similarity and image noise, FCM, which uses only the spectral characteristics of the image and ignores spatial context, produces many "salt-and-pepper" artifacts in the classification result. As Figs. 3(d) and 3(e) show, both FLICM and FLNAICM classify better than FCM: most of the isolated noisy pixels are removed and more homogeneous class regions are formed. Fig. 3(d), however, shows over-smoothing: large merged class patches appear and many details of the classified parcels are lost, for example at the locations marked A, B and C. Compared with FLICM, FLNAICM in Fig. 3(e) preserves much of this detail, for example at the locations marked A, B and C. The reason is that FLNAICM uses the spatial attraction between the central pixel and its neighboring pixels as the degree of spatial influence of the neighbors on the central pixel; this attraction fully accounts for the local spatial and gray-level relationships between the central pixel and its neighbors and is computed automatically from the pixel characteristics, whereas FLICM simply uses the spatial distance and membership degree between the central pixel and its neighbors. Table 1 gives the accuracy evaluation of the classification results of FCM, FLICM and FLNAICM; FLNAICM achieves the highest classification accuracy of the three.
Table 1. Number of validation samples, producer's accuracy, overall accuracy and Kappa coefficient for Embodiment 1
Embodiment 2
In Embodiment 2, a QuickBird remote sensing image of 200×200 pixels with three bands (red, green and blue) and a spatial resolution of 0.61 m was used. The data cover a suburban area of Xuzhou, Jiangsu, China, and were acquired in August 2005. Figs. 4(a) and 4(b) show the original image to be classified and the reference data image, respectively. The image was divided into 4 classes: road, bare land, water and vegetation; the fuzzy index was set to 2.
Figs. 4(c)-(e) show the classification results of FCM, FLICM and FLNAICM, respectively. In Fig. 4(c), because of spectral similarity and image noise, the FCM classification, which uses only the spectral characteristics of the image and ignores spatial context, contains a large amount of "salt-and-pepper" noise. As Figs. 4(d) and 4(e) show, both FLICM and FLNAICM classify better than FCM, and most of the isolated noisy pixels are removed. Fig. 4(d) shows over-smoothing, with the details of many classified parcels lost, for example at the locations marked A, B and C, whereas FLNAICM in Fig. 4(e) preserves much of this detail, for example at the locations marked A, B and C. The reason is that FLNAICM uses the spatial attraction between the central pixel and its neighboring pixels as the degree of spatial influence of the neighbors on the central pixel; this attraction fully accounts for the local spatial and gray-level relationships between the central pixel and its neighbors and is computed automatically from the pixel characteristics, whereas FLICM simply uses the spatial distance and membership degree between the central pixel and its neighbors. Table 2 gives the accuracy evaluation of the classification results of FCM, FLICM and FLNAICM; FLNAICM again achieves the highest classification accuracy.
Table 2. Number of validation samples, producer's accuracy, overall accuracy and Kappa coefficient for Embodiment 2
In summary, the present invention proposes the FLNAICM classification method based on the spatial attraction between neighboring pixels. It uses the attraction between the central pixel and its neighboring pixels to estimate the degree of influence of the neighbors on the central pixel, and thereby introduces spatial context information into standard FCM in a reasonable way, improving its classification accuracy and its robustness when classifying noisy images.
Citations
Patent citations:
- CN102289678A (Wuhan University), published 2011-12-21: Fuzzy supervised classification method for multiband remote sensing image based on non-equal weight distances.
- CN103530875A (Harbin Engineering University), published 2014-01-22: End member extraction data preprocessing method.
Non-patent citations:
- Stelios Krinidis, Vassilios Chatzis, "A Robust Fuzzy Local Information C-Means Clustering Algorithm", IEEE Transactions on Image Processing, vol. 19, no. 5, May 2010, pp. 1328-1342.
- Zhang Hua, "Research on Reliability Classification Methods for Remote Sensing Data", China Doctoral Dissertations Full-text Database (Information Science and Technology), no. 9, 15 June 2013, pp. 106-111.