CN110781954A - Adaptive fuzzy superpixel generation method for polarized SAR image classification - Google Patents

Adaptive fuzzy superpixel generation method for polarized SAR image classification

Info

Publication number
CN110781954A
CN110781954A (Application No. CN201911017077.9A)
Authority
CN
China
Prior art keywords
pixels
pixel
determined
superpixel
super
Prior art date
Legal status
Pending
Application number
CN201911017077.9A
Other languages
Chinese (zh)
Inventor
郭雨薇 (Guo Yuwei)
孙壮壮 (Sun Zhuangzhuang)
范林玉 (Fan Linyu)
焦李成 (Jiao Licheng)
Current Assignee
Xian University of Electronic Science and Technology
Original Assignee
Xian University of Electronic Science and Technology
Priority date
Filing date
Publication date
Application filed by Xian University of Electronic Science and Technology filed Critical Xian University of Electronic Science and Technology
Priority to CN201911017077.9A
Publication of CN110781954A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G06F18/232 - Non-hierarchical techniques
    • G06F18/2321 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 - Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions, with fixed number of clusters, e.g. K-means clustering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a superpixel generation method for polarized SAR image classification, which mainly addresses two problems of the prior art: the low accuracy of the generated superpixels and the inability to adaptively adjust the proportion of to-be-determined pixels. The implementation scheme is as follows: 1) set the parameters and input a polarized SAR image; 2) initialize the cluster centers of the image and identify the overlapping and non-overlapping search areas; 3) iteratively update the cluster centers and the membership matrix of the pixels in the overlapping search area until the maximum number of iterations is reached or the change of the cluster centers between two iterations falls below a set threshold; 4) adaptively determine the proportion of to-be-determined pixels; 5) generate fuzzy superpixels from the membership matrix and the proportion of to-be-determined pixels. The method improves the accuracy of the generated superpixels, adaptively adjusts the proportion of to-be-determined pixels, and can be applied to polarized SAR image classification.

Description

Adaptive fuzzy superpixel generation method for polarized SAR image classification
Technical Field
The invention belongs to the technical field of remote sensing image processing, and in particular relates to an adaptive fuzzy superpixel generation method that can be used for classifying polarized SAR images.
Background
The purpose of superpixel methods is to divide the original image into small regions of pixels with similar characteristics, thereby preserving spatial neighborhood information; classifying the image on the basis of these regions is more efficient than classifying individual pixels directly. Superpixels are now widely used in computer vision, for example in image classification, image segmentation, image enhancement, foreground extraction, and visual target tracking.
With the development of polarimetric synthetic aperture radar (SAR), polarized SAR images are widely applied to land resource monitoring and damage assessment, and polarized SAR image classification has become an important research direction. Because superpixels reduce computational complexity while maintaining regional consistency, superpixel-based classification of polarized SAR images has also been widely studied.
Almost all superpixel generation methods are general-purpose and can serve as a preprocessing step for a variety of applications. This versatility is an advantage, but because the specific properties of the application are not taken into account when generating the superpixels, the result may not perform best in a particular scenario. As a preprocessing step for polarized SAR image classification, superpixel generation should therefore consider two main aspects: the characteristics of the polarized SAR image, i.e., its polarization scattering information, and the classification accuracy requirement.
Achanta et al., in "SLIC superpixels compared to state-of-the-art superpixel methods", propose the simple linear iterative clustering (SLIC) method, which uses a k-means clustering algorithm to generate superpixels whose number and compactness can be adjusted by parameters. As a general superpixel generation method, SLIC does not exploit the polarization scattering information in a polarized SAR image, so the accuracy of the superpixels it generates on polarized SAR images is limited.
Guo et al, in "Fuzzy superpixels for polar imaging scientific", propose a Fuzzy superpixel FS method, which proposes the concept of Fuzzy superpixels on the basis of the SLIC method, and Fuzzy superpixels divide some pixels into undetermined pixels and do not belong to specific superpixels, thus improving the proportion of pure superpixels. However, since the FS method only uses the color and spatial information of the polarized SAR image and does not use the polarization scattering information, the accuracy of the superpixel generated on the polarized SAR image also increases the space, and the FS method cannot adaptively adjust the ratio of the pixels to be determined, but needs to be manually adjusted for different images, and when the number of images to be processed increases, parameter adjustment becomes very inconvenient.
Disclosure of Invention
The purpose of the invention is to provide, in view of the deficiencies of the prior art, a superpixel generation method for polarized SAR image classification that improves superpixel classification accuracy and adaptively adjusts the proportion of to-be-determined pixels.
The technical idea of the invention is as follows: superpixels are generated from the polarization scattering information of the SAR image in addition to its color and spatial information, and the proportion of to-be-determined pixels among all pixels is determined adaptively from the correlation between pixels. The implementation comprises the following steps:
(1) setting an expected super-pixel number K, a threshold value E, a maximum iteration number I and a distance weight parameter mf;
(2) initializing a clustering center of the superpixels, and finding out non-overlapping search areas and overlapping search areas;
(3) generating a fuzzy superpixel, namely dividing the pixel into a superpixel part and a to-be-determined pixel part:
(3a) updating the corresponding cluster centers of the pixels in the overlapped search area as follows:
c_j = \frac{\sum_{i=1}^{n} u_{ij}^{m} x_i}{\sum_{i=1}^{n} u_{ij}^{m}}
wherein c_j denotes the cluster center corresponding to superpixel j, n is the total number of pixels in the overlapping search region, x_i denotes the feature of the i-th pixel in the overlapping search region, u_ij denotes the membership degree between pixel i and the cluster center corresponding to superpixel j, m is the regularization parameter of the membership degree, and i ranges from 1 to n;
(3b) updating the membership degree between the pixel i and the jth super-pixel cluster center in the overlapped search area as follows:
u_{ij} = \frac{1}{\sum_{k=1}^{c} \left( D_{ij} / D_{ik} \right)^{2/(m-1)}}
wherein c is the number of cluster centers whose search areas overlap at this region, D_ij is the distance from the i-th pixel to the j-th cluster center, D_ik is the distance from the i-th pixel to the k-th cluster center, m is the regularization parameter of the membership degree, and k ranges from 1 to c;
(3c) repeating steps (3a) and (3b) until the maximum number of iterations I is reached or the change of the cluster centers between two successive iterations is less than E;
(3d) calculating the fuzzy similarity relation matrix, whose entry r_ij denotes the fuzzy similarity value between the i-th pixel and the j-th pixel [formula given as an image in the original filing], wherein fea_t denotes the t-th channel of the SAR feature map and fea_t^max, fea_t^min denote the maximum and minimum values in the t-th channel; if r_ij < 0, set r_ij = 0;
(3e) calculating the correlation matrix E of the overlapping search area [formula given as an image in the original filing], wherein U is the matrix of membership degrees between all pixels in the overlapping search area and each cluster center, of size n × c;
(3f) calculating the difference degree F between the cluster centers [formula given as an image in the original filing], wherein E_tq is the element in the t-th row and q-th column of the correlation matrix E, and sum(diag(E)) denotes the sum of the diagonal elements of E;
(3g) determining the proportion of to-be-determined pixels in the overlapping search area according to the equation P = 0.5 × F, with P < 1;
(3h) generating fuzzy superpixels according to the membership matrix U and the adaptively determined proportion P of to-be-determined pixels: the largest element of each row of U is sorted in descending order, the higher-ranked pixels are assigned to the superpixel corresponding to the column in which their row maximum lies, and the n × P lowest-ranked pixels are marked as to-be-determined pixels, where n is the number of rows of U;
(4) taking each to-be-determined pixel as the center and counting the number of distinct superpixels within the M × M region around it: if the number of superpixels is greater than 1, all pixels in the region are marked as to-be-determined pixels; if the number of superpixels is equal to 1, the to-be-determined pixel is assigned to that superpixel.
Compared with the prior art, the invention has the following advantages:
First, the invention uses the color features, spatial features and polarization scattering features of the polarized SAR image to generate superpixels, and is therefore better suited to superpixel generation for polarized SAR images.
Second, the proportion of to-be-determined pixels in the overlapping search area is adjusted adaptively from the correlation between the pixels in that area, which reduces the burden of parameter tuning and improves the accuracy of the superpixel segmentation result.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a graph of simulation results for generating superpixels on a Flevoland image using the present invention;
FIG. 3 is a graph of simulation results for generating superpixels on an ESAR image using the present invention;
FIG. 4 is a graph of simulation results for generating superpixels on a San Francisco image using the present invention;
Detailed Description
Examples and effects of the present invention will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, the specific implementation steps of the present invention are further described in detail:
step 1, setting appropriate hyper-parameters according to data to be processed.
Acquire the polarized SAR image Flevoland of size 300 × 270, and set the expected number of superpixels K = 500, the weight mf of the fuzzy similarity relation to 0.6, the threshold E = 0.001, the membership regularization parameter m = 2, and the maximum number of iterations it = 500;
Step 2, initializing the cluster centers and finding the non-overlapping and overlapping search areas.
(2a) Selecting K cluster centers at intervals of S pixels on a regular grid according to the required number of superpixels, where S = \sqrt{n/K} and n is the total number of pixels in the image;
(2b) moving each cluster center to the lowest-gradient position in its 3 × 3 neighborhood, and setting the size of the search area for similar pixels to 2S × 2S for each cluster center;
(2c) forming the overlapping search area from pixels that lie in the search areas of two or more cluster centers, and the non-overlapping search area from pixels that lie in the search area of only one cluster center.
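A minimal NumPy sketch of this initialization, for illustration only (the function and variable names are not from the patent): cluster centers are placed on a regular grid with spacing S = sqrt(n/K), each center is moved to the lowest-gradient pixel in its 3 × 3 neighborhood, and the number of 2S × 2S search windows covering each pixel is counted so that the overlapping and non-overlapping search areas can be separated.

```python
import numpy as np

def initialize_centers(gray, K):
    """Grid-initialize roughly K cluster centers and find the overlapping search area.

    gray: 2-D intensity array, used only for the gradient-based 3x3 adjustment.
    Returns the center coordinates and a boolean mask of the overlapping search area.
    """
    H, W = gray.shape
    n = H * W
    S = int(np.sqrt(n / K))                       # grid spacing S = sqrt(n/K)

    # Candidate centers on a regular grid with spacing S.
    ys = np.arange(S // 2, H, S)
    xs = np.arange(S // 2, W, S)
    centers = [(y, x) for y in ys for x in xs]

    # Gradient magnitude for the "lowest gradient in a 3x3 neighborhood" move.
    gy, gx = np.gradient(gray.astype(float))
    grad = np.hypot(gy, gx)

    adjusted = []
    for y, x in centers:
        y0, y1 = max(y - 1, 0), min(y + 2, H)
        x0, x1 = max(x - 1, 0), min(x + 2, W)
        win = grad[y0:y1, x0:x1]
        dy, dx = np.unravel_index(np.argmin(win), win.shape)
        adjusted.append((y0 + dy, x0 + dx))

    # Count how many 2S x 2S search windows cover each pixel; pixels covered by
    # two or more windows form the overlapping search area.
    cover = np.zeros((H, W), dtype=int)
    for y, x in adjusted:
        cover[max(y - S, 0):min(y + S, H), max(x - S, 0):min(x + S, W)] += 1
    overlap_mask = cover >= 2

    return np.array(adjusted), overlap_mask
```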
Step 3, generating the fuzzy superpixels, i.e., dividing the pixels into a superpixel part and a to-be-determined part.
(3a) Updating the corresponding cluster centers of the pixels in the overlapped search area as follows:
c_j = \frac{\sum_{i=1}^{n} u_{ij}^{m} x_i}{\sum_{i=1}^{n} u_{ij}^{m}}
wherein c_j denotes the cluster center corresponding to superpixel j, n is the total number of pixels in the overlapping search region, x_i denotes the feature of the i-th pixel in the overlapping search region, u_ij denotes the membership degree between pixel i and the cluster center corresponding to superpixel j, m is the regularization parameter of the membership degree, i ranges from 1 to n, and j from 1 to c;
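Step (3a) is the fuzzy c-means centroid update. A short NumPy sketch is given below for illustration only; X and U are assumed to hold the features and membership degrees of the n pixels in the overlapping search area, and the function name is hypothetical.

```python
import numpy as np

def update_centers(X, U, m=2.0):
    """Fuzzy c-means centroid update: c_j = sum_i u_ij^m x_i / sum_i u_ij^m.

    X: (n, d) features of the pixels in the overlapping search area.
    U: (n, c) membership matrix.
    Returns the (c, d) matrix of updated cluster centers.
    """
    W = U ** m                                   # weights u_ij^m
    return (W.T @ X) / W.sum(axis=0)[:, None]    # one center per column of U
```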
(3b) updating the membership degree between the i-th pixel and the j-th superpixel cluster center in the overlapping search area:
u_{ij} = \frac{1}{\sum_{k=1}^{c} \left( D_{ij} / D_{ik} \right)^{2/(m-1)}}
wherein c is the number of cluster centers in the overlapping search area, D_ij is the distance from the i-th pixel to the j-th cluster center, D_ik is the distance from the i-th pixel to the k-th cluster center, m is the regularization parameter of the membership degree, and k ranges from 1 to c;
(3b1) respectively calculating the spatial distance ds_ij and the color feature distance dc_ij from the i-th pixel to the j-th pixel:
ds_{ij} = \sqrt{(sx_i - sx_j)^2 + (sy_i - sy_j)^2}
dc_{ij} = \sqrt{(l_i - l_j)^2 + (a_i - a_j)^2 + (b_i - b_j)^2}
wherein sx_i, sy_i denote the spatial features of the i-th pixel, sx_j, sy_j those of the j-th pixel, l_i, a_i, b_i denote the color features of the i-th pixel in the Lab color space, and l_j, a_j, b_j those of the j-th pixel;
(3b2) calculating, from the results of (3b1), the distance D_ij between the i-th pixel and the j-th superpixel cluster center [formula given as an image in the original filing], wherein dsc_ij = ds_ij + dc_ij is the sum of the spatial distance ds_ij and the color distance dc_ij, r_ij denotes the fuzzy similarity value between the i-th pixel and the j-th pixel calculated from the polarization scattering information, and mf is the weight of the fuzzy similarity term.
(3b3) calculating the distance D_ik from the i-th pixel to the k-th cluster center in the same way as D_ij;
(3b4) updating the membership degree between the i-th pixel and the j-th superpixel cluster center of the overlapping search area:
u_{ij} = \frac{1}{\sum_{k=1}^{c} \left( D_{ij} / D_{ik} \right)^{2/(m-1)}}
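Steps (3b1)-(3b4) combine spatial, color and polarimetric similarity into a distance and then apply the usual fuzzy c-means membership update. The sketch below is illustrative only: it uses dsc_ij = ds_ij + dc_ij as stated, but because the exact formula for D_ij is given only as an image in the filing, it folds in the fuzzy-similarity term through an assumed combination D = dsc + mf * (1 - r); both that combination and the input R_to_centers (the similarity of each pixel to the pixel at each cluster center) are assumptions, not the patented formula.

```python
import numpy as np

def membership_update(spatial, color, R_to_centers, centers_spatial,
                      centers_color, mf=0.6, m=2.0):
    """Membership update for the pixels in the overlapping search area.

    spatial        : (n, 2) pixel coordinates (sx, sy).
    color          : (n, 3) Lab colour features (l, a, b).
    R_to_centers   : (n, c) fuzzy-similarity values between each pixel and the
                     pixel at each cluster center (assumed input).
    centers_*      : (c, 2) and (c, 3) features of the c cluster centers.
    The combination D = (ds + dc) + mf * (1 - r) is an assumed placeholder;
    the patent gives the exact formula only as an image.
    """
    ds = np.linalg.norm(spatial[:, None, :] - centers_spatial[None, :, :], axis=2)
    dc = np.linalg.norm(color[:, None, :] - centers_color[None, :, :], axis=2)
    D = (ds + dc) + mf * (1.0 - R_to_centers)            # (n, c)

    # Standard FCM membership: u_ij = 1 / sum_k (D_ij / D_ik)^(2/(m-1)).
    D = np.maximum(D, 1e-12)                             # avoid division by zero
    ratio = (D[:, :, None] / D[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)                       # (n, c)
```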
(3c) repeating steps (3a) and (3b) until the maximum number of iterations it is reached or the change of the cluster centers between two successive iterations is less than the threshold E;
(3d) computing fuzzy similarity relation matrix
(3d1) Calculating the fuzzy similarity relation value of the ith pixel and the jth pixel:
[formula for r_ij given as an image in the original filing]
wherein fea_t denotes the t-th channel of the SAR feature map, fea_t^max and fea_t^min denote the maximum and minimum values in the t-th channel, and t ranges from 1 to 9; if r_ij < 0, set r_ij = 0;
(3d2) calculating the fuzzy similarity values between all n pixels in the overlapping search area to form the fuzzy similarity relation matrix R = [r_ij] of size n × n;
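Because the expression for r_ij appears only as an image in the filing, the sketch below realizes the description of (3d1)-(3d2) with one plausible choice: per-channel absolute differences normalized by the channel range, averaged over the nine channels, subtracted from 1, and clamped at zero. The exact normalization is an assumption.

```python
import numpy as np

def fuzzy_similarity_matrix(fea):
    """Fuzzy similarity R between all pixel pairs of the overlapping search area.

    fea: (n, 9) polarimetric feature channels of the n pixels.
    Returns an (n, n) matrix with r_ij in [0, 1]; negative values are set to 0,
    as required in step (3d1). The exact formula is an assumed reconstruction.
    """
    rng = fea.max(axis=0) - fea.min(axis=0)              # fea_t^max - fea_t^min
    rng = np.where(rng > 0, rng, 1.0)                    # guard constant channels
    diff = np.abs(fea[:, None, :] - fea[None, :, :]) / rng
    R = 1.0 - diff.mean(axis=2)                          # average over 9 channels
    return np.clip(R, 0.0, None)                         # r_ij < 0 -> 0
```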
(3e) calculating the correlation matrix E of the overlapping search area [formula given as an image in the original filing], wherein U is the membership matrix between all pixels in the overlapping search area and each cluster center, of size n × c, n is the number of pixels in the overlapping search area, c is the number of cluster centers in the overlapping search area, and U^T denotes the transpose of the matrix U;
(3f) calculating the difference degree F between the cluster centers [formula given as an image in the original filing], wherein E_tq is the element in the t-th row and q-th column of the correlation matrix E, and sum(diag(E)) denotes the sum of the diagonal elements of E;
(3g) determining the proportion of to-be-determined pixels in the overlapping search area according to the equation P = 0.5 × F, with P < 1;
(3h) generating fuzzy superpixels according to the membership matrix U and the adaptively determined proportion P of to-be-determined pixels: the largest element of each row of U is sorted in descending order, the higher-ranked pixels are assigned to the superpixel corresponding to the column in which their row maximum lies, and the n × P lowest-ranked pixels are marked as to-be-determined pixels, where n is the number of rows of U. A sketch of steps (3e)-(3h) is given below.
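Steps (3e)-(3h) turn the membership matrix U and the fuzzy similarity matrix R into the adaptive ratio P and the fuzzy assignment. The formulas for the correlation matrix E and the difference degree F are given only as images in the filing, so the sketch below assumes E = U^T R U and F = (sum of the off-diagonal entries of E) / (sum of its diagonal entries) purely for illustration; only P = 0.5 × F with P < 1 and the sort-and-split rule of (3h) follow the text directly.

```python
import numpy as np

def fuzzy_superpixel_labels(U, R):
    """Adaptive ratio P and fuzzy assignment for the overlapping search area.

    U: (n, c) membership matrix, R: (n, n) fuzzy similarity matrix.
    Returns a label vector of length n where -1 marks to-be-determined pixels.
    The expressions for E and F are assumptions; only P = 0.5 * F (P < 1) and
    the sort-and-split rule of step (3h) follow the text directly.
    """
    # (3e)/(3f): correlation between clusters and their difference degree
    # (assumed forms; the patent gives these formulas only as images).
    Ecorr = U.T @ R @ U                              # (c, c)
    off_diag = Ecorr.sum() - np.trace(Ecorr)
    F = off_diag / np.trace(Ecorr)

    # (3g): proportion of to-be-determined pixels, capped below 1.
    P = min(0.5 * F, 0.99)

    # (3h): sort the per-pixel maximum membership; the n*P pixels with the
    # smallest maxima stay undetermined, the rest join the superpixel given
    # by the column of their maximum membership.
    n = U.shape[0]
    best = U.argmax(axis=1)
    strength = U.max(axis=1)
    order = np.argsort(-strength)                    # descending
    labels = np.full(n, -1, dtype=int)
    keep = order[: int(np.round(n * (1.0 - P)))]
    labels[keep] = best[keep]
    return labels
```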
Step 4, carrying out category marking according to the number of superpixels in the neighborhood.
Take each to-be-determined pixel as the center and count the number of distinct superpixels in the M × M area around it:
if the number of superpixels is greater than 1, all pixels in the area are marked as to-be-determined pixels;
if the number of superpixels is equal to 1, the to-be-determined pixel is assigned to that superpixel;
if no superpixel is present in the area, no processing is performed.
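A sketch of this neighborhood check (function and variable names are illustrative, not from the patent): for each to-be-determined pixel, the distinct superpixel labels inside the M × M window centered on it are counted, and the window is marked undetermined, the pixel is absorbed, or nothing is done.

```python
import numpy as np

def relabel_undetermined(labels, M=3):
    """Step 4: re-examine each to-be-determined pixel in its MxM window.

    labels: (H, W) array; superpixel indices >= 0, -1 = to-be-determined.
    """
    H, W = labels.shape
    r = M // 2
    out = labels.copy()
    for y, x in zip(*np.where(labels == -1)):
        win = labels[max(y - r, 0): y + r + 1, max(x - r, 0): x + r + 1]
        sp = np.unique(win[win >= 0])                # superpixels in the window
        if sp.size > 1:
            out[max(y - r, 0): y + r + 1, max(x - r, 0): x + r + 1] = -1
        elif sp.size == 1:
            out[y, x] = sp[0]                        # absorb into that superpixel
        # sp.size == 0: leave the pixel unchanged, as in the text
    return out
```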
The effects of the present invention will be described in further detail below with reference to simulation experiments.
1. Simulation experiment conditions
The simulation uses three publicly available polarized SAR images: Flevoland, ESAR and San Francisco. The processor used for the simulation is an AMD Ryzen 5 2600, the platform is Windows 10, and the software is MATLAB 2018b.
The existing superpixel generation method used in simulation comprises the following steps: SLIC, LSC, USEAQ, LearnedS, FS.
2. Simulation experiment contents and analysis of results:
Simulation experiment 1: superpixels are generated on the polarized SAR image Flevoland with the present invention and with the existing superpixel generation methods, the superpixel results are then used for classification, and the classification accuracies are compared. The results when the number of generated superpixels is 500 are shown in FIG. 2, wherein:
FIG. 2(f) shows the result of superpixel generation on the Flevoland data using the method of the present invention,
FIG. 2(a) shows the result of superpixel generation on the Flevoland data using the SLIC method,
FIG. 2(b) shows the result of superpixel generation on the Flevoland data using the LSC method,
FIG. 2(c) shows the result of superpixel generation on the Flevoland data using the USEAQ method,
FIG. 2(d) shows the result of superpixel generation on the Flevoland data using the LearnedS method,
FIG. 2(e) shows the result of superpixel generation on the Flevoland data using the FS method.
As can be seen from FIG. 2, compared with the five existing methods, the superpixels generated by the present invention on the Flevoland data contain a higher proportion of superpixels in which all pixels belong to the same class, and their contours fit the class boundaries in the image more closely.
The classification accuracies obtained with the results of the six superpixel generation methods above on the Flevoland data are listed in Table I.
Table I. Comparison of classification accuracy on Flevoland using the results of different superpixel generation methods
As can be seen from Table I, the classification accuracy obtained with the superpixel results of the present method on the Flevoland data is higher than that of the existing SLIC, LSC, USEAQ, LearnedS and FS methods, and the accuracy fluctuates over a smaller range, i.e., it is more stable.
Simulation experiment 2: superpixels are generated on the polarized SAR image ESAR with the present invention and with the existing superpixel generation methods, the superpixel results are then used for classification, and the classification accuracies are compared. The results when the number of generated superpixels is 500 are shown in FIG. 3, wherein:
FIG. 3(f) is a diagram of the super-pixel generation results on ESAR data using the method of the present invention,
FIG. 3(a) is a diagram of the superpixel generation result on ESAR data using the SLIC method,
FIG. 3(b) shows the result of superpixel generation on ESAR data using the LSC method,
FIG. 3(c) shows the result of superpixel generation on ESAR data using the USEAQ method,
FIG. 3(d) shows the result of superpixel generation on ESAR data using the LearnedS method,
FIG. 3(e) shows the result of superpixel generation on ESAR data using the FS method.
As can be seen from FIG. 3, compared with the five existing methods, the superpixels generated by the present invention on the ESAR data contain a higher proportion of superpixels in which all pixels belong to the same class, and their contours fit the class boundaries in the image more closely.
The classification accuracies obtained with the superpixel results of the six methods above on the ESAR data are listed in Table II.
Table II. Comparison of classification accuracy on ESAR using the superpixel results of different methods
As can be seen from Table II, the classification accuracy obtained with the superpixel results of the present method on the ESAR data is higher than that of the existing SLIC, LSC, USEAQ, LearnedS and FS methods, and the accuracy fluctuates over a smaller range, i.e., it is more stable.
Simulation experiment 3: superpixels are generated on the polarized SAR image San Francisco with the present invention and with the existing superpixel generation methods, the superpixel results are then used for classification, and the classification accuracies are compared. The results when the number of generated superpixels is 500 are shown in FIG. 4, wherein:
FIG. 4(f) shows the super pixel generation results on San Francisco data using the method of the present invention,
FIG. 4(a) shows the result of superpixel generation on San Francisco data using the SLIC method,
FIG. 4(b) shows the super-pixel generation results on San Francisco data using the LSC method,
FIG. 4(c) shows the result of superpixel generation using the USEAQ method on San Francisco data,
FIG. 4(d) shows the result of superpixel generation on San Francisco data using the LearnedS method,
FIG. 4(e) shows the result of superpixel generation on San Francisco data using the FS method.
As can be seen from FIG. 4, compared with the five existing methods, the superpixels generated by the present method on the San Francisco data contain a higher proportion of superpixels in which all pixels belong to the same class, and their contours fit the class boundaries in the image more closely.
The classification accuracies obtained when the six methods above generate the same number of superpixels on the San Francisco data are listed in Table III.
Table III. Comparison of classification accuracy on San Francisco using the superpixel results of different methods
As can be seen from Table III, the classification accuracy obtained with the superpixel results of the present method on the San Francisco data is higher than that of the existing SLIC, LSC, USEAQ, LearnedS and FS methods, and the accuracy fluctuates over a smaller range, i.e., it is more stable.
In conclusion, the performance of the present method on the three polarized SAR images is better than that of the existing superpixel generation methods.

Claims (3)

1. A superpixel generation method for polarized SAR image classification is characterized by comprising the following steps:
(1) setting an expected super-pixel number K, a threshold value E, a maximum iteration number I and a distance weight parameter mf;
(2) initializing a clustering center of the superpixels, and finding out non-overlapping search areas and overlapping search areas;
(3) generating a fuzzy superpixel, namely dividing the pixel into a superpixel part and a to-be-determined pixel part:
(3a) updating the cluster centers corresponding to the pixels in the overlapping search area as follows:
c_j = \frac{\sum_{i=1}^{n} u_{ij}^{m} x_i}{\sum_{i=1}^{n} u_{ij}^{m}}
wherein c_j denotes the cluster center corresponding to superpixel j, n is the total number of pixels in the overlapping search region, x_i denotes the feature of the i-th pixel in the overlapping search region, u_ij denotes the membership degree between pixel i and the cluster center corresponding to superpixel j, m is the regularization parameter of the membership degree, and i ranges from 1 to n;
(3b) updating the membership degree between the ith pixel and the jth super-pixel cluster center in the overlapped search area as follows:
u_{ij} = \frac{1}{\sum_{k=1}^{c} \left( D_{ij} / D_{ik} \right)^{2/(m-1)}}
wherein c is the number of cluster centers whose search areas overlap at this region, D_ij is the distance from the i-th pixel to the j-th cluster center, D_ik is the distance from the i-th pixel to the k-th cluster center, m is the regularization parameter of the membership degree, and k ranges from 1 to c;
(3c) repeating steps (3a) and (3b) until the maximum number of iterations I is reached or the change of the cluster centers between two successive iterations is less than E;
(3d) calculating the fuzzy similarity relation matrix, whose entry r_ij denotes the fuzzy similarity value between the i-th pixel and the j-th pixel [formula given as an image in the original filing], wherein fea_t denotes the t-th channel of the SAR feature map and fea_t^max, fea_t^min denote the maximum and minimum values in the t-th channel; if r_ij < 0, set r_ij = 0;
(3e) calculating the correlation matrix E of the overlapping search area [formula given as an image in the original filing], wherein U is the matrix of membership degrees between all pixels in the overlapping search area and each cluster center, of size n × c;
(3f) calculating the difference degree F between the cluster centers [formula given as an image in the original filing], wherein E_tq is the element in the t-th row and q-th column of the correlation matrix E, and sum(diag(E)) denotes the sum of the diagonal elements of E;
(3g) determining the proportion of to-be-determined pixels in the overlapping search area according to the equation P = 0.5 × F, with P < 1;
(3h) generating fuzzy superpixels according to the membership matrix U and the adaptively determined proportion P of to-be-determined pixels: the largest element of each row of U is sorted in descending order, the higher-ranked pixels are assigned to the superpixel corresponding to the column in which their row maximum lies, and the n × P lowest-ranked pixels are marked as to-be-determined pixels, where n is the number of rows of U;
(4) taking each to-be-determined pixel as the center and counting the number of distinct superpixels within the M × M region around it: if the number of superpixels is greater than 1, all pixels in the region are marked as to-be-determined pixels; if the number of superpixels is equal to 1, the to-be-determined pixel is assigned to that superpixel.
2. The method of claim 1, wherein the cluster centers of superpixels are initialized in (2) and non-overlapping search regions and overlapping search regions are found as follows:
(2a) selecting K cluster centers at intervals of S pixels on a regular grid according to the required number of superpixels, where
S = \sqrt{n/K}
and n is the total number of pixels in the image;
(2b) moving each cluster center to the lowest-gradient position in its 3 × 3 neighborhood, and setting the size of the search area for similar pixels to 2S × 2S for each cluster center;
(2c) forming the overlapping search area from pixels that lie in the search areas of two or more cluster centers, and the non-overlapping search area from pixels that lie in the search area of only one cluster center.
3. The method of claim 1, wherein the distance D_ij in (3b) is calculated as follows:
(3b1) calculating the spatial distance ds_ij and the color feature distance dc_ij from the i-th pixel to the j-th pixel:
ds_{ij} = \sqrt{(sx_i - sx_j)^2 + (sy_i - sy_j)^2}
dc_{ij} = \sqrt{(l_i - l_j)^2 + (a_i - a_j)^2 + (b_i - b_j)^2}
wherein sx_i, sy_i denote the spatial features of the i-th pixel, sx_j, sy_j those of the j-th pixel, l_i, a_i, b_i denote the color features of the i-th pixel in the Lab color space, and l_j, a_j, b_j those of the j-th pixel;
(3b2) calculating, from the results of (3b1), the distance D_ij between the i-th pixel and the j-th superpixel cluster center [formula given as an image in the original filing], wherein dsc_ij = ds_ij + dc_ij is the sum of the spatial distance ds_ij and the color distance dc_ij, r_ij denotes the fuzzy similarity value between the i-th pixel and the j-th pixel calculated from the polarization scattering information, and mf is the weight of the fuzzy similarity term.
CN201911017077.9A 2019-10-24 2019-10-24 Adaptive fuzzy superpixel generation method for polarized SAR image classification Pending CN110781954A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911017077.9A CN110781954A (en) 2019-10-24 2019-10-24 Adaptive fuzzy superpixel generation method for polarized SAR image classification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911017077.9A CN110781954A (en) 2019-10-24 2019-10-24 Adaptive fuzzy superpixel generation method for polarized SAR image classification

Publications (1)

Publication Number Publication Date
CN110781954A true CN110781954A (en) 2020-02-11

Family

ID=69387381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911017077.9A Pending CN110781954A (en) 2019-10-24 2019-10-24 Adaptive fuzzy superpixel generation method for polarized SAR image classification

Country Status (1)

Country Link
CN (1) CN110781954A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413316A (en) * 2013-08-24 2013-11-27 西安电子科技大学 SAR image segmentation method based on superpixels and optimizing strategy
CN103824300A (en) * 2014-03-12 2014-05-28 西安电子科技大学 SAR (synthetic aperture radar) image segmentation method based on spatial correlation feature ultra-pixel block

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103413316A (en) * 2013-08-24 2013-11-27 西安电子科技大学 SAR image segmentation method based on superpixels and optimizing strategy
CN103824300A (en) * 2014-03-12 2014-05-28 西安电子科技大学 SAR (synthetic aperture radar) image segmentation method based on spatial correlation feature ultra-pixel block

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Y. Guo et al.: "Fuzzy Superpixels for Polarimetric SAR Images Classification", IEEE Transactions on Fuzzy Systems *
Guo Yuwei: "Image Classification and Recognition Based on Superpixel Representation and Fuzzy Feature Learning", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *

Similar Documents

Publication Publication Date Title
Zhang et al. Exploiting clustering manifold structure for hyperspectral imagery super-resolution
Teodoro et al. A convergent image fusion algorithm using scene-adapted Gaussian-mixture-based denoising
Yuan et al. Factorization-based texture segmentation
Arun et al. CNN-based super-resolution of hyperspectral images
WO2022199583A1 (en) Image processing method and apparatus, computer device, and storage medium
Lin et al. Hyperspectral image denoising via matrix factorization and deep prior regularization
CN107146228B (en) A kind of super voxel generation method of brain magnetic resonance image based on priori knowledge
CN111160407B (en) Deep learning target detection method and system
US9449395B2 (en) Methods and systems for image matting and foreground estimation based on hierarchical graphs
Khan et al. A customized Gabor filter for unsupervised color image segmentation
JP2017199235A (en) Focus correction processing method by learning type algorithm
CN113344103B (en) Hyperspectral remote sensing image ground object classification method based on hypergraph convolution neural network
Ying-Hua et al. Polsar image segmentation by mean shift clustering in the tensor space
CN109190511A (en) Hyperspectral classification method based on part Yu structural constraint low-rank representation
CN107423771B (en) Two-time-phase remote sensing image change detection method
CN113838104B (en) Registration method based on multispectral and multimodal image consistency enhancement network
US8208731B2 (en) Image descriptor quantization
Zhao et al. Asymmetric bidirectional fusion network for remote sensing pansharpening
CN114049491A (en) Fingerprint segmentation model training method, fingerprint segmentation device, fingerprint segmentation equipment and fingerprint segmentation medium
CN113627481A (en) Multi-model combined unmanned aerial vehicle garbage classification method for smart gardens
Krupiński et al. Binarization of degraded document images with generalized Gaussian distribution
Nair et al. Hyperspectral image fusion using fast high-dimensional denoising
CN110781954A (en) Adaptive fuzzy superpixel generation method for polarized SAR image classification
Fan et al. Infrared image enhancement based on saliency weight with adaptive threshold
CN112465837B (en) Image segmentation method for sparse subspace fuzzy clustering by utilizing spatial information constraint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200211