CN112329818A - Hyperspectral image unsupervised classification method based on graph convolution network embedded representation - Google Patents

Hyperspectral image unsupervised classification method based on graph convolution network embedded representation

Info

Publication number
CN112329818A
Authority
CN
China
Prior art keywords
representation
vertex
pixel points
hyperspectral
graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011124146.9A
Other languages
Chinese (zh)
Other versions
CN112329818B (en)
Inventor
孙玉宝
陈逸
周旺平
闫培新
雷铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Information Science and Technology
Original Assignee
Nanjing University of Information Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Information Science and Technology filed Critical Nanjing University of Information Science and Technology
Priority to CN202011124146.9A priority Critical patent/CN112329818B/en
Publication of CN112329818A publication Critical patent/CN112329818A/en
Application granted granted Critical
Publication of CN112329818B publication Critical patent/CN112329818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F18/23213 Non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions, with a fixed number of clusters, e.g. K-means clustering
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/28 Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
    • G06N3/045 Neural network architectures; combinations of networks
    • G06N3/08 Neural network learning methods
    • G06V10/267 Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a hyperspectral image unsupervised classification method based on graph convolution network embedded representation, which comprises the following steps: performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain joint spatial-spectral features; performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image; solving the elastic net representation of the superpixel points, and constructing a graph model of the superpixel points by taking the superpixel points associated with the non-zero components of the elastic net representation coefficients as neighbors of the current point; performing graph convolution network embedded representation learning based on the graph model, and obtaining low-dimensional features through hierarchical vertex aggregation operations; and, based on the low-dimensional feature representation, realizing unsupervised classification of the hyperspectral image with the K-means algorithm, so that accurate classification of hyperspectral images can be achieved.

Description

Hyperspectral image unsupervised classification method based on graph convolution network embedded representation
Technical Field
The invention relates to the technical field of image processing, in particular to a hyperspectral image unsupervised classification method based on graph convolution network embedded representation.
Background
In the 1980s, hyperspectral remote sensing technology began to emerge and developed rapidly, bringing a qualitative leap in the human ability to observe and understand surface objects. Hyperspectral remote sensing acquires the spatial image of the observed ground objects while simultaneously capturing the corresponding spectral information, so a hyperspectral image is presented as a three-dimensional data cube, realizing for the first time truly integrated imaging of image and spectrum. The spectral bands of each pixel form a spectral curve that contains rich information about surface material composition and can be used to identify different types of ground objects. Hyperspectral image classification has therefore become a popular research topic in the field of hyperspectral remote sensing. Owing to the characteristics of hyperspectral images, their classification also faces certain challenges.
In terms of classifier learning, hyperspectral image classification includes supervised, unsupervised and semi-supervised models, the difference being whether labeled data samples are used in the model training stage. For hyperspectral images, obtaining enough labeled samples is costly. Unsupervised methods do not depend on label information: the classifier only needs to mine the intrinsic properties and regularities of the unlabeled data and divide the uncalibrated samples into different clusters according to the differences between pixels, without model training and without any requirement on the number of training samples. Unsupervised classification models have therefore become an attractive alternative for hyperspectral image classification. Typical unsupervised classification algorithms include the K-means clustering algorithm, the FCM algorithm, spectral clustering and the like; the K-means algorithm iteratively updates the class centers to minimize the sum of squared distances from pixels to their class centers, and is a simple, widely used and effective algorithm.
Researchers have made many attempts to find effective representations for unsupervised classification of hyperspectral images. Early methods usually use the spectral information directly, which makes it difficult to obtain robust classification results. After the importance of spatial information for hyperspectral classification was recognized, Pesaresi and Benediktsson adopted morphological transformations to construct morphological profile features for extracting spatial structure information. Considering the integrated spatial-spectral nature of hyperspectral images, Fauvel and Chanussot combined the extended morphological profile (EMP) features with the spectral features to obtain a joint spatial-spectral feature representation and then classified with a support vector machine model, improving the classification performance of the algorithm.
Because the dimensionality of the joint spatial-spectral features is very high, dimensionality reduction must be applied to them; conventional dimensionality reduction methods include PCA and the like. However, a classical graph model cannot process large hyperspectral images because it relies on an SVD decomposition of the graph Laplacian matrix; moreover, it is a shallow learning model, which is not conducive to extracting the intrinsic low-dimensional features. A graph convolution network, by contrast, operates directly on a graph and is applicable to non-Euclidean, irregular graph-structured data; it makes full use of the image features and flexibly preserves class boundaries.
Disclosure of Invention
To address the above problems, the invention provides a hyperspectral image unsupervised classification method based on graph convolution network embedded representation.
To achieve this aim, the invention provides a hyperspectral image unsupervised classification method based on graph convolution network embedded representation, comprising the following steps:
S10, performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain joint spatial-spectral features;
S20, performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image;
S30, solving the elastic net representation of the superpixel points, and constructing a graph model of the superpixel points by taking the superpixel points associated with the non-zero components of the elastic net representation coefficients as neighbors of the current point;
S40, performing graph convolution network embedded representation learning based on the graph model, and obtaining low-dimensional features through hierarchical vertex aggregation operations;
S50, based on the low-dimensional feature representation, realizing unsupervised classification of the hyperspectral image using the K-means algorithm.
In one embodiment, performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain the joint spatial-spectral features includes concatenating the EMP features and the spectral features:

$$V = \begin{bmatrix} \mathrm{EMP} \\ X \end{bmatrix} \in \mathbb{R}^{(m n + d) \times N},$$

where $V$ denotes the joint spatial-spectral feature matrix, $X$ the spectral feature matrix, $\mathrm{EMP}$ the EMP feature matrix, $m$ the number of principal components, $n$ the number of circular structuring elements with different radii, $d$ the number of spectral bands, and $N$ the number of samples.
In one embodiment, performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image uses the distance measure

$$D_{i,c} = (1-\lambda)\, D_{spectral} + \lambda\, D_{spatial},$$

$$D_{spectral} = \tan\big(\mathrm{SAD}(x_i, x_c)\big),$$

$$D_{spatial} = \frac{\sqrt{(ix - cx)^2 + (iy - cy)^2}}{r},$$

where $D_{spectral}$ is the inter-spectral distance measured through the tangent function, $\mathrm{SAD}(x_i, x_c)$ is the spectral angle distance, $D_{spatial}$ is the normalized spatial Euclidean distance, $r$ is the diagonal length of the search neighborhood and constrains the search range to a local neighborhood around each cluster center, $D_{i,c}$ is the weighted sum of the spectral and spatial distances, $\lambda$ is the parameter balancing the spectral and spatial distances, $x_c = (x_{c1}, x_{c2}, \ldots, x_{cn})$ denotes the cluster center with spatial coordinates $(cx, cy)$, and $x_i = (x_{i1}, x_{i2}, \ldots, x_{il})$ denotes a pixel in the local neighborhood of the cluster center $x_c$ with spatial coordinates $(ix, iy)$. The patent suggests setting the parameter $\lambda$ to less than 0.5.
In one embodiment, solving the elastic net representation of the superpixel points and constructing the graph model of the superpixel points, with the superpixel points associated with the non-zero components of the representation coefficients taken as the neighbors of the current point, includes:
S31, for each superpixel point, selecting all the other superpixel points to construct a dictionary, and finding the elastic net representation of all superpixel points in the data set by solving the following constrained optimization problem:
$$\min_{SC,\,E}\; \|SC\|_1 + \lambda \|SC\|_F^2 + \gamma \|E\|_F^2 \quad \text{s.t.}\;\; sx_i = SD_i\, sc_i + e_i,$$
where $SD_i$ is the dictionary formed by all the other superpixel points, $sc_i$ is the representation coefficient of superpixel $sx_i$ over the dictionary $SD_i$, $SC = [sc_1, sc_2, \ldots, sc_N]$ is the coefficient matrix, $E$ is the representation error matrix, $\lambda$ and $\gamma$ are regularization parameters, and $e_i$ is the error vector;
S32, constructing the elastic net graph model of the superpixel points from the elastic net sparse representation coefficients of each sample point: from the coefficient matrix SC, define
$$W = \tfrac{1}{2}\big(|SC| + |SC|^{T}\big)$$
as the adjacency matrix of the graph model, and establish edge connections between the superpixel points to obtain the graph model of the superpixel points.
In one embodiment, performing graph convolution network embedded representation learning based on the graph model and obtaining low-dimensional features through hierarchical vertex aggregation operations comprises the following steps:
S41, sampling each vertex of the graph model layer by layer: GraphSAGE randomly samples a fixed number of neighborhood vertices at each layer of the graph model by random walks, and vertices that are not sampled are approximated by their historical representations;
S42, GraphSAGE updates the information of each vertex by aggregating the information of its neighborhood vertices: each aggregation step first aggregates the features of the neighbor vertices obtained at the previous layer, then combines them with the vertex's own features from the previous layer to obtain the embedded features of the current layer; the aggregation is repeated K times to obtain the final embedded features of the vertex, where the vertex features of the initial layer are the input sample features;
S43, defining the graph convolution:

$$h_v^{k} = \sigma\Big(W^{k} \cdot \mathrm{CONCAT}\big(h_v^{k-1},\; \mathrm{AGG}_k(\{h_u^{k-1}, \forall u \in N(v)\})\big)\Big),$$

where $\sigma$ is a nonlinear activation function, $W^{k}$ is the weight matrix to be learned, used for information propagation between different layers of the model, $h_v^{k-1}$ denotes the embedded feature of vertex $v$ obtained at the previous layer, the final low-dimensional feature obtained at layer $K$ is denoted $z_v = h_v^{K}$, $N(v)$ denotes the set of neighborhood vertices of vertex $v$, and $\mathrm{CONCAT}(\cdot)$ denotes the concatenation of the two matrices;
S44, designing the loss function:

$$J(z_u) = -\log\big(\sigma(z_u^{T} z_v)\big) - Q \cdot \mathbb{E}_{v_n \sim P_n(v)}\big[\log\big(\sigma(-z_u^{T} z_{v_n})\big)\big],$$

where $z_u$ denotes the final embedded feature representation of any vertex $u$ in the graph model, the superscript $T$ denotes transposition, $v$ denotes a vertex that co-occurs with vertex $u$ on a fixed-length random walk, $P_n$ is the negative sampling distribution, $Q$ is the number of negative samples, and $z_v$ denotes the final low-dimensional feature obtained at layer $K$.
In one embodiment, realizing unsupervised classification of the hyperspectral image from the low-dimensional feature representation using the K-means algorithm comprises the following steps:
S51, clustering the low-dimensional features of the superpixel points using the K-means algorithm to obtain the label matrix of the superpixel points;
S52, restoring the superpixel points to the original pixel points, and optimally matching the clustering result to the true categories with the Hungarian algorithm, thereby realizing unsupervised classification of the hyperspectral image.
According to the hyperspectral image unsupervised classification method based on graph convolution network embedded representation, the spatial features and the spectral features are combined to form joint spatial-spectral features and superpixel segmentation is performed; elastic net decomposition is then carried out on each superpixel point and a superpixel graph model is constructed, which reduces the subsequent computational complexity; the embedding method of the graph convolution network in deep learning is used, and unsupervised classification learning is performed on the superior embedded representation of the graph convolution network, achieving accurate classification of hyperspectral images.
Drawings
FIG. 1 is a flowchart of a hyperspectral image unsupervised classification method based on graph convolution network embedded representation according to an embodiment;
FIG. 2 is a diagram illustrating simulation results according to an embodiment;
FIG. 3 is a diagram illustrating simulation results according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to FIG. 1, FIG. 1 is a flowchart of an embodiment of the hyperspectral image unsupervised classification method based on graph convolution network embedded representation. The method introduces the superpixel idea, performing elastic net decomposition on each superpixel point and constructing a graph model of the superpixels; it then introduces the graph convolution network, which processes the graph model, learns the characteristics of the graph vertices and their neighborhoods well, and learns a better embedded representation; unsupervised classification is finally performed on this embedded representation, yielding a better classification result. The method specifically includes the following steps:
S10, performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain joint spatial-spectral features.
S20, performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image.
S30, solving the elastic net representation of the superpixel points, and constructing a graph model of the superpixel points by taking the superpixel points associated with the non-zero components of the elastic net representation coefficients as neighbors of the current point.
S40, performing graph convolution network embedded representation learning based on the graph model, and obtaining low-dimensional features through hierarchical vertex aggregation operations.
S50, based on the low-dimensional feature representation, realizing unsupervised classification of the hyperspectral image using the K-means algorithm.
According to the hyperspectral image unsupervised classification method based on graph convolution network embedded representation, the spatial features and the spectral features are combined to form joint spatial-spectral features and superpixel segmentation is performed; elastic net decomposition is then carried out on each superpixel point and a superpixel graph model is constructed, which reduces the subsequent computational complexity; the embedding method of the graph convolution network in deep learning is used, and unsupervised classification learning is performed on the superior embedded representation of the graph convolution network, achieving accurate classification of hyperspectral images.
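For illustration only, the five steps can be strung together as in the following Python sketch; the helper functions extract_emp_spectral_features, superpixel_segment, elastic_net_graph and graphsage_embed are hypothetical placeholders for the operations detailed in the embodiments below, not the inventors' implementation.

```python
# Illustrative end-to-end pipeline for steps S10-S50 (hypothetical helpers,
# not the patented implementation); each call corresponds to one step.
import numpy as np
from sklearn.cluster import KMeans

def classify_hsi_unsupervised(cube, n_classes, n_superpixels=500):
    # S10: joint spatial-spectral features V from EMP + spectral extraction
    V = extract_emp_spectral_features(cube)                    # hypothetical helper
    # S20: superpixel segmentation of V; returns per-superpixel features and
    # the superpixel index of every pixel
    sp_feats, pixel_to_sp = superpixel_segment(V, cube.shape[:2],
                                               n_superpixels, lam=0.3)  # hypothetical
    # S30: elastic-net representation -> superpixel adjacency matrix W
    W = elastic_net_graph(sp_feats)                            # hypothetical helper
    # S40: GraphSAGE-style embedding of the superpixel graph (K = 2 layers)
    Z = graphsage_embed(sp_feats, W, n_layers=2)               # hypothetical helper
    # S50: K-means on the embeddings, then map superpixel labels back to pixels
    sp_clusters = KMeans(n_clusters=n_classes, n_init=10).fit_predict(Z)
    return sp_clusters[pixel_to_sp]                            # pixel-level label map
```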
In one embodiment, performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain the joint spatial-spectral features includes concatenating the EMP features and the spectral features:

$$V = \begin{bmatrix} \mathrm{EMP} \\ X \end{bmatrix} \in \mathbb{R}^{(m n + d) \times N},$$

where $V$ denotes the joint spatial-spectral feature matrix, $X$ the spectral feature matrix, $\mathrm{EMP}$ the EMP feature matrix, $m$ the number of principal components, $n$ the number of circular structuring elements with different radii, $d$ the number of spectral bands, and $N$ the number of samples.
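A minimal sketch of this stacking step, assuming the EMP features and the spectral matrix have already been computed with the shapes given above (the variable names are illustrative):

```python
# Form the joint spatial-spectral matrix V by stacking the EMP feature matrix
# (shape (m*n, N)) on top of the spectral feature matrix X (shape (d, N)).
import numpy as np

def joint_features(emp_features: np.ndarray, spectra: np.ndarray) -> np.ndarray:
    assert emp_features.shape[1] == spectra.shape[1], "both need N columns (samples)"
    return np.vstack([emp_features, spectra])        # V has shape (m*n + d, N)

# Stand-in data: m = 4 principal components, n = 3 structuring elements,
# d = 200 bands, N = 1000 pixels.
V = joint_features(np.random.rand(4 * 3, 1000), np.random.rand(200, 1000))
print(V.shape)                                       # (212, 1000)
```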
In one embodiment, performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image uses the distance measure

$$D_{i,c} = (1-\lambda)\, D_{spectral} + \lambda\, D_{spatial},$$

$$D_{spectral} = \tan\big(\mathrm{SAD}(x_i, x_c)\big),$$

$$D_{spatial} = \frac{\sqrt{(ix - cx)^2 + (iy - cy)^2}}{r},$$

where $D_{spectral}$ is the inter-spectral distance measured through the tangent function, $\mathrm{SAD}(x_i, x_c)$ is the spectral angle distance, $D_{spatial}$ is the normalized spatial Euclidean distance, $r$ is the diagonal length of the search neighborhood and constrains the search range to a local neighborhood around each cluster center, $D_{i,c}$ is the weighted sum of the spectral and spatial distances, $\lambda$ is the parameter balancing the spectral and spatial distances, $x_c = (x_{c1}, x_{c2}, \ldots, x_{cn})$ denotes the cluster center with spatial coordinates $(cx, cy)$, and $x_i = (x_{i1}, x_{i2}, \ldots, x_{il})$ denotes a pixel in the local neighborhood of the cluster center $x_c$ with spatial coordinates $(ix, iy)$. Specifically, the parameter $\lambda$ is set to less than 0.5.
In this embodiment, the similarity criterion used in the superpixel segmentation is the weighted distance defined above: $D_{spectral}$ measures the spectral angle distance (SAD) through the tangent function, with $\mathrm{SAD}(x_i, x_c)$ taking values in $[0, \pi]$; $D_{spatial}$ is the normalized spatial Euclidean distance, with $r$ the diagonal length of the search neighborhood restricting the search to a local neighborhood around each cluster center; and $D_{i,c}$ is the weighted sum of the spectral and spatial distances, with the parameter $\lambda$ balancing their influence. Here $x_c = (x_{c1}, x_{c2}, \ldots, x_{cn})$ is the cluster center with spatial coordinates $(cx, cy)$, and $x_i = (x_{i1}, x_{i2}, \ldots, x_{il})$ is a pixel in the local neighborhood of the cluster center $x_c$ with spatial coordinates $(ix, iy)$. This embodiment suggests setting the parameter $\lambda$ to less than 0.5.
Further, the cluster centers are initialized randomly, and each pixel is assigned to the cluster center with the shortest distance according to the distance between the cluster center and the pixels in its local neighborhood. The feature of each superpixel point is the mean of the joint spatial-spectral features of all pixels in the local neighborhood, and the label of each superpixel point is determined by the most frequent label among the pixels in the local neighborhood. The specific process is as follows: first, randomly select pixel points as cluster centers; assign the nearest pixel points to each center according to the computed distances between the center and its neighborhood pixels; then compute the mean of the points in each cluster to obtain new cluster centers; finally, iterate until the final centers are determined, and the pixel points belonging to each center form a superpixel point.
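As a small illustrative sketch (not the inventors' code), the distance used to assign a pixel to a candidate cluster center could be computed as follows, where lam stands for the balancing parameter λ:

```python
# Sketch of the superpixel similarity measure: spectral angle distance (SAD)
# passed through tan(), plus the spatial Euclidean distance normalized by the
# neighborhood diagonal r, combined with weight lam (< 0.5 as suggested).
import numpy as np

def sad(x_i, x_c):
    cos = np.dot(x_i, x_c) / (np.linalg.norm(x_i) * np.linalg.norm(x_c) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0))        # spectral angle in [0, pi]

def superpixel_distance(x_i, x_c, pos_i, pos_c, r, lam=0.3):
    d_spectral = np.tan(sad(x_i, x_c))               # tan-measured spectral distance
    d_spatial = np.linalg.norm(np.asarray(pos_i, float) - np.asarray(pos_c, float)) / r
    return (1 - lam) * d_spectral + lam * d_spatial
```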
In one embodiment, solving the elastic net representation of the superpixel points and constructing the graph model of the superpixel points, with the superpixel points associated with the non-zero components of the representation coefficients taken as the neighbors of the current point, includes:
S31, for each superpixel point, selecting all the other superpixel points to construct a dictionary, and finding the elastic net representation of all superpixel points in the data set by solving the following constrained optimization problem:
$$\min_{SC,\,E}\; \|SC\|_1 + \lambda \|SC\|_F^2 + \gamma \|E\|_F^2 \quad \text{s.t.}\;\; sx_i = SD_i\, sc_i + e_i,$$
where $SD_i$ is the dictionary formed by all the other superpixel points, $sc_i$ is the representation coefficient of superpixel $sx_i$ over the dictionary $SD_i$, $SC = [sc_1, sc_2, \ldots, sc_N]$ is the coefficient matrix, $E$ is the representation error matrix, $\lambda$ and $\gamma$ are regularization parameters, and $e_i$ is the error vector;
S32, constructing the elastic net graph model of the superpixel points from the elastic net sparse representation coefficients of each sample point: from the coefficient matrix SC, define
$$W = \tfrac{1}{2}\big(|SC| + |SC|^{T}\big)$$
as the adjacency matrix of the graph model, and establish edge connections between the superpixel points to obtain the graph model of the superpixel points.
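A hedged sketch of this construction in Python, using scikit-learn's ElasticNet as a stand-in solver for the constrained problem above (the mapping of alpha and l1_ratio to the patent's λ and γ is only indicative):

```python
# Build the superpixel graph: regress each superpixel on all the others with
# an elastic-net (l1 + l2) penalty and symmetrise the coefficient magnitudes
# into an adjacency matrix W.
import numpy as np
from sklearn.linear_model import ElasticNet

def elastic_net_graph(sp_feats: np.ndarray, alpha=1e-3, l1_ratio=0.7) -> np.ndarray:
    """sp_feats: (feature_dim, N) matrix whose columns are superpixel features."""
    _, n = sp_feats.shape
    coeffs = np.zeros((n, n))
    for i in range(n):
        dictionary = np.delete(sp_feats, i, axis=1)        # SD_i: all other superpixels
        model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio,
                           fit_intercept=False, max_iter=5000)
        model.fit(dictionary, sp_feats[:, i])              # sx_i ~ SD_i sc_i + e_i
        coeffs[i, np.arange(n) != i] = model.coef_         # sc_i (diagonal stays zero)
    return 0.5 * (np.abs(coeffs) + np.abs(coeffs).T)       # symmetric adjacency W
```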
In one embodiment, performing graph convolution network embedded representation learning based on the graph model and obtaining low-dimensional features through hierarchical vertex aggregation operations comprises the following steps:
S41, sampling each vertex of the graph model layer by layer (K layers): GraphSAGE (SAmple and aggreGatE) randomly samples a fixed number of neighborhood vertices at each layer of the graph model, and vertices that are not sampled are approximated by their historical representations;
S42, GraphSAGE updates the information of each vertex by aggregating the information of its neighborhood vertices: each aggregation step first aggregates the features of the neighbor vertices obtained at the previous layer, then combines them with the vertex's own features from the previous layer to obtain the embedded features of the current layer; the aggregation is repeated K times to obtain the final embedded features of the vertex, where the vertex features of the initial layer are the input sample features. The invention considers two aggregation modes: mean aggregation and pooling aggregation;
S43, defining the graph convolution:

$$h_v^{k} = \sigma\Big(W^{k} \cdot \mathrm{CONCAT}\big(h_v^{k-1},\; \mathrm{AGG}_k(\{h_u^{k-1}, \forall u \in N(v)\})\big)\Big),$$

where $\sigma$ is a nonlinear activation function, $W^{k}$ is the weight matrix to be learned, used for information propagation between different layers of the model, $h_v^{k-1}$ denotes the embedded feature of vertex $v$ obtained at the previous layer, the final low-dimensional feature obtained at layer $K$ is denoted $z_v = h_v^{K}$, $N(v)$ denotes the set of neighborhood vertices of vertex $v$, and $\mathrm{CONCAT}(\cdot)$ denotes the concatenation of the two matrices;
S44, designing the loss function and obtaining the low-dimensional features from the graph convolution and the loss function, where the loss function is:

$$J(z_u) = -\log\big(\sigma(z_u^{T} z_v)\big) - Q \cdot \mathbb{E}_{v_n \sim P_n(v)}\big[\log\big(\sigma(-z_u^{T} z_{v_n})\big)\big],$$

where $z_u$ denotes the final embedded feature representation of any vertex $u$ in the graph model, the superscript $T$ denotes transposition, $v$ denotes a vertex that co-occurs with vertex $u$ on a fixed-length random walk, $P_n$ is the negative sampling distribution, $Q$ is the number of negative samples, and $z_v$ denotes the final low-dimensional feature obtained at layer $K$.
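Purely as an illustration of the loss shape reconstructed above (the exact form used in the patent may differ), a NumPy evaluation for one positive pair and Q negative samples could look like this:

```python
# Unsupervised GraphSAGE-style loss: pull together a positive pair (u, v) that
# co-occurs on a random walk, push apart Q negative samples drawn from P_n.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def unsupervised_loss(z_u, z_v, z_negatives):
    """z_u, z_v: embedding vectors of a positive pair; z_negatives: (Q, dim)."""
    positive = -np.log(sigmoid(z_u @ z_v) + 1e-12)
    # sum over the Q negatives approximates Q * E[log sigmoid(-z_u^T z_vn)]
    negative = -np.sum(np.log(sigmoid(-z_negatives @ z_u) + 1e-12))
    return positive + negative
```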
In one example, where the vertex features of the initial layer are the input sample features, two aggregation approaches are tried: mean aggregation and pooling aggregation.
Mean aggregation takes the element-wise average of the previous-layer embeddings of the neighborhood vertices, defined as:

$$\mathrm{AGG}_k^{mean} = \mathrm{MEAN}\big(\{h_u^{k-1}, \forall u \in N(v)\}\big),$$

where $N(v)$ denotes the neighborhood of vertex $v$ and $h_u^{k-1}$ denotes the previous-layer embedded information of any vertex adjacent to vertex $v$.
In pooling aggregation, the vectors of all adjacent vertices share the same weights: each vector passes through a nonlinear fully connected layer and the results are then max-pooled element-wise to aggregate the neighborhood information, giving a more effective embedded feature. The formula is:

$$\mathrm{AGG}_k^{pool} = \max\big(\{\delta(W_{pool}\, h_u^{k} + b), \forall u \in N(v)\}\big),$$

where max is the element-wise maximum operator and $\delta$ is a nonlinear activation function. In principle, each neighborhood vertex could independently pass through a multilayer perceptron of arbitrary depth; here the perceptron can be regarded as a set of weights $W_{pool}$ computing the embedded feature of each neighborhood vertex.
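The following NumPy sketch (illustrative only; the learnable weights are stand-in arrays) shows how the two aggregators and the S43 update rule could be wired together:

```python
# NumPy sketches of the two aggregators and the S43 update. H is a
# (num_vertices, dim) array of previous-layer embeddings; neigh is the list of
# sampled neighbor indices of vertex v; W_k, W_pool, b are assumed learnable
# parameters (here plain arrays).
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def mean_aggregate(H, neigh):
    # element-wise average of the neighbors' previous-layer embeddings
    return H[list(neigh)].mean(axis=0)

def pool_aggregate(H, neigh, W_pool, b):
    # shared fully connected layer on each neighbor, then element-wise max pooling
    return relu(H[list(neigh)] @ W_pool.T + b).max(axis=0)

def graphsage_update(H, v, neigh, W_k, aggregate=mean_aggregate):
    # S43: h_v^k = sigma(W_k . CONCAT(h_v^{k-1}, AGG({h_u^{k-1}})))
    concat = np.concatenate([H[v], aggregate(H, neigh)])
    return relu(W_k @ concat)

# Example with random stand-in data: 6 vertices, 8-dim embeddings.
H = np.random.rand(6, 8)
W_k = np.random.rand(4, 16)            # maps the concatenated 16-dim vector to 4-dim
print(graphsage_update(H, v=0, neigh=[1, 2, 5], W_k=W_k).shape)   # (4,)
```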
In one embodiment, realizing unsupervised classification of the hyperspectral image from the low-dimensional feature representation using the K-means algorithm comprises the following steps:
S51, clustering the low-dimensional features of the superpixel points using the K-means algorithm to obtain the label matrix of the superpixel points;
S52, restoring the superpixel points to the original pixel points, and optimally matching the clustering result to the true categories with the Hungarian algorithm, thereby realizing unsupervised classification of the hyperspectral image.
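A hedged sketch of step S50 with scikit-learn's KMeans and SciPy's Hungarian solver (linear_sum_assignment); the label alignment shown here is assumed to be used for evaluation against a ground-truth map, as in the matching described above:

```python
# Cluster the superpixel embeddings, map labels back to pixels, and align the
# cluster ids with the true classes via the Hungarian algorithm.
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import linear_sum_assignment

def cluster_and_align(Z, pixel_to_sp, y_true, n_classes):
    """Z: (N_superpixels, dim) embeddings; pixel_to_sp: superpixel id per pixel."""
    sp_labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(Z)
    y_pred = sp_labels[pixel_to_sp]                      # restore to pixel level
    # Hungarian matching: maximize the overlap between clusters and true classes
    overlap = np.zeros((n_classes, n_classes), dtype=np.int64)
    for c in range(n_classes):
        for k in range(n_classes):
            overlap[c, k] = np.sum((y_pred == c) & (y_true == k))
    rows, cols = linear_sum_assignment(-overlap)         # minimize negative overlap
    mapping = dict(zip(rows, cols))
    return np.array([mapping[c] for c in y_pred])
```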
The hyperspectral image unsupervised classification method based on graph convolution network embedded representation uses the graph convolution network to establish an unsupervised classification model based on graph convolution network embedded representation. For feature representation, the model combines the spatial features and the spectral features into joint spatial-spectral features. Superpixel segmentation is performed so that elastic net decomposition is carried out on each superpixel point and a superpixel graph model is constructed, reducing the subsequent computational complexity. The embedding method of the graph convolution network in deep learning is then used, and unsupervised classification learning is performed on the superior embedded representation of the graph convolution network, achieving accurate classification of hyperspectral images.
In one embodiment, to verify the effect of the hyperspectral image unsupervised classification method based on graph convolution network embedded representation, a simulation experiment was performed. The test data sets Indian Pines-13 (IP-13) and S-Pavia University Scene (S-PUS) have sizes of 145 × 145 and 306 × 340, respectively. The superpixel segmentation parameter λ is set to 0.3, the number of sampling layers K of the graph convolution network GraphSAGE is set to 2, and the numbers of neighborhood samples are set to S1 = 25 and S2 = 10, i.e., 25 first-order neighbors and 10 second-order neighbors are sampled; 50 random walks with step length 5 are made for each vertex; word2vec-style negative sampling is used, with 20 negative samples per node. The batch size is set to 512, the number of epochs to 5, and the weight decay to 0.0001; the method is implemented on the TensorFlow platform with the Adam optimizer, and the dimensionality of the output embedded representation is C + 1, where C is the number of classes of the corresponding data set.
The evaluation of the experiment used both qualitative and quantitative analytical methods.
The comparison of the hyperspectral image classification effect of the method and each algorithm shows that the classification effect of the method is obviously superior to that of other algorithms for different hyperspectral image data sets.
For quantitative comparative analysis, OA, AA and the Kappa coefficient were used for evaluation, where OA is the overall accuracy (Overall Accuracy) of all sample classifications and AA is the average accuracy (Average Accuracy) over the classes; they are calculated as follows:
$$\mathrm{OA} = \frac{\sum_{i=1}^{c} m_{ii}}{N},$$

$$\mathrm{AA} = \frac{1}{c}\sum_{i=1}^{c} p_i, \qquad p_i = \frac{m_{ii}}{N_i},$$

$$\mathrm{Kappa} = \frac{\mathrm{OA} - P_e}{1 - P_e}, \qquad P_e = \frac{1}{N^{2}}\sum_{i=1}^{c} N_i\,\hat{N}_i,$$
where $c$ is the number of sample classes, $m_{ii}$ is the number of class-$i$ samples classified into class $i$, $N$ is the total number of samples, $p_i$ is the classification accuracy of class $i$, $N_i$ is the total number of class-$i$ samples, and $\hat{N}_i$ is the number of samples assigned to class $i$ by the classifier.
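The three scores can be computed from a confusion matrix, for example as in this sketch (the confusion-matrix convention, rows = true class and columns = predicted class, is an assumption):

```python
# OA, AA and Cohen's Kappa from a confusion matrix M, where M[i, j] counts
# class-i samples assigned to class j (so m_ii = M[i, i]).
import numpy as np

def oa_aa_kappa(M: np.ndarray):
    M = M.astype(float)
    N = M.sum()
    oa = np.trace(M) / N                              # overall accuracy
    per_class = np.diag(M) / M.sum(axis=1)            # p_i = m_ii / N_i
    aa = per_class.mean()                             # average accuracy
    p_e = (M.sum(axis=1) * M.sum(axis=0)).sum() / N**2
    kappa = (oa - p_e) / (1 - p_e)                    # Cohen's kappa
    return oa, aa, kappa
```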
After classification with this model, the method performs well on hyperspectral image classification: the embeddings learned by the graph convolution network provide better low-dimensional features, which improves the classification accuracy.
For the quantitative comparison, classification is carried out on the two hyperspectral image data sets, the classification result for each data set is compared with the ground truth, and the corresponding OA, AA and Kappa values are calculated. FIGS. 2 and 3 show the OA, AA and Kappa values of the proposed algorithm and other algorithms on the Indian Pines-13 (IP-13) and S-Pavia University Scene (S-PUS) data sets, respectively.
In summary, compared with conventional graph-construction methods, superpixel segmentation is introduced so that superpixels replace the many pixel points of a local neighborhood: redundant information is removed, block-wise graph construction is avoided, the number of vertices is greatly reduced, and the graph-construction complexity is lowered. In addition, the graph model is processed with a graph convolution network, providing a deep embedding method on the graph model that learns the characteristics of the graph vertices and their neighborhoods well and obtains a better embedded representation, which in turn improves the accuracy of the subsequent clustering. In terms of both classification accuracy and visual effect, the algorithm of this embodiment has clear advantages.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
It should be noted that the terms "first \ second \ third" referred to in the embodiments of the present application merely distinguish similar objects, and do not represent a specific ordering for the objects, and it should be understood that "first \ second \ third" may exchange a specific order or sequence when allowed. It should be understood that "first \ second \ third" distinct objects may be interchanged under appropriate circumstances such that the embodiments of the application described herein may be implemented in an order other than those illustrated or described herein.
The terms "comprising" and "having" and any variations thereof in the embodiments of the present application are intended to cover non-exclusive inclusions. For example, a process, method, apparatus, product, or device that comprises a list of steps or modules is not limited to the listed steps or modules but may alternatively include other steps or modules not listed or inherent to such process, method, product, or device.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (6)

1. A hyperspectral image unsupervised classification method based on graph convolution network embedded representation is characterized by comprising the following steps:
S10, performing EMP and spectral feature extraction on the hyperspectral image to be classified to obtain joint spatial-spectral features;
S20, performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image;
S30, solving the elastic net representation of the superpixel points, and constructing a graph model of the superpixel points by taking the superpixel points associated with the non-zero components of the elastic net representation coefficients as neighbors of the current point;
S40, performing graph convolution network embedded representation learning based on the graph model, and obtaining low-dimensional features through hierarchical vertex aggregation operations;
S50, based on the low-dimensional feature representation, realizing unsupervised classification of the hyperspectral image using the K-means algorithm.
2. The method for unsupervised classification of hyperspectral images based on graph convolution network embedded representation according to claim 1, wherein performing EMP and spectral feature extraction on the hyperspectral image to obtain the joint spatial-spectral features comprises concatenating the EMP features and the spectral features:

$$V = \begin{bmatrix} \mathrm{EMP} \\ X \end{bmatrix} \in \mathbb{R}^{(m n + d) \times N},$$

where $V$ denotes the joint spatial-spectral feature matrix, $X$ the spectral feature matrix, $\mathrm{EMP}$ the EMP feature matrix, $m$ the number of principal components, $n$ the number of circular structuring elements with different radii, $d$ the number of spectral bands, and $N$ the number of samples.
3. The hyperspectral image unsupervised classification method based on graph convolution network embedded representation according to claim 1, wherein performing superpixel segmentation on the joint spatial-spectral features to obtain the superpixel points of the hyperspectral image comprises using the distance measure

$$D_{i,c} = (1-\lambda)\, D_{spectral} + \lambda\, D_{spatial},$$

$$D_{spectral} = \tan\big(\mathrm{SAD}(x_i, x_c)\big),$$

$$D_{spatial} = \frac{\sqrt{(ix - cx)^2 + (iy - cy)^2}}{r},$$

where $D_{spectral}$ is the inter-spectral distance measured through the tangent function, $\mathrm{SAD}(x_i, x_c)$ is the spectral angle distance, $D_{spatial}$ is the normalized spatial Euclidean distance, $r$ is the diagonal length of the search neighborhood and constrains the search range to a local neighborhood around each cluster center, $D_{i,c}$ is the weighted sum of the spectral and spatial distances, $\lambda$ is the parameter balancing the spectral and spatial distances, $x_c = (x_{c1}, x_{c2}, \ldots, x_{cn})$ denotes the cluster center with spatial coordinates $(cx, cy)$, and $x_i = (x_{i1}, x_{i2}, \ldots, x_{il})$ denotes a pixel in the local neighborhood of the cluster center $x_c$ with spatial coordinates $(ix, iy)$.
4. The hyperspectral image unsupervised classification method based on graph convolution network embedded representation according to claim 1, wherein solving the elastic net representation of the superpixel points and constructing the graph model of the superpixel points, with the superpixel points associated with the non-zero components of the elastic net representation coefficients taken as the neighbors of the current point, comprises:
S31, for each superpixel point, selecting all the other superpixel points to construct a dictionary, and finding the elastic net representation of all superpixel points in the data set by solving the following constrained optimization problem:

$$\min_{SC,\,E}\; \|SC\|_1 + \lambda \|SC\|_F^2 + \gamma \|E\|_F^2 \quad \text{s.t.}\;\; sx_i = SD_i\, sc_i + e_i,$$

where $SD_i$ is the dictionary formed by all the other superpixel points, $sc_i$ is the representation coefficient of superpixel $sx_i$ over the dictionary $SD_i$, $SC = [sc_1, sc_2, \ldots, sc_N]$ is the coefficient matrix, $E$ is the representation error matrix, $\lambda$ and $\gamma$ are regularization parameters, and $e_i$ is the error vector;
S32, constructing the elastic net graph model of the superpixel points from the elastic net sparse representation coefficients of each sample point: from the coefficient matrix SC, define

$$W = \tfrac{1}{2}\big(|SC| + |SC|^{T}\big)$$

as the adjacency matrix of the graph model, and establish edge connections between the superpixel points to obtain the graph model of the superpixel points.
5. The method for unsupervised classification of hyperspectral images based on graph convolution network embedded representation according to claim 1, wherein performing graph convolution network embedded representation learning based on the graph model and obtaining low-dimensional features through hierarchical vertex aggregation operations comprises:
S41, sampling each vertex of the graph model layer by layer: GraphSAGE randomly samples a fixed number of neighborhood vertices at each layer of the graph model by random walks, and vertices that are not sampled are approximated by their historical representations;
S42, GraphSAGE updates the information of each vertex by aggregating the information of its neighborhood vertices: each aggregation step first aggregates the features of the neighbor vertices obtained at the previous layer, then combines them with the vertex's own features from the previous layer to obtain the embedded features of the current layer; the aggregation is repeated K times to obtain the final embedded features of the vertex, where the vertex features of the initial layer are the input sample features;
S43, defining the graph convolution:

$$h_v^{k} = \sigma\Big(W^{k} \cdot \mathrm{CONCAT}\big(h_v^{k-1},\; \mathrm{AGG}_k(\{h_u^{k-1}, \forall u \in N(v)\})\big)\Big),$$

where $\sigma$ is a nonlinear activation function, $W^{k}$ is the weight matrix to be learned, used for information propagation between different layers of the model, $h_v^{k-1}$ denotes the embedded feature of vertex $v$ obtained at the previous layer, the final low-dimensional feature obtained at layer $K$ is denoted $z_v = h_v^{K}$, $N(v)$ denotes the set of neighborhood vertices of vertex $v$, and $\mathrm{CONCAT}(\cdot)$ denotes the concatenation of the two matrices;
S44, designing the loss function:

$$J(z_u) = -\log\big(\sigma(z_u^{T} z_v)\big) - Q \cdot \mathbb{E}_{v_n \sim P_n(v)}\big[\log\big(\sigma(-z_u^{T} z_{v_n})\big)\big],$$

where $z_u$ denotes the final embedded feature representation of any vertex $u$ in the graph model, the superscript $T$ denotes transposition, $v$ denotes a vertex that co-occurs with vertex $u$ on a fixed-length random walk, $P_n$ is the negative sampling distribution, $Q$ is the number of negative samples, and $z_v$ denotes the final low-dimensional feature obtained at layer $K$.
6. The method for unsupervised classification of hyperspectral images based on graph convolution network embedded representation according to claim 1, wherein realizing unsupervised classification of the hyperspectral image from the low-dimensional feature representation using the K-means algorithm comprises:
S51, clustering the low-dimensional features of the superpixel points using the K-means algorithm to obtain the label matrix of the superpixel points;
S52, restoring the superpixel points to the original pixel points, and optimally matching the clustering result to the true categories with the Hungarian algorithm, thereby realizing unsupervised classification of the hyperspectral image.
CN202011124146.9A 2020-10-20 2020-10-20 Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization Active CN112329818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011124146.9A CN112329818B (en) 2020-10-20 2020-10-20 Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011124146.9A CN112329818B (en) 2020-10-20 2020-10-20 Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization

Publications (2)

Publication Number Publication Date
CN112329818A true CN112329818A (en) 2021-02-05
CN112329818B CN112329818B (en) 2023-07-07

Family

ID=74312064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011124146.9A Active CN112329818B (en) 2020-10-20 2020-10-20 Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization

Country Status (1)

Country Link
CN (1) CN112329818B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113298129A (en) * 2021-05-14 2021-08-24 西安理工大学 Polarized SAR image classification method based on superpixel and graph convolution network
CN113628289A (en) * 2021-07-21 2021-11-09 武汉大学 Hyperspectral image nonlinear unmixing method and system based on graph convolution self-encoder
CN114022786A (en) * 2021-12-10 2022-02-08 深圳大学 Hyperspectral image classification method based on graph-in-graph convolution network
CN117853769A (en) * 2024-01-08 2024-04-09 西安交通大学 Hyperspectral remote sensing image rapid clustering method based on multi-scale image fusion

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392454A (en) * 2014-12-03 2015-03-04 复旦大学 Merging method of membership scoring based on ground object categories under spatial-spectral combined classification frame for hyper-spectral remote sensing images
CN108009559A (en) * 2016-11-02 2018-05-08 哈尔滨工业大学 A kind of Hyperspectral data classification method based on empty spectrum united information
CN110111338A (en) * 2019-04-24 2019-08-09 广东技术师范大学 A kind of visual tracking method based on the segmentation of super-pixel time and space significance
CN110363236A (en) * 2019-06-29 2019-10-22 河南大学 The high spectrum image extreme learning machine clustering method of sky spectrum joint hypergraph insertion
CN110399909A (en) * 2019-07-08 2019-11-01 南京信息工程大学 A kind of hyperspectral image classification method based on label constraint elastic network(s) graph model
US20200019817A1 (en) * 2018-07-11 2020-01-16 Harbin Institute Of Technology Superpixel classification method based on semi-supervised k-svd and multiscale sparse representation
CN111126463A (en) * 2019-12-12 2020-05-08 武汉大学 Spectral image classification method and system based on local information constraint and sparse representation
US20200250428A1 (en) * 2019-02-04 2020-08-06 Farmers Edge Inc. Shadow and cloud masking for remote sensing images in agriculture applications using a multilayer perceptron
CN111695636A (en) * 2020-06-15 2020-09-22 北京师范大学 Hyperspectral image classification method based on graph neural network

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392454A (en) * 2014-12-03 2015-03-04 复旦大学 Merging method of membership scoring based on ground object categories under spatial-spectral combined classification frame for hyper-spectral remote sensing images
CN108009559A (en) * 2016-11-02 2018-05-08 哈尔滨工业大学 A kind of Hyperspectral data classification method based on empty spectrum united information
US20200019817A1 (en) * 2018-07-11 2020-01-16 Harbin Institute Of Technology Superpixel classification method based on semi-supervised k-svd and multiscale sparse representation
US20200250428A1 (en) * 2019-02-04 2020-08-06 Farmers Edge Inc. Shadow and cloud masking for remote sensing images in agriculture applications using a multilayer perceptron
CN110111338A (en) * 2019-04-24 2019-08-09 广东技术师范大学 A kind of visual tracking method based on the segmentation of super-pixel time and space significance
CN110363236A (en) * 2019-06-29 2019-10-22 河南大学 The high spectrum image extreme learning machine clustering method of sky spectrum joint hypergraph insertion
CN110399909A (en) * 2019-07-08 2019-11-01 南京信息工程大学 A kind of hyperspectral image classification method based on label constraint elastic network(s) graph model
CN111126463A (en) * 2019-12-12 2020-05-08 武汉大学 Spectral image classification method and system based on local information constraint and sparse representation
CN111695636A (en) * 2020-06-15 2020-09-22 北京师范大学 Hyperspectral image classification method based on graph neural network

Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
DONGQING LI et al.: "Spatial-spectral neighbour graph for dimensionality reduction of hyperspectral image classification", International Journal of Remote Sensing, pages 1-24
SUJUAN WANG et al.: "Spatial-Spectral Locality-Constrained Low-Rank Representation with Semi-Supervised Hypergraph Learning for Hyperspectral Image Classification", International Journal of Remote Sensing, vol. 38, no. 23, page 7374
丁福光: "Research on Semantic Image Segmentation Methods Based on Deep Structured Learning", China Master's Theses Full-text Database, Information Science and Technology, no. 02, pages 138-1308
付光远 et al.: "Spectral-Spatial Joint Classification of Hyperspectral Images Based on Convolutional Neural Networks", Science Technology and Engineering, vol. 17, no. 21, pages 268-274
刘胜男: "SAR Image Segmentation Based on Multiple Features and Improved Subspace Clustering", China Master's Theses Full-text Database, Information Science and Technology, no. 02, pages 136-1435
邓彬: "Research on Feature Transformation and Classification Algorithms for Hyperspectral Images Based on Superpixel Segmentation", China Master's Theses Full-text Database, Information Science and Technology, no. 07, pages 138-882
郭志民 et al.: "Multi-kernel Classification of Hyperspectral Images with Spectral-Spatial Graph Embedding", Journal of Chinese Computer Systems, vol. 39, no. 11, pages 2545-2550
陈逸: "Hyperspectral Image Classification Based on Graph Models", China Master's Theses Full-text Database, Engineering Science and Technology II, no. 02, pages 028-298
黄鸿 et al.: "Semi-supervised Multi-graph Embedding for Hyperspectral Image Feature Extraction", Optics and Precision Engineering, vol. 28, no. 02, pages 443-456

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113298129A (en) * 2021-05-14 2021-08-24 西安理工大学 Polarized SAR image classification method based on superpixel and graph convolution network
CN113298129B (en) * 2021-05-14 2024-02-02 西安理工大学 Polarized SAR image classification method based on superpixel and graph convolution network
CN113628289A (en) * 2021-07-21 2021-11-09 武汉大学 Hyperspectral image nonlinear unmixing method and system based on graph convolution self-encoder
CN113628289B (en) * 2021-07-21 2023-10-27 武汉大学 Hyperspectral image nonlinear unmixing method and system based on graph convolution self-encoder
CN114022786A (en) * 2021-12-10 2022-02-08 深圳大学 Hyperspectral image classification method based on graph-in-graph convolution network
CN117853769A (en) * 2024-01-08 2024-04-09 西安交通大学 Hyperspectral remote sensing image rapid clustering method based on multi-scale image fusion

Also Published As

Publication number Publication date
CN112329818B (en) 2023-07-07

Similar Documents

Publication Publication Date Title
CN108573276B (en) Change detection method based on high-resolution remote sensing image
CN110399909B (en) Hyperspectral image classification method based on label constraint elastic network graph model
Awad et al. Multicomponent image segmentation using a genetic algorithm and artificial neural network
CN112329818B (en) Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization
CN110728192B (en) High-resolution remote sensing image classification method based on novel characteristic pyramid depth network
CN107563422B (en) A kind of polarization SAR classification method based on semi-supervised convolutional neural networks
CN108682017B (en) Node2Vec algorithm-based super-pixel image edge detection method
CN110348399B (en) Hyperspectral intelligent classification method based on prototype learning mechanism and multidimensional residual error network
CN107590515B (en) Hyperspectral image classification method of self-encoder based on entropy rate superpixel segmentation
CN108052966B (en) Remote sensing image scene automatic extraction and classification method based on convolutional neural network
CN104751191B (en) A kind of Hyperspectral Image Classification method of sparse adaptive semi-supervised multiple manifold study
Almogdady et al. A flower recognition system based on image processing and neural networks
WO2020062360A1 (en) Image fusion classification method and apparatus
CN110110596B (en) Hyperspectral image feature extraction, classification model construction and classification method
CN103996047B (en) Hyperspectral image classification method based on squeezed spectra clustering ensemble
CN104732244B (en) The Classifying Method in Remote Sensing Image integrated based on wavelet transformation, how tactful PSO and SVM
CN106096655B (en) A kind of remote sensing image airplane detection method based on convolutional neural networks
CN107832797B (en) Multispectral image classification method based on depth fusion residual error network
CN109598306A (en) Hyperspectral image classification method based on SRCM and convolutional neural networks
CN109784205B (en) Intelligent weed identification method based on multispectral inspection image
CN112200123B (en) Hyperspectral open set classification method combining dense connection network and sample distribution
CN109671019B (en) Remote sensing image sub-pixel mapping method based on multi-objective optimization algorithm and sparse expression
Chi Self‐organizing map‐based color image segmentation with k‐means clustering and saliency map
CN113239938A (en) Hyperspectral classification method and system based on graph structure
CN108960276B (en) Sample expansion and consistency discrimination method for improving spectral image supervision classification performance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant