CN115578599A - Polarized SAR image classification method based on superpixel-hypergraph feature enhancement network - Google Patents
- Publication number
- CN115578599A (application number CN202211322174.0A)
- Authority
- CN
- China
- Prior art keywords
- feature
- superpixel
- matrix
- network
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
- G06V10/765—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/761—Proximity, similarity or dissimilarity measures
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V20/13—Satellite images
Abstract
The invention discloses a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network, comprising the following steps: step one, preprocessing the input data of a polarized SAR image; step two, obtaining a polarized SAR superpixel set by superpixel segmentation; step three, generating a superpixel polarization feature matrix and a spatial feature matrix, and constructing a polarization feature association matrix and a spatial feature association matrix; step four, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion; step five, constructing a feature reconstruction network and a local feature extraction network, and fusing the reconstructed features and local features to achieve feature enhancement; and step six, training the network with the training set and outputting the classification result. The superpixel-hypergraph feature enhancement network provided by the invention fully fuses global and local information by exploiting the polarization features and spatial features of the polarized SAR image, effectively improving the classification accuracy of polarized SAR images.
Description
Technical Field
The invention belongs to the field of polarized SAR image processing, and particularly relates to a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network.
Background
Polarimetric synthetic aperture radar can obtain high-resolution radar images under all weather conditions. A polarized SAR image contains abundant polarization information and can reflect the physical properties of the illuminated object. Polarized SAR images have therefore been widely used in ocean monitoring, urban planning, and geoscience.
With the continuous development of polarimetric synthetic aperture radar, the polarized SAR image classification task has received increasing attention. Polarimetric SAR image classification is a pixel-level classification task that assigns every pixel in an image to its corresponding class based on the information of each pixel unit. Existing polarized SAR image classification methods mainly fall into three categories: methods based on polarimetric decomposition, methods based on statistical characteristics, and methods based on machine learning. At present, when a convolutional neural network is used to classify a polarized SAR image, the network input is a square sampling block of fixed size, and the class label of the block's central pixel is used as the label of the whole block. This approach focuses only on the local information of the image and ignores its global information, which limits further improvement of classification accuracy. Graph neural networks can capture image global information, but because different objects can share similar polarization features, using polarization feature associations alone leads to misclassification of parts of those objects.
Disclosure of Invention
The invention aims to solve the problems that, in the context of deep learning, the feature information of polarized SAR images is not fully utilized and classification accuracy is difficult to improve further. It provides a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network that is simple in structure and reasonable in design: a superpixel segmentation technique divides the polarized SAR image into a series of superpixels; the polarization feature associations and spatial feature associations among the superpixels are constructed; a superpixel-hypergraph convolutional neural network extracts the global features of the polarized SAR image; and a feature reconstruction network and a local feature extraction network are constructed to fuse the global and local features of the polarized SAR image, further improving image classification accuracy.
In order to solve the above technical problems, the technical scheme adopted by the invention is as follows: a polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network, characterized by comprising the following steps:
step one, preprocessing input data of a polarized SAR image:
Step 101, the polarimetric scattering matrix of the polarized SAR, S = [S_hh, S_hv; S_vh, S_vv], is converted under the Pauli basis to obtain the coherency matrix T, where S_hh and S_vv represent the co-polarized components, S_hv and S_vh represent the cross-polarized components, and h and v denote horizontal and vertical polarization, respectively;
Step 102, the coherency matrix T is converted to obtain a 6-dimensional initial polarization feature vector, yielding polarized SAR input data of size I_h × I_w × 6, where I_h and I_w respectively represent the height and width of the polarized SAR image;
step two, obtaining a polarized SAR super-pixel set by utilizing super-pixel segmentation:
step 201, segmenting the polarized SAR image into M superpixel blocks by using a superpixel segmentation algorithm of simple linear iterative clustering;
Step 202, form the polarized SAR superpixel set S = {S_1, …, S_i, …, S_M}, where S_i represents the ith superpixel block;
Step three, generating the superpixel polarization feature matrix and spatial feature matrix, and constructing the polarization feature association matrix and spatial feature association matrix:
Step 301, in the polarized SAR superpixel set S, calculate the mean of the polarization features of all pixels within each superpixel to generate the polarization feature matrix X_pol = [x_1^pol, …, x_M^pol]^T ∈ R^(M×6), and calculate the mean of the row and column coordinates of all pixels within each superpixel to generate the spatial feature matrix X_spa = [x_1^spa, …, x_M^spa]^T ∈ R^(M×2), where x_i^pol and x_i^spa respectively represent the polarization feature vector and spatial feature vector of superpixel S_i;
Step 302, in the superpixel set S, calculate the similarity between the polarization features of the superpixels using the polarization feature matrix; for each superpixel, keep the k highest similarity values using the k-nearest neighbor algorithm and set the remaining similarity values to 0, generating the polarization feature association matrix H_pol ∈ R^(M×M);
Step 303, in the superpixel set S, calculate the similarity between the spatial features of the superpixels using the spatial feature matrix; for each superpixel, keep the similarity values with its spatially adjacent superpixels and set the remaining spatial similarity values to 0, generating the spatial feature association matrix H_spa ∈ R^(M×M);
Step four, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion:
Step 401, construct a two-layer superpixel-hypergraph convolutional neural network, input the polarization feature association matrix H_pol, the spatial feature association matrix H_spa, the polarization feature matrix X_pol and the spatial feature matrix X_spa into the network, and use the network to learn higher-level superpixel features X_sp ∈ R^(M×D_c), where D_c denotes the feature dimension of the superpixels output by the network;
Step 402, use the superpixel-to-pixel conversion matrix Q ∈ R^((I_h I_w)×M) to convert the superpixel features X_sp output by the superpixel-hypergraph convolutional neural network into pixel-level features X_pixel = Q X_sp ∈ R^((I_h I_w)×D_c), where I_h I_w represents the total number of pixels in the polarized SAR image;
step five, constructing a feature reconstruction network and a local feature extraction network, and fusing reconstruction features and local features to realize feature enhancement:
Step 501, construct a feature reconstruction network and input the obtained pixel-level features into it; the feature reconstruction expression is X_rec^(l) = f_rec(X_rec^(l−1) W_rec^(l) + b_rec^(l)), where X_rec^(l) represents the output of the lth hidden layer, X_rec^(0) is the input pixel-level feature, W_rec^(l) and b_rec^(l) respectively represent the feature reconstruction network weights and bias, and f_rec(·) represents the activation function of the feature reconstruction network;
Step 502, construct a local feature extraction network to extract the local features X_local of the polarized SAR image; the local feature extraction expression is X_local^(l) = f(W^(l) * X_local^(l−1) + b^(l)), where X_local^(l) represents the local features extracted by the lth layer, W^(l) represents the convolution kernel, * denotes the convolution operation, b^(l) represents the local feature extraction network bias, and f(·) represents the activation function of the feature extraction network;
Step 503, concatenate the output of the feature reconstruction network with the local features X_local to achieve feature enhancement, obtaining the overall features X_total finally used for classification;
Step six, training the network by using a training set, and outputting a classification result:
Step 601, input the overall features X_total into a Softmax classifier to obtain the prediction result P_jc for each pixel in the image;
Step 602, randomly select training samples at a proportion r to form a training set for training the network; the loss function during network training is L = αL_rec + L_c, where L_rec is the reconstruction loss, L_c is the classification loss, and α is a balance parameter.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the coherency matrix T obtained by converting the polarimetric scattering matrix S under the Pauli basis in step 101 is computed from the Pauli scattering vector k = (1/√2)[S_hh + S_vv, S_hh − S_vv, 2S_hv]^T as T = <k k^H>, where <·> denotes ensemble averaging and (·)^H denotes conjugate transpose.
the polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized by comprising the following steps of: the calculation formula of the 6-dimensional initial polarization eigenvector obtained by the conversion of the coherence matrix T in step 102 is as follows:
f 1 =10log 10 (T 11 +T 22 +T 33 )
f 2 =T 22 /(T 11 +T 22 +T 33 )
f 3 =T 33 /(T 11 +T 22 +T 33 )
wherein, T ij (i =1,2,3, j =1,2,3) represents an element corresponding to the ith row and the jth column of the matrix T, f is i (i =1, …, 6) represents a polarization characteristic value of the i-th dimension.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the polarization feature association matrix H_pol in step 302 is generated as follows:
h_pol(i, j) = exp(−||x_i^pol − x_j^pol||^2 / β), if x_j^pol ∈ KNN(x_i^pol); h_pol(i, j) = 0, otherwise,
where h_pol(i, j) represents the element in the ith row and jth column of the polarization feature association matrix H_pol, β represents an adjustable parameter, x_i^pol represents the 6-dimensional polarization feature vector of superpixel S_i, and KNN(·) represents the k-nearest neighbors in terms of polarization features.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the spatial feature association matrix H_spa in step 303 is generated as follows:
h_spa(i, j) = exp(−||x_i^spa − x_j^spa||^2 / γ), if S_j ∈ Neighbor(S_i); h_spa(i, j) = 0, otherwise,
where h_spa(i, j) represents the element in the ith row and jth column of the spatial feature association matrix H_spa, γ represents an adjustable parameter, x_i^spa represents the 2-dimensional spatial feature vector of superpixel S_i, and Neighbor(·) represents the spatially adjacent superpixels.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the propagation rule of the superpixel-hypergraph convolutional neural network in step 401 is as follows:
X^(l+1) = σ(D_v^(−1/2) H_fuse W D_e^(−1) H_fuse^T D_v^(−1/2) X^(l) Θ^(l)),
where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H_fuse is formed by concatenating the polarization feature association matrix H_pol and the spatial feature association matrix H_spa, X^(l) and X^(l+1) respectively represent the input and output features of the lth superpixel-hypergraph convolutional layer, the initial input features X^(0) are formed by concatenating the polarization feature matrix X_pol and the spatial feature matrix X_spa, D_v = Σ_(e∈E) W(e) H_fuse(v, e) denotes the vertex degree, D_e = Σ_(v∈V) H_fuse(v, e) denotes the hyperedge degree, and Θ^(l) represents the trainable filter matrix.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the superpixel-to-pixel conversion matrix Q ∈ R^((I_h I_w)×M) in step 402 is computed as follows:
Q(i, j) = 1, if p_i ∈ S_j; Q(i, j) = 0, otherwise,
where Q(i, j) represents the element in the ith row and jth column of the superpixel-to-pixel conversion matrix Q, i = 1, …, I_h × I_w, j = 1, …, M, p_i represents the ith pixel of the polarized SAR image, and S_j represents the jth superpixel.
The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network is characterized in that: the reconstruction loss L_rec and classification loss L_c in step 602 are computed respectively as follows:
L_rec = (1/N) Σ_(j=1..N) ||x_j^rec − x_j||^2,  L_c = −Σ_(j=1..N) Σ_(c=1..C) Y_jc log(P_jc),
where N represents the number of training samples, C represents the number of classes, x_j^rec and x_j respectively denote the reconstructed and input features of sample j, Y_jc represents the true value of sample j for class c, and P_jc represents the predicted label.
Compared with the prior art, the invention has the following advantages:
the method can fully utilize the polarization characteristics and the spatial characteristics of the polarized SAR image, and extract the global characteristics of the image by using the superpixel-hypergraph convolution neural network; a feature reconstruction network and a local feature extraction network are constructed, the global features and the local features of the polarized SAR image are fused, and the classification accuracy of the polarized SAR image is effectively improved. The method of the invention has simple structure and convenient realization, use and operation.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is an image of the Oberpfaffenhofen, Germany area used in the simulation of the present invention;
FIG. 3 is a diagram illustrating the classification effect obtained in the simulation of the present invention.
Detailed Description
The method of the present invention will be described in further detail below with reference to the accompanying drawings and embodiments of the invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For ease of description, spatially relative terms such as "above", "over", and "on" may be used herein to describe the spatial relationship of one device or feature to another as shown in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" those other devices or configurations. Thus, the exemplary term "above" may include both an "above" and a "below" orientation. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
As shown in FIG. 1, the polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network of the present invention comprises the following steps:
step one, preprocessing input data of a polarized SAR image:
Step 101, the polarimetric scattering matrix of the polarized SAR, S = [S_hh, S_hv; S_vh, S_vv], is converted under the Pauli basis to obtain the coherency matrix T, where S_hh and S_vv represent the co-polarized components, S_hv and S_vh represent the cross-polarized components, and h and v denote horizontal and vertical polarization, respectively; the coherency matrix T is computed from the Pauli scattering vector k = (1/√2)[S_hh + S_vv, S_hh − S_vv, 2S_hv]^T as T = <k k^H>, where <·> denotes ensemble averaging and (·)^H denotes conjugate transpose;
Step 102, the coherency matrix T is converted to obtain a 6-dimensional initial polarization feature vector, yielding polarized SAR image input data of size I_h × I_w × 6, where I_h and I_w respectively represent the height and width of the polarized SAR image; the 6-dimensional initial polarization feature vector is computed from the coherency matrix T as follows:
f_1 = 10 log10(T_11 + T_22 + T_33)
f_2 = T_22 / (T_11 + T_22 + T_33)
f_3 = T_33 / (T_11 + T_22 + T_33)
where T_ij (i = 1, 2, 3; j = 1, 2, 3) represents the element in the ith row and jth column of the matrix T, and f_i (i = 1, …, 6) represents the polarization feature value of the ith dimension;
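As an illustration, the feature formulas above can be evaluated directly from a coherency matrix. The sketch below is a hypothetical numpy helper, not the patent's code; it computes f_1 through f_3 only, since the formulas for the remaining three dimensions appear only in the patent's figures and are not reproduced here.

```python
import numpy as np

def polarization_features(T):
    """Compute the first three of the six initial polarization features
    from a 3x3 coherency matrix T, following the formulas of step 102."""
    span = T[0, 0].real + T[1, 1].real + T[2, 2].real
    f1 = 10.0 * np.log10(span)   # total scattered power in dB
    f2 = T[1, 1].real / span     # normalized T22
    f3 = T[2, 2].real / span     # normalized T33
    return f1, f2, f3

# Example coherency matrix (diagonal, purely for illustration)
T = np.diag([4.0, 2.0, 2.0]).astype(complex)
f1, f2, f3 = polarization_features(T)
```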
step two, obtaining a polarized SAR super-pixel set by utilizing super-pixel segmentation:
Step 201, segment the I_h × I_w × 6 polarized SAR image obtained in step one into M superpixel blocks using the simple linear iterative clustering superpixel segmentation algorithm;
Step 202, form the polarized SAR superpixel set S = {S_1, …, S_M}, where S_i represents the ith superpixel block;
In a specific implementation, the number of superpixel blocks M is determined by the input image size and the segment size set when applying the simple linear iterative clustering superpixel segmentation method;
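The segmentation step can be sketched as follows. This is a deliberately simplified, hypothetical numpy re-implementation of simple linear iterative clustering (grid-seeded centers, a joint feature-plus-spatial distance, a few assignment/update iterations), not the patent's implementation; in practice a library routine such as scikit-image's slic would normally be used.

```python
import numpy as np

def simple_slic(feat, n_segments=4, n_iter=5, compactness=0.1):
    """Minimal SLIC-style segmentation sketch: seed cluster centers on a
    regular grid, then iterate assignment by combined feature + spatial
    distance and center update."""
    H, W, D = feat.shape
    g = int(np.sqrt(n_segments))
    ys = np.linspace(0, H - 1, g + 2)[1:-1]
    xs = np.linspace(0, W - 1, g + 2)[1:-1]
    centers = []  # each center: (feature vector, y, x)
    for y in ys:
        for x in xs:
            yi, xi = int(y), int(x)
            centers.append(np.concatenate([feat[yi, xi], [yi, xi]]))
    centers = np.array(centers)
    yy, xx = np.mgrid[0:H, 0:W]
    px = np.concatenate([feat.reshape(-1, D),
                         np.stack([yy.ravel(), xx.ravel()], axis=1)], axis=1)
    for _ in range(n_iter):
        df = ((px[:, None, :D] - centers[None, :, :D]) ** 2).sum(-1)
        ds = ((px[:, None, D:] - centers[None, :, D:]) ** 2).sum(-1)
        labels = np.argmin(df + compactness * ds, axis=1)
        for m in range(len(centers)):
            sel = px[labels == m]
            if len(sel):
                centers[m] = sel.mean(axis=0)
    return labels.reshape(H, W)

# Tiny synthetic 6-channel "polarized SAR" feature image
rng = np.random.default_rng(0)
img = rng.normal(size=(12, 12, 6))
labels = simple_slic(img, n_segments=4)
M = labels.max() + 1
```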
Step three, generating the superpixel polarization feature matrix and spatial feature matrix, and constructing the polarization feature association matrix and spatial feature association matrix:
Step 301, in the polarized SAR superpixel set S, calculate the mean of the polarization features of all pixels within each superpixel to generate the polarization feature matrix X_pol = [x_1^pol, …, x_M^pol]^T ∈ R^(M×6), and calculate the mean of the row and column coordinates of all pixels within each superpixel to generate the spatial feature matrix X_spa = [x_1^spa, …, x_M^spa]^T ∈ R^(M×2), where x_i^pol and x_i^spa respectively represent the polarization feature vector and spatial feature vector of superpixel S_i;
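Step 301's per-superpixel means can be sketched as follows (a hypothetical numpy helper, assuming per-pixel features, coordinates, and a superpixel label per pixel):

```python
import numpy as np

def superpixel_features(feat, coords, labels, M):
    """Build the polarization feature matrix X_pol (per-superpixel mean of
    pixel polarization features) and the spatial feature matrix X_spa
    (per-superpixel mean of pixel coordinates), as in step 301."""
    D = feat.shape[1]
    X_pol = np.zeros((M, D))
    X_spa = np.zeros((M, 2))
    for m in range(M):
        sel = labels == m
        X_pol[m] = feat[sel].mean(axis=0)
        X_spa[m] = coords[sel].mean(axis=0)
    return X_pol, X_spa

# 4 pixels grouped into 2 superpixels, 6-dim polarization features
feat = np.arange(24, dtype=float).reshape(4, 6)
coords = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
labels = np.array([0, 0, 1, 1])
X_pol, X_spa = superpixel_features(feat, coords, labels, 2)
```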
Step 302, in the superpixel set S, calculate the similarity between the polarization features of the superpixels using the polarization feature matrix; for each superpixel, keep the k highest similarity values using the k-nearest neighbor algorithm and set the remaining similarity values to 0, generating the polarization feature association matrix H_pol ∈ R^(M×M), computed as:
h_pol(i, j) = exp(−||x_i^pol − x_j^pol||^2 / β), if x_j^pol ∈ KNN(x_i^pol); h_pol(i, j) = 0, otherwise,
where h_pol(i, j) represents the element in the ith row and jth column of the polarization feature association matrix H_pol, β represents an adjustable parameter, x_i^pol represents the 6-dimensional polarization feature vector of superpixel S_i, and KNN(·) represents the k-nearest neighbors in terms of polarization features;
In a specific implementation, the number of neighbors k in the k-nearest neighbor method is set to 3, and the adjustable parameter β is set to 100;
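With k = 3 and β = 100, the construction of H_pol might look like the sketch below. The Gaussian similarity exp(-||x_i - x_j||^2 / β) is an assumption here, since the patent's exact similarity formula appears only in its figures; the k-highest-similarity thresholding follows step 302.

```python
import numpy as np

def polarization_association(X_pol, k=3, beta=100.0):
    """Sketch of the polarization feature association matrix H_pol:
    Gaussian similarity between superpixel polarization feature vectors,
    keeping only the k highest values per row (rest set to 0)."""
    M = X_pol.shape[0]
    d2 = ((X_pol[:, None] - X_pol[None, :]) ** 2).sum(-1)
    sim = np.exp(-d2 / beta)
    H = np.zeros((M, M))
    for i in range(M):
        idx = np.argsort(-sim[i])[:k]   # k highest similarities (incl. self)
        H[i, idx] = sim[i, idx]
    return H

rng = np.random.default_rng(1)
X_pol = rng.normal(size=(8, 6))         # 8 superpixels, 6-dim features
H_pol = polarization_association(X_pol, k=3)
```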
Step 303, in the superpixel set S, calculate the similarity between the spatial features of the superpixels using the spatial feature matrix; for each superpixel, keep the similarity values with its spatially adjacent superpixels and set the remaining spatial similarity values to 0, generating the spatial feature association matrix H_spa ∈ R^(M×M), computed as:
h_spa(i, j) = exp(−||x_i^spa − x_j^spa||^2 / γ), if S_j ∈ Neighbor(S_i); h_spa(i, j) = 0, otherwise,
where h_spa(i, j) represents the element in the ith row and jth column of the spatial feature association matrix H_spa, γ represents an adjustable parameter, x_i^spa represents the 2-dimensional spatial feature vector of superpixel S_i, and Neighbor(·) represents the spatially adjacent superpixels;
In a specific implementation, the adjustable parameter γ is set to 80;
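A corresponding sketch for H_spa follows, using label-map adjacency to stand in for Neighbor(·) and a Gaussian similarity over superpixel centroids with parameter γ; both choices are assumptions in place of the patent's figure-only formulas.

```python
import numpy as np

def spatial_association(labels, X_spa, gamma=80.0):
    """Sketch of the spatial feature association matrix H_spa: superpixels
    sharing a boundary in the label map count as neighbors; the Gaussian
    centroid similarity is kept for neighbors, 0 elsewhere."""
    M = X_spa.shape[0]
    adj = np.zeros((M, M), dtype=bool)
    # horizontally / vertically adjacent pixels with different labels
    a, b = labels[:, :-1], labels[:, 1:]
    adj[a[a != b], b[a != b]] = True
    a, b = labels[:-1, :], labels[1:, :]
    adj[a[a != b], b[a != b]] = True
    adj |= adj.T
    np.fill_diagonal(adj, True)   # keep self-similarity
    d2 = ((X_spa[:, None] - X_spa[None, :]) ** 2).sum(-1)
    return np.where(adj, np.exp(-d2 / gamma), 0.0)

# 2x4 label map with 4 superpixels arranged in a row
labels = np.array([[0, 1, 2, 3],
                   [0, 1, 2, 3]])
X_spa = np.array([[0.5, 0], [0.5, 1], [0.5, 2], [0.5, 3]], dtype=float)
H_spa = spatial_association(labels, X_spa)
```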
Step four, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion:
Step 401, construct a two-layer superpixel-hypergraph convolutional neural network, input the polarization feature association matrix H_pol, the spatial feature association matrix H_spa, the polarization feature matrix X_pol and the spatial feature matrix X_spa into the network, and use the network to learn higher-level superpixel features X_sp ∈ R^(M×D_c), where D_c denotes the feature dimension of the superpixels output by the network; the propagation rule of the superpixel-hypergraph convolutional neural network is as follows:
X^(l+1) = σ(D_v^(−1/2) H_fuse W D_e^(−1) H_fuse^T D_v^(−1/2) X^(l) Θ^(l)),
where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H_fuse is formed by concatenating the polarization feature association matrix H_pol and the spatial feature association matrix H_spa, X^(l) and X^(l+1) respectively represent the input and output features of the lth superpixel-hypergraph convolutional layer, the initial input features X^(0) are formed by concatenating the polarization feature matrix X_pol and the spatial feature matrix X_spa, D_v = Σ_(e∈E) W(e) H_fuse(v, e) denotes the vertex degree, D_e = Σ_(v∈V) H_fuse(v, e) denotes the hyperedge degree, and Θ^(l) represents the trainable filter matrix;
In a specific implementation, the trainable hyperedge weight matrix W has size 2M × 2M, the trainable filter matrix Θ^(0) in the first superpixel-hypergraph convolutional layer is set to 8 × 32, the trainable filter matrix Θ^(1) in the second layer is set to 32 × 64, and the feature dimension D_c of the superpixels output by the two-layer superpixel-hypergraph convolutional neural network is 64;
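One superpixel-hypergraph convolution layer with these dimensions (8-dimensional input features from the concatenated X_pol and X_spa, a first-layer filter of size 8 × 32) can be sketched as below, assuming the standard hypergraph neural network propagation rule X' = ReLU(Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2) X Θ) that the patent's formula appears to correspond to; all parameters here are random stand-ins for trained ones.

```python
import numpy as np

def hypergraph_conv(X, H, W_e, Theta):
    """One hypergraph convolution layer: degree-normalized propagation
    through the incidence matrix H, followed by ReLU."""
    Dv = (H * W_e).sum(axis=1)        # weighted vertex degrees
    De = H.sum(axis=0)                # hyperedge degrees
    Dv_is = np.diag(1.0 / np.sqrt(Dv))
    De_i = np.diag(1.0 / De)
    out = Dv_is @ H @ np.diag(W_e) @ De_i @ H.T @ Dv_is @ X @ Theta
    return np.maximum(out, 0.0)       # ReLU

rng = np.random.default_rng(2)
M, E = 6, 6
H_fuse = (rng.random((M, E)) > 0.4).astype(float)
H_fuse[np.arange(M), np.arange(E)] = 1.0   # avoid empty rows/columns
X = rng.normal(size=(M, 8))                # 8-dim input superpixel features
Theta0 = rng.normal(size=(8, 32))          # first-layer filter, 8 -> 32
W_e = np.ones(E)                           # hyperedge weights
X1 = hypergraph_conv(X, H_fuse, W_e, Theta0)
```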
Step 402, use the superpixel-to-pixel conversion matrix Q ∈ R^((I_h I_w)×M), computed as Q(i, j) = 1 if p_i ∈ S_j and Q(i, j) = 0 otherwise, where Q(i, j) represents the element in the ith row and jth column of Q, i = 1, …, I_h × I_w, j = 1, …, M, p_i represents the ith pixel of the polarized SAR image and S_j represents the jth superpixel, to convert the superpixel features X_sp output by the superpixel-hypergraph convolutional neural network into pixel-level features X_pixel = Q X_sp, where I_h I_w represents the total number of pixels in the polarized SAR image;
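The conversion matrix Q and the superpixel-to-pixel broadcast of step 402 can be sketched as (hypothetical numpy helper):

```python
import numpy as np

def superpixel_to_pixel(labels, X_sp):
    """Build Q with Q[i, j] = 1 when pixel i belongs to superpixel j, then
    broadcast per-superpixel features down to pixel-level features."""
    flat = labels.ravel()
    n_pix, M = flat.size, X_sp.shape[0]
    Q = np.zeros((n_pix, M))
    Q[np.arange(n_pix), flat] = 1.0
    return Q, Q @ X_sp                # (pixels x M), pixel-level features

labels = np.array([[0, 0], [1, 1]])          # 4 pixels, 2 superpixels
X_sp = np.array([[1.0, 2.0], [3.0, 4.0]])    # per-superpixel features
Q, X_pixel = superpixel_to_pixel(labels, X_sp)
```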
step five, constructing a feature reconstruction network and a local feature extraction network, and fusing reconstruction features and local features to realize feature enhancement:
Step 501, construct a feature reconstruction network and input the obtained pixel-level features into it; the feature reconstruction expression is X_rec^(l) = f_rec(X_rec^(l−1) W_rec^(l) + b_rec^(l)), where X_rec^(l) represents the output of the lth hidden layer, X_rec^(0) is the input pixel-level feature, W_rec^(l) and b_rec^(l) respectively represent the feature reconstruction network weights and bias, and f_rec(·) represents the activation function of the feature reconstruction network;
In a specific implementation, the feature reconstruction network consists of three fully connected layers, with output dimensions 64, 32 and 6, respectively;
Step 502, construct a local feature extraction network to extract the local features X_local of the polarized SAR image; the local feature extraction expression is X_local^(l) = f(W^(l) * X_local^(l−1) + b^(l)), where X_local^(l) represents the local features extracted by the lth layer, W^(l) represents the convolution kernel, * denotes the convolution operation, b^(l) represents the local feature extraction network bias, and f(·) represents the activation function of the feature extraction network;
In a specific implementation, the local feature extraction network consists of four convolutional layers with kernel sizes 1 × 1, 5 × 5, 1 × 1 and 5 × 5, respectively; the output dimensions of the four layers are 32, 32, 64 and 64, and the input of the local feature extraction network is the initial whole polarized SAR image;
Step 503, concatenate the output of the feature reconstruction network with the local features X_local to achieve feature enhancement, obtaining the overall features X_total finally used for classification;
in a specific implementation, the output of the first layer of the feature reconstruction network (dimension 64) is concatenated with the local features X_local (dimension 64), so the resulting X_total has dimension 128;
step six, training the network by using a training set, and outputting a classification result:
step 601, inputting the overall features X_total into a Softmax classifier to obtain the prediction result P_jc for each pixel in the image;
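Step 601's Softmax classifier can be sketched in NumPy (the linear layer producing the logits from X_total is omitted here):

```python
import numpy as np

def softmax(z):
    """Row-wise softmax: P[j, c] = exp(z_jc) / sum over c' of exp(z_jc')."""
    z = z - z.max(axis=1, keepdims=True)   # subtract the row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 1.0, 0.1],
                   [0.0, 0.0, 0.0]])
P = softmax(logits)
```

Each row of P sums to 1 and gives the predicted class probabilities P_jc for one pixel.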
step 602, randomly selecting training samples at a ratio of r = 5% from each class to form the training set and training the network. The loss function during training is L = α·L_rec + L_c, where L_rec is the reconstruction loss, L_c is the classification loss, and α is a balance parameter set to 0.05. The reconstruction loss L_rec and the classification loss L_c are computed as:

L_rec = (1/N) Σ_j ||x_j − x̂_j||², L_c = −(1/N) Σ_j Σ_c Y_jc·log(P_jc),

where x̂_j denotes the reconstruction of the input feature vector x_j, N denotes the number of training samples, C denotes the number of classes, Y_jc denotes the true label of sample j for class c, and P_jc denotes the predicted label.
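A sketch of the combined loss L = α·L_rec + L_c, under the assumption (the formulas appear only as images in the source) of a mean-squared reconstruction loss and a cross-entropy classification loss; all names are illustrative:

```python
import numpy as np

def combined_loss(X_in, X_hat, Y, P, alpha=0.05):
    """L = alpha * L_rec + L_c, assuming (as a sketch, not the patent's exact
    definition):
      L_rec = (1/N) * sum_j ||x_j - x_hat_j||^2       (reconstruction loss)
      L_c   = -(1/N) * sum_j sum_c Y_jc * log(P_jc)   (cross-entropy loss)
    """
    N = X_in.shape[0]
    L_rec = np.sum((X_in - X_hat) ** 2) / N
    L_c = -np.sum(Y * np.log(P + 1e-12)) / N   # small epsilon avoids log(0)
    return alpha * L_rec + L_c

# Toy check: perfect reconstruction and confident correct predictions
# should drive the loss to (approximately) zero.
X = np.ones((4, 6))
Y = np.eye(4)                               # one-hot labels, 4 samples / 4 classes
P = np.full((4, 4), 1e-12)
np.fill_diagonal(P, 1.0)
loss = combined_loss(X, X, Y, P)
```

The balance parameter α = 0.05 keeps the auxiliary reconstruction term from dominating the classification objective.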
The effectiveness of the invention can be further confirmed by the following simulation experiments:
1. experimental conditions and methods
The hardware platform is as follows: Intel(R) Core(TM) i5-10600K CPU @ 4.10GHz, 16.0GB RAM;
the software platform is as follows: PyTorch 1.10;
the experimental methods are as follows: a convolutional neural network, a graph convolutional neural network, and the method of the present invention, respectively.
2. Simulation content and results
The image of the Oberpfaffenhofen area, Germany, shown in fig. 2 is taken as the test image, and classification simulations are carried out on the image of fig. 2 using a convolutional neural network, a graph convolutional neural network, and the method of the present invention, respectively; the classification results are shown in fig. 3, where fig. 3 (a) is the result of classification using a convolutional neural network, fig. 3 (b) is the result of classification using a graph convolutional neural network, and fig. 3 (c) is the result of classification using the present invention. As can be seen from fig. 3, compared with the convolutional neural network method, the classification result of the method of the present invention contains far fewer noisy pixels, and compared with the graph convolutional neural network method, the classification result is more accurate. Table 1 shows the classification accuracy on the Oberpfaffenhofen, Germany image, where OA denotes the overall classification accuracy; it can be seen from Table 1 that the method of the present invention achieves higher classification accuracy than both the convolutional neural network and the graph convolutional neural network.
TABLE 1 Oberpfaffenhofen region of Germany image classification results
The above embodiments are only examples of the present invention, and are not intended to limit the present invention, and all simple modifications, changes and equivalent structural changes made to the above embodiments according to the technical spirit of the present invention still fall within the protection scope of the technical solution of the present invention.
Claims (8)
1. A polarized SAR image classification method based on a superpixel-hypergraph feature enhancement network, characterized by comprising the following steps:
step one, preprocessing input data of a polarized SAR image:
step 101, converting the polarimetric scattering matrix S = [S_hh, S_hv; S_vh, S_vv] of the polarized SAR data under the Pauli basis to obtain the coherency matrix T, where S_hh and S_vv denote the co-polarized components, S_hv and S_vh denote the cross-polarized components, and h and v denote horizontal and vertical polarization, respectively;
step 102, converting the coherency matrix T to obtain a 6-dimensional initial polarization feature vector, and further obtaining polarized SAR input data of size I_h × I_w × 6, where I_h and I_w denote the length and width of the polarized SAR image, respectively;
step two, obtaining a polarized SAR super-pixel set by utilizing super-pixel segmentation:
step 201, segmenting the polarized SAR image into M superpixel blocks using the simple linear iterative clustering (SLIC) superpixel segmentation algorithm;
step 202, forming the polarized SAR superpixel set S = {S_1, …, S_i, …, S_M}, where S_i denotes the i-th superpixel block;
step three, generating a superpixel polarization feature matrix and a spatial feature matrix, and constructing a polarization feature association matrix and a spatial feature association matrix:
step 301, in the polarized SAR superpixel set S, computing the mean of the polarization features of all pixels within each superpixel to generate the polarization feature matrix X_pol, and computing the mean of the horizontal and vertical coordinates of all pixels within each superpixel to generate the spatial feature matrix X_spa, where x_i^pol and x_i^spa denote the polarization feature vector and the spatial feature vector of superpixel S_i, respectively;
step 302, in the superpixel set S, computing the similarity between superpixel polarization features using the polarization feature matrix; for each superpixel, keeping the k highest similarity values selected by the k-nearest-neighbor algorithm and setting the remaining similarity values to 0, thereby generating the polarization feature association matrix H_pol ∈ R^(M×M);
step 303, in the superpixel set S, computing the similarity between superpixel spatial features using the spatial feature matrix; for each superpixel, keeping the similarity values of its spatially adjacent superpixels and setting the remaining spatial similarity values to 0, thereby generating the spatial feature association matrix H_spa ∈ R^(M×M);
step four, constructing a two-layer superpixel-hypergraph convolutional neural network and performing superpixel-level to pixel-level feature conversion:
step 401, constructing a two-layer superpixel-hypergraph convolutional neural network; inputting the polarization feature association matrix H_pol, the spatial feature association matrix H_spa, the polarization feature matrix X_pol, and the spatial feature matrix X_spa into the network, and using the network to learn higher-level superpixel features X_sp ∈ R^(M×D_c), where D_c denotes the feature dimension of the superpixels output by the network;
step 402, using the superpixel-to-pixel conversion matrix Q ∈ R^((I_h×I_w)×M) to convert the superpixel features X_sp output by the superpixel-hypergraph convolutional neural network into pixel-level features X_pixel = Q·X_sp, where I_h × I_w denotes the total number of pixels of the polarized SAR image;
step five, constructing a feature reconstruction network and a local feature extraction network, and fusing the reconstructed features and the local features to realize feature enhancement:
step 501, constructing a feature reconstruction network and inputting the obtained pixel-level features into it. The feature reconstruction expression is X_rec^(l) = f_rec(W^(l)·X_rec^(l−1) + b^(l)), where X_rec^(l) denotes the output of the l-th hidden layer, W^(l) and b^(l) denote the weights and bias of the feature reconstruction network, and f_rec(·) denotes the activation function of the feature reconstruction network;
step 502, constructing a local feature extraction network to extract the local features X_local of the polarized SAR image. The local feature extraction expression is X_local^(l) = f(W^(l) ∗ X_local^(l−1) + b^(l)), where X_local^(l) denotes the local features extracted by the l-th layer, W^(l) denotes the convolution kernel, ∗ denotes the convolution operation, b^(l) denotes the local feature extraction network bias, and f(·) denotes the activation function of the local feature extraction network;
step 503, concatenating the output X_rec of the feature reconstruction network with the local features X_local to achieve feature enhancement, obtaining the overall features X_total finally used for classification;
Step six, training the network by using a training set, and outputting a classification result:
step 601, inputting the overall features X_total into a Softmax classifier to obtain the prediction result P_jc for each pixel in the image;
step 602, randomly selecting training samples at a ratio r to form the training set and training the network, where the loss function during training is L = α·L_rec + L_c, L_rec is the reconstruction loss, L_c is the classification loss, and α is a balance parameter.
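As an illustrative aside (not part of the claims), steps 201-202 amount to grouping pixel coordinates by a superpixel label map; the helper name `superpixel_set` is hypothetical, and the label map stands in for the output of an SLIC-style segmentation:

```python
import numpy as np

def superpixel_set(seg):
    """Group pixel coordinates by superpixel label to form S = {S_1, ..., S_M}.
    seg : (Ih, Iw) int label map, e.g. produced by an SLIC-style segmentation
    Returns a list of (row, col) coordinate arrays, one per superpixel block."""
    M = int(seg.max()) + 1
    flat = seg.ravel()
    Ih, Iw = seg.shape
    coords = np.stack(np.unravel_index(np.arange(Ih * Iw), (Ih, Iw)), axis=1)
    return [coords[flat == m] for m in range(M)]

seg = np.array([[0, 0, 1],
                [0, 1, 1]])
S = superpixel_set(seg)
```

Each entry S[i] then supplies the pixel members over which step 301 averages polarization features and coordinates.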
3. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the calculation formula of the 6-dimensional initial polarization feature vector obtained by conversion of the coherency matrix T in step 102 is as follows:
f_1 = 10·log_10(T_11 + T_22 + T_33)
f_2 = T_22 / (T_11 + T_22 + T_33)
f_3 = T_33 / (T_11 + T_22 + T_33)
where T_ij (i = 1, 2, 3; j = 1, 2, 3) denotes the element in row i and column j of the matrix T, and f_i (i = 1, …, 6) denotes the polarization feature value of the i-th dimension.
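The three recoverable feature components of claim 3 can be sketched directly from a coherency matrix T; `initial_polarization_features` is a hypothetical helper, and the remaining three components (given only as images in the source) are not reproduced:

```python
import numpy as np

def initial_polarization_features(T):
    """First three of the 6-dimensional initial polarization features of
    claim 3, computed from the 3x3 coherency matrix T."""
    span = T[0, 0] + T[1, 1] + T[2, 2]      # total power T11 + T22 + T33
    f1 = 10.0 * np.log10(span)              # span in dB
    f2 = T[1, 1] / span                     # normalized T22 power fraction
    f3 = T[2, 2] / span                     # normalized T33 power fraction
    return np.real(np.array([f1, f2, f3]))

T = np.diag([8.0, 1.0, 1.0])                # toy coherency matrix, span = 10
f = initial_polarization_features(T)
```

Normalizing f2 and f3 by the span and taking the span in dB keeps the features comparable across pixels with very different backscatter power.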
4. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the calculation formula for generating the polarization feature association matrix H_pol in step 302 is as follows:

h_pol(i,j) = exp(−||x_i^pol − x_j^pol||² / β) if x_j^pol ∈ KNN(x_i^pol), and h_pol(i,j) = 0 otherwise,

where h_pol(i,j) denotes the element in row i and column j of the polarization feature association matrix H_pol, β denotes an adjustable parameter, x_i^pol denotes the 6-dimensional polarization feature vector of superpixel S_i, and KNN(·) denotes the k nearest neighbors in terms of polarization features.
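A sketch of the k-nearest-neighbor association construction of claim 4, assuming a Gaussian similarity (the exact formula appears only as an image in the source); all names are illustrative, and the same routine serves for the spatial matrix of claim 5 with adjacency-based masking instead:

```python
import numpy as np

def knn_association(X, k, beta=1.0):
    """Build an M x M association matrix from pairwise Gaussian similarities,
    keeping only each row's k largest values and zeroing the rest.
    X : (M, D) feature matrix (e.g. per-superpixel polarization features)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    sim = np.exp(-d2 / beta)
    H = np.zeros_like(sim)
    for i in range(len(X)):
        nn = np.argsort(sim[i])[-k:]        # k most similar (includes self)
        H[i, nn] = sim[i, nn]
    return H

X_pol = np.array([[0.0], [0.1], [5.0]])     # 3 superpixels, 1-dim toy features
H_pol = knn_association(X_pol, k=2)
```

Sparsifying to the k nearest neighbors is what turns dense pairwise similarity into hyperedges connecting only genuinely related superpixels.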
5. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the spatial feature association matrix H_spa in step 303 is computed analogously, with h_spa(i,j) taking the spatial feature similarity between superpixel S_i and its spatially adjacent superpixels S_j, and 0 otherwise.
6. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the propagation rule of the superpixel-hypergraph convolutional neural network in step 401 is as follows:

X^(l+1) = σ(D_v^(−1/2) H_fuse W D_e^(−1) H_fuse^T D_v^(−1/2) X^(l) Θ^(l)),

where σ(·) is the ReLU activation function, W is the trainable hyperedge weight matrix, H_fuse is formed by concatenating the polarization feature association matrix H_pol and the spatial feature association matrix H_spa, X^(l) and X^(l+1) denote the input and output features of the l-th superpixel-hypergraph convolutional layer, the initial input features X^(0) are formed by concatenating the polarization feature matrix X_pol and the spatial feature matrix X_spa, D_v denotes the vertex degree matrix of the hypergraph, D_e = Σ_{v∈V} H_fuse(v,e) denotes the hyperedge degree, and Θ^(l) denotes a trainable filter matrix.
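The degree terms described in claim 6 match the standard hypergraph convolution used in hypergraph neural networks; the following NumPy sketch implements one such layer under that assumption (the propagation formula itself appears only as an image in the source), with all names illustrative:

```python
import numpy as np

def hypergraph_conv(X, H, W_e, Theta):
    """One hypergraph convolution layer:
      X_out = ReLU(Dv^(-1/2) H W De^(-1) H^T Dv^(-1/2) X Theta)
    X     : (M, D) node (superpixel) features
    H     : (M, E) incidence matrix, e.g. H_fuse = [H_pol, H_spa]
    W_e   : (E,)   hyperedge weights
    Theta : (D, F) trainable filter matrix."""
    Dv = (H * W_e).sum(axis=1)              # vertex degrees
    De = H.sum(axis=0)                      # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    A = Dv_inv_sqrt @ H @ np.diag(W_e) @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU activation

# Toy hypergraph: 4 superpixels, 3 hyperedges, 5-dim features mapped to 2-dim.
rng = np.random.default_rng(2)
H = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
out = hypergraph_conv(rng.standard_normal((4, 5)), H, np.ones(3),
                      rng.standard_normal((5, 2)))
```

Normalizing by both vertex and hyperedge degrees keeps the propagation operator well-scaled regardless of how many superpixels each hyperedge connects.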
7. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the superpixel-to-pixel conversion matrix Q ∈ R^((I_h×I_w)×M) in step 402 is computed as follows:

Q(i,j) = 1 if pixel p_i belongs to superpixel S_j, and Q(i,j) = 0 otherwise,

where Q(i,j) denotes the element in row i and column j of the superpixel-to-pixel conversion matrix Q, i = 1, …, I_h × I_w, j = 1, …, M, p_i denotes the i-th pixel of the polarized SAR image, and S_j denotes the j-th superpixel.
8. The polarized SAR image classification method based on the superpixel-hypergraph feature enhancement network as claimed in claim 1, characterized in that: the reconstruction loss L_rec and the classification loss L_c in step 602 are computed as follows:

L_rec = (1/N) Σ_j ||x_j − x̂_j||², L_c = −(1/N) Σ_j Σ_c Y_jc·log(P_jc),

where x̂_j denotes the reconstruction of the input feature vector x_j, N denotes the number of training samples, C denotes the number of classes, Y_jc denotes the true label of sample j for class c, and P_jc denotes the predicted label.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211322174.0A CN115578599A (en) | 2022-10-27 | 2022-10-27 | Polarized SAR image classification method based on superpixel-hypergraph feature enhancement network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115578599A true CN115578599A (en) | 2023-01-06 |
Family
ID=84587433
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117315381A (en) * | 2023-11-30 | 2023-12-29 | 昆明理工大学 | Hyperspectral image classification method based on second-order biased random walk |
CN117315381B (en) * | 2023-11-30 | 2024-02-09 | 昆明理工大学 | Hyperspectral image classification method based on second-order biased random walk |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |