CN113792761A - Remote sensing image classification method based on Gabor characteristics and EMAP characteristics - Google Patents

Remote sensing image classification method based on Gabor characteristics and EMAP characteristics

Info

Publication number
CN113792761A
Authority
CN
China
Prior art keywords
gabor
emap
features
feature
characteristic
Prior art date
Legal status
Granted
Application number
CN202110957831.8A
Other languages
Chinese (zh)
Other versions
CN113792761B (en)
Inventor
江玲
周付根
史洁玉
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University
Priority to CN202110957831.8A
Publication of CN113792761A
Application granted
Publication of CN113792761B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 — Pattern recognition
    • G06F 18/20 — Analysing
    • G06F 18/24 — Classification techniques
    • G06F 18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415 — Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F 18/25 — Fusion techniques
    • G06F 18/253 — Fusion techniques of extracted features

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a remote sensing image classification method based on Gabor features and EMAP features, which comprises the following steps: extracting Gabor features from the remote sensing image and constructing a Gabor feature kernel, and likewise extracting EMAP features from the remote sensing image and constructing an EMAP feature kernel; fusing the constructed Gabor feature kernel and EMAP feature kernel with a feature composite kernel framework to obtain a feature composite kernel; and classifying the samples in the remote sensing image with a multinomial logistic regression model based on the obtained feature composite kernel. By fusing the Gabor features with the EMAP features, the method combines spectral and spatial information, avoids the adverse effects of the raw spectral data, copes with class imbalance and small sample counts, effectively alleviates the information redundancy caused by the strong correlation of spectral bands, and improves classification accuracy.

Description

Remote sensing image classification method based on Gabor characteristics and EMAP characteristics
Technical Field
The invention belongs to the field of interpretation, identification and processing of remote sensing images, and particularly relates to a method for classifying hyperspectral remote sensing images based on Gabor features and extended multi-attribute profile (EMAP) features.
Background
In the hyperspectral remote sensing image, the spectral information of different types of ground objects is different. The hyperspectral image classification technology has great significance for promoting the development of the application of the remote sensing technology, but the technology faces the problems of high dimensionality of hyperspectral image data, few marked samples, mixed pixels, low data quality and the like.
To address the small number of labeled samples in hyperspectral images, researchers in China and abroad have proposed a series of hyperspectral image classification algorithms. Currently the most widely used are supervised learning algorithms, which train a classifier with the prior information in the hyperspectral image, namely its features, in order to classify the samples.
When supervised learning algorithms are used, feature extraction is usually required first. Widely applied spectral feature extraction methods include principal component analysis and independent component analysis. To further improve classification accuracy, spatial feature extraction methods have been proposed to capture spatial context information. Common spatial features include those based on mathematical morphology, such as morphological profile features and the extended multi-attribute profile features that improve on them. In addition, transform-domain feature extraction algorithms transform the original hyperspectral image and use the transformed responses as spatial features, Gabor features being one example. Prior-art methods retain the original spectral data. The raw spectra are of low quality and exhibit the phenomena of "same object, different spectra" and "different objects, same spectrum", which strongly degrades classification accuracy when samples are few. To fuse spectral and spatial features, several fusion schemes have been proposed, of which the generalized composite kernel framework is a flexible one. For the classifier, support vector machines and joint sparse representation are commonly adopted; however, the support vector machine is computationally expensive, and joint sparse representation classifies poorly when the numbers of samples of different classes are unbalanced.
Deep learning methods have also begun to be applied to the remote sensing image classification problem in recent years. However, with only a small number of samples their classification accuracy is not high, and they take considerably longer than traditional methods.
Disclosure of Invention
Therefore, the invention provides a fused feature expression method, which combines the spectral information and the spatial information by fusing the Gabor features and the EMAP features to avoid the adverse effect of original spectral data, overcome the problems of unbalanced categories and small number of samples, effectively solve the problem of information redundancy caused by strong correlation of spectral information and improve the classification precision.
The invention provides a remote sensing image classification method based on Gabor characteristics and EMAP characteristics, which comprises the following steps:
step S1: extracting Gabor characteristics from the remote sensing image and constructing a Gabor characteristic kernel, and simultaneously extracting EMAP characteristics from the remote sensing image and constructing an EMAP characteristic kernel;
step S2: fusing the Gabor characteristic core and the EMAP characteristic core constructed in the step S1 by using the characteristic composite core framework to obtain a characteristic composite core;
step S3: and classifying the samples in the remote sensing image through a multinomial logistic regression model based on the characteristic composite kernel obtained in the step S2.
Further, step S1 further includes: processing the remote sensing image by a principal component analysis method to obtain the first three principal components of the remote sensing image; gabor features are then extracted using Gabor filters on the first three principal components obtained, and EMAP features are extracted using attribute filters on the first three principal components obtained.
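The principal-component preprocessing described above can be sketched as follows. This is a minimal NumPy illustration; the function name, toy data, and centering/scaling choices are our own, not the patent's:

```python
import numpy as np

def first_three_pcs(cube):
    """Project a hyperspectral cube (M, N, D) onto its first three
    principal components, as in the patent's preprocessing step."""
    M, N, D = cube.shape
    X = cube.reshape(-1, D).astype(np.float64)
    X -= X.mean(axis=0)                      # center each spectral band
    # SVD of the centered data matrix gives the principal axes in Vt
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = X @ Vt[:3].T                       # scores on the first 3 PCs
    return pcs.reshape(M, N, 3)

cube = np.random.default_rng(0).random((8, 8, 30))   # toy 8x8 image, 30 bands
pcs = first_three_pcs(cube)
print(pcs.shape)                             # (8, 8, 3)
```

The first three components carry the largest share of the variance, which is why the patent treats them as retaining most of the effective information.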
Further, the step of extracting Gabor features using Gabor filters on the first three principal components includes:
step S101: convolving the first three principal components of the remote sensing image with a series of Gabor filters of different frequencies and multiple directions to extract Gabor features, wherein the Gabor filter is expressed as follows:

G(x, y) = (f_u² / (π·δ·γ)) · exp(−f_u²·(x′²/δ² + y′²/γ²)) · exp(i·2π·f_u·x′)

x′ = x·cos θ + y·sin θ

y′ = −x·sin θ + y·cos θ

wherein G(x, y) represents the Gabor filter; x represents the coordinate along the horizontal axis of the image; y represents the coordinate along the vertical axis of the image; i represents the imaginary unit; f_u represents the frequency of the sinusoidal function; θ represents the direction of the Gaussian kernel function; δ and γ are constraints that maintain the ratio between width and wavelength. The scales and directions of the filter bank are given by:

f_u = f_max / (√2)^u, u = 0, 1, …, U − 1

θ_v = v·π / V, v = 0, 1, …, V − 1

wherein w represents the size of the Gabor filter; f_max is the highest peak frequency of the Gabor function; U is the number of scales of the Gabor filter; V is the number of directions of the Gabor filter.

The convolution of component I_3(x, y) of the first three principal components with the Gabor filter G(x, y) is expressed as:

F_Gabor = |I_3(x, y) * G(x, y)|

wherein F_Gabor is a three-dimensional matrix representing the Gabor features, of size M×N×D_1, where D_1 = U×V×3 and M, N represent the length and width of the image respectively.
Step S102: three-dimensional matrix F for obtaining Gabor characteristics from remote sensing imageGabor
Figure BDA0003220910780000038
Wherein the content of the first and second substances,
Figure BDA0003220910780000039
a feature vector corresponding to each pixel in the Gabor feature; n is the number of the eigenvectors, where N is M × N,
step S103: selecting training samples as training set
Figure BDA00032209107800000310
Wherein the content of the first and second substances,
Figure BDA00032209107800000311
representing the feature vector corresponding to each training sample pixel, n' is the number of training samples,
step S104: calculating the characteristic vector corresponding to each training sample pixel
Figure BDA00032209107800000312
A distance d betweenc,dC is 1,2, …, n'; d is 1,2, …, n', and forms a distance matrix DG
Figure BDA00032209107800000313
DG=(dc,d)n×n
Wherein m is a feature vector dimension; x'Gck、x′GdkRespectively representing feature vectors
Figure BDA0003220910780000041
The (c) th dimension of (a),
step S105: computing Gabor feature Kernel KGabor
Figure BDA0003220910780000042
Wherein the content of the first and second substances,
Figure BDA0003220910780000043
represents a distance matrix DGIs measured.
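Steps S102–S105 reduce to pairwise Euclidean distances followed by a Gaussian kernel. A minimal sketch, assuming random stand-ins for the actual Gabor feature vectors and an illustrative kernel width σ:

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.random((20, 120))           # n' = 20 training pixels, m = 120 Gabor dims

# Step S104: pairwise Euclidean distances d_{c,d}, forming D_G (n' x n')
diff = X_train[:, None, :] - X_train[None, :, :]
D_G = np.sqrt((diff ** 2).sum(axis=-1))

# Step S105: Gaussian (RBF) feature kernel K_Gabor
sigma = 1.5                               # kernel width, as in the embodiment
K_Gabor = np.exp(-D_G ** 2 / (2 * sigma ** 2))

print(K_Gabor.shape)                      # (20, 20)
```

The same two steps, applied to the EMAP feature vectors, yield K_EMAP in steps S114–S115.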
Further, the step of extracting the EMAP features from the first three principal components by using an attribute filter comprises the following steps:

step S111: extracting attribute profile features on the first three principal components and extending them:

EAP(f_1, f_2, f_3) = {AP(f_1), AP(f_2), AP(f_3)}

wherein AP(f_j) = {TK_m(f_j), …, TK_1(f_j), f_j, TH_1(f_j), …, TH_m(f_j)}, j = 1, 2, 3

AP(f_j) represents the attribute profile features; TK_i and TH_i, i = 1, 2, …, m, represent the thickening and thinning transforms respectively; f_j represents the features extracted from the original sample information.

The extended attribute profile features of the different attributes generated on the first three principal components, EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3) and EAP_s(f_1, f_2, f_3), are stacked to obtain the three-dimensional matrix F_EMAP of EMAP features, of size M×N×D_2, where D_2 is the dimension of the EMAP features:

F_EMAP = {EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3), EAP_s(f_1, f_2, f_3)}
Step S112: calculating a three-dimensional matrix F of the EMAP characteristics of the remote sensing imageEMAP
Figure BDA0003220910780000051
Wherein the content of the first and second substances,
Figure BDA0003220910780000052
a feature vector corresponding to each pixel in the EMAP features; n is the number of the eigenvectors, where N is M × N,
step S113: selecting training samples as training set
Figure BDA0003220910780000053
Wherein the content of the first and second substances,
Figure BDA0003220910780000054
representing the feature vector corresponding to each training sample pixel, n' is the number of training samples,
step S114: calculating the characteristic vector corresponding to each training sample pixel
Figure BDA0003220910780000055
A distance d betweenc,dC is 1,2, …, n'; d is 1,2, …, n', and forms a distance matrix DE
Step S115: constructing EMAP feature kernel KEMAP
Figure BDA0003220910780000056
Wherein the content of the first and second substances,
Figure BDA0003220910780000057
represents a distance matrix DEIs measured.
Further, computing the feature composite kernel includes: stacking the constructed Gabor feature kernel and EMAP feature kernel by using the feature composite kernel framework to obtain the feature composite kernel:

K_F = [K_Gabor^T, K_EMAP^T]^T

where the superscript T represents the transpose of the matrix.
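The stacking in the composite-kernel formula amounts to concatenating the two kernel matrices row-wise; a sketch with random matrices standing in for the two feature kernels:

```python
import numpy as np

rng = np.random.default_rng(1)
n_train = 20
K_gabor = rng.random((n_train, n_train))   # placeholder Gabor feature kernel
K_emap = rng.random((n_train, n_train))    # placeholder EMAP feature kernel

# Feature composite kernel K_F = [K_Gabor^T, K_EMAP^T]^T: a 2n' x n' matrix
K_F = np.vstack([K_gabor, K_emap])
print(K_F.shape)                           # (40, 20)
```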
Further, the step of classifying the samples in the remote sensing image through a multinomial logistic regression model comprises the following steps:
step S301: calculating the input function of the multinomial logistic regression model on the training sample set, Φ:

Φ = [1, K_F^T]^T

step S302: calculating the logistic regression coefficients ω:

ω̂ = arg max_ω { l(ω) + log p(ω) }

l(ω) = Σ_{i=1}^{n′} log p(y_i | x_i, ω)

p(y_i = k | x_i, ω) = exp(ω^(k)·h(x_i)) / Σ_{j=1}^{K} exp(ω^(j)·h(x_i))

wherein ω̂ represents the estimate of the logistic regression coefficients ω; l(ω) represents the log-likelihood of the training samples; p(ω) represents the prior on ω; y_i denotes the label corresponding to sample x_i, i = 1, 2, …, n′; h(x_i) denotes the input function evaluated at x_i; ω^(k) represents the k-th block of the logistic regression coefficients ω; K represents the number of categories; x_i is the feature vector corresponding to a sample pixel.

The logistic regression coefficients ω are considered as a random vector with a Laplacian density:

p(ω) ∝ exp(−λ·‖ω‖₁)

where λ is a regularization parameter that controls the degree of sparsity of the logistic regression coefficients ω.
The invention has the beneficial effects that:
the Gabor features can be used for expressing the spatial structure features of the images in different scales and directions, effectively extracting texture information and having a good classification effect on the images with rich texture information; the EMAP features comprise scale information, geometric shape information and the like, the spatial profile information of the hyperspectral image is extracted relatively comprehensively, and the Gabor features and the EMAP features are combined with the spectral information and the spatial information to avoid adverse effects of original spectral data.
Drawings
FIG. 1 is a flow chart of a classification method of remote sensing images based on Gabor features and EMAP features in an embodiment of the present invention;
FIG. 2 is a flow chart of Gabor feature extraction according to an embodiment of the present invention;
FIG. 3 is a flow chart of EMAP feature extraction according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating the classification of samples by the multinomial logistic regression model according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings and examples, it being understood that the examples described below are intended to facilitate the understanding of the invention, and are not intended to limit it in any way.
In the classification method provided by the invention, in order to reduce the data dimension and shorten the calculation time, firstly, the hyperspectral images are processed through principal component analysis to obtain the first three principal components of the hyperspectral images. The principal component analysis is used as an unsupervised spectral feature extraction algorithm, spectral features can be extracted from a hyperspectral image, and meanwhile, the feature dimension reduction effect is achieved. And the first three main components can retain most effective information, and the influence of noise is reduced.
The size of the hyperspectral image is defined to be M×N×D, so the size of each principal component is M×N. Gabor features are then extracted on the first three principal components to express texture information, and extended multi-attribute profile features are extracted on the first three principal components using attribute filters to express spatial profile information. To fuse these features, the Gabor features are fused with the extended multi-attribute profile features using a feature composite kernel, which retains the effective information and has strong flexibility. Adopting a multinomial logistic regression model as the classifier copes with the large differences in the numbers of samples of different ground-object classes.
As shown in fig. 1, a remote sensing image classification method based on Gabor features and EMAP features provided in an embodiment of the present invention includes the following steps:
step S1: and extracting Gabor characteristics and EMAP characteristics. Specifically, Gabor features are extracted from the remote sensing images and a Gabor feature kernel is constructed, and EMAP features are extracted from the remote sensing images and an EMAP feature kernel is constructed.
In an optional embodiment, the remote sensing image is processed by a principal component analysis method to obtain the first three principal components of the remote sensing image; gabor features are then extracted using Gabor filters on the first three principal components obtained, and EMAP features are extracted using attribute filters on the first three principal components obtained.
In an alternative embodiment, Gabor features are extracted from the remotely sensed image and a Gabor feature kernel is constructed.
Fig. 2 is a flow chart of extracting Gabor features according to an embodiment of the present invention, and as shown in fig. 2, the specific process is as follows:
step S101: gabor features of the first three principal components are extracted. Specifically, the first three main components of the remote sensing image are convolved with a series of Gabor filters with different frequencies and multiple directions to extract Gabor features:
Figure BDA0003220910780000081
Figure BDA0003220910780000082
Figure BDA0003220910780000083
Figure BDA0003220910780000084
wherein G (x, y) represents a Gabor filter; x represents a coordinate in the direction of the horizontal axis of the image; y tableCoordinates in the vertical axis direction of the pictorial image; i represents an imaginary number; f. ofuRepresenting the frequency of a sinusoidal function; θ represents the direction of the gaussian kernel function; delta and
Figure BDA0003220910780000085
is a constraint to maintain the ratio between width and wavelength;
Figure BDA0003220910780000086
Figure BDA0003220910780000087
w represents the size of the Gabor filter; f. ofmaxIs the highest peak frequency of the Gabor function; u is the number of scales of the Gabor filter; v is the number of directions of the Gabor filter. Optionally, mixing delta with
Figure BDA0003220910780000088
Is arranged as
Figure BDA0003220910780000089
w is set to 55; the highest peak frequency fmaxSet to 0.1; the number of dimensions U of the filter is set to 5; theta is set to be 0-180 DEG respectively
Figure BDA00032209107800000810
The number of directions V of the filter is therefore 8.
Component I of the first three principal components3The convolution process of (x, y) with the Gabor filter G (x, y) is expressed as:
FGabor=|I3(x,y)*G(x,y)|
wherein, FGaborTo represent a three-dimensional matrix of Gabor features, the size is MxNxD1Wherein D is1Each of the images U × V × 3 and M, N represents the length and width of the image.
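The filter bank and convolution above can be sketched as follows, using the embodiment's f_max = 0.1, U = 5 and V = 8. The filter size (11 rather than the embodiment's 55), the envelope parameters δ and γ (√2), and the use of SciPy's fftconvolve are illustrative assumptions rather than the patent's exact settings:

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_filter(w, f, theta, delta=np.sqrt(2), gamma=np.sqrt(2)):
    """One complex Gabor filter of size w x w; delta/gamma shape the
    Gaussian envelope (illustrative values, not the patent's)."""
    half = w // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 / delta ** 2 + yr ** 2 / gamma ** 2) * f ** 2)
    carrier = np.exp(2j * np.pi * f * xr)          # complex sinusoid
    return (f ** 2 / (np.pi * delta * gamma)) * envelope * carrier

# Bank with f_max = 0.1, U = 5 scales, V = 8 orientations -> 40 filters
f_max, U, V, w = 0.1, 5, 8, 11
bank = [gabor_filter(w, f_max / np.sqrt(2) ** u, v * np.pi / V)
        for u in range(U) for v in range(V)]

pc = np.random.default_rng(0).random((32, 32))     # one toy principal component
# Magnitude of the convolution response per filter, stacked along depth
features = np.stack([np.abs(fftconvolve(pc, g, mode='same')) for g in bank],
                    axis=-1)
print(features.shape)                              # (32, 32, 40)
```

Running this over all three principal components and stacking gives the M×N×(U·V·3) matrix F_Gabor.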
Step S102: three-dimensional matrix F of Gabor features obtained from remote sensing imagesGaborThe method specifically comprises the following steps:
Figure BDA0003220910780000091
wherein the content of the first and second substances,
Figure BDA0003220910780000092
a feature vector corresponding to each pixel in the Gabor feature; n is the number of eigenvectors, where N is M × N
Step S103: selecting training samples as training set
Figure BDA0003220910780000093
Wherein the content of the first and second substances,
Figure BDA0003220910780000094
representing the feature vector corresponding to each training sample pixel, n' is the number of training samples,
step S104: calculating the characteristic vector corresponding to each training sample pixel
Figure BDA0003220910780000095
A distance d betweenc,dC is 1,2, …, n'; d is 1,2, …, n', and forms a distance matrix DG
Figure BDA0003220910780000096
DG=(dc,d)n×n
Wherein m is a feature vector dimension; x'Gck、x′GdkRespectively representing feature vectors
Figure BDA0003220910780000097
The (c) th dimension of (a),
step S105: computing Gabor feature Kernel KGabor
Figure BDA0003220910780000098
Wherein the content of the first and second substances,
Figure BDA0003220910780000099
represents a distance matrix DGIs measured.
In an optional embodiment, the EMAP features are extracted from the remote sensing image and an EMAP feature kernel is constructed. The specific process is as follows: the attribute features generated on the different principal components are combined to obtain extended attribute profile features, and the different types of extended attribute profile features are stacked to generate the EMAP features.

In each connected region of each principal component, the attribute A is compared with a reference value λ and a filtering operation is performed. Given a series of thresholds {λ_1, λ_2, …, λ_m}, the attribute profile features are obtained through attribute thinning and thickening operations:

AP(f_j) = {TK_m(f_j), …, TK_1(f_j), f_j, TH_1(f_j), …, TH_m(f_j)}, j = 1, 2, 3

wherein AP(f_j) represents the attribute profile features; TK_i and TH_i, i = 1, 2, …, m, represent the thickening and thinning transforms respectively; f_j represents the features extracted from the original sample information.
In an alternative embodiment, the EMAP features are extracted from the first three principal components using attribute filters.
Fig. 3 is a flowchart of extracting EMAP features according to an embodiment of the present invention. As shown in fig. 3, the specific process is as follows:
step S111: the extended attribute profile features are obtained. Specifically, attribute profile features are extracted from the first three principal components and combined into the extended attribute profile features:

EAP(f_1, f_2, f_3) = {AP(f_1), AP(f_2), AP(f_3)}

The extended attribute profile features of four different attributes are extracted from the first three principal components. In this embodiment, the area attribute, the bounding-box diagonal length attribute, the moment-of-inertia attribute and the standard-deviation attribute are selected. For the area attribute, λ_a is set to 100, 500, 1000 and 5000; for the bounding-box diagonal length attribute, λ_d is set to 10, 25, 50 and 100; for the inertia attribute, λ_i is set to 0.2, 0.3, 0.4 and 0.5; for the standard-deviation attribute, λ_s is set to 20, 30, 40 and 50.

The extended attribute profile features of the different attributes generated on the first three principal components, EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3) and EAP_s(f_1, f_2, f_3), are stacked to obtain the three-dimensional matrix F_EMAP of EMAP features, of size M×N×D_2, where D_2 is the dimension of the EMAP features:

F_EMAP = {EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3), EAP_s(f_1, f_2, f_3)}
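As one concrete instance of the attribute profiles above, the area attribute can be realized with morphological area openings and closings. A sketch using scikit-image, where treating area closings as the thickening branch and area openings as the thinning branch, and the toy image, are our assumptions:

```python
import numpy as np
from skimage.morphology import area_opening, area_closing

def area_attribute_profile(pc, thresholds=(100, 500, 1000, 5000)):
    """Attribute profile of one principal component for the area attribute:
    AP(f) = {thickening_m..1, f, thinning_1..m}, with the embodiment's
    area thresholds lambda_a = 100, 500, 1000, 5000."""
    img = pc.astype(np.float64)
    closings = [area_closing(img, area_threshold=t) for t in reversed(thresholds)]
    openings = [area_opening(img, area_threshold=t) for t in thresholds]
    return np.stack(closings + [img] + openings, axis=-1)

pc = np.random.default_rng(0).random((32, 32))   # toy principal component
ap = area_attribute_profile(pc)
print(ap.shape)                                  # (32, 32, 9)
```

Repeating this for the other three attributes and all three principal components, then stacking along depth, yields F_EMAP.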
Step S112: three-dimensional matrix F of EMAP features obtained from remote sensing imagesEMAPThe method specifically comprises the following steps:
Figure BDA0003220910780000101
wherein the content of the first and second substances,
Figure BDA0003220910780000102
a feature vector corresponding to each pixel in the EMAP features; n is the number of the eigenvectors, where N is M × N,
step S113: selecting training samples as training set
Figure BDA0003220910780000111
Wherein the content of the first and second substances,
Figure BDA0003220910780000112
representing the feature vector corresponding to each training sample pixel, n' is the number of training samples,
step S114: computing each training sample pixel pairCorresponding feature vector
Figure BDA0003220910780000113
A distance d betweenc,dC is 1,2, …, n'; d is 1,2, …, n', and forms a distance matrix DE
Step S115: constructing EMAP feature kernel KEMAP
Figure BDA0003220910780000114
Wherein the content of the first and second substances,
Figure BDA0003220910780000115
represents a distance matrix DEIs measured.
In the present embodiment, σ is set to 1.5, for example.
Step S2: and fusing a Gabor characteristic core and an EMAP characteristic core. Specifically, the constructed Gabor feature core and EMAP feature core are stacked using a feature composite core framework.
In an alternative embodiment, the Gabor feature kernel and the EMAP feature kernel are stacked to obtain the feature composite kernel:

K_F = [K_Gabor^T, K_EMAP^T]^T

where the superscript T represents the transpose of the matrix.
Compared with directly stacking or summing the features themselves, constructing feature kernels and fusing them through the feature composite kernel framework retains the effective information, increases flexibility, and effectively avoids information redundancy and added computational complexity.
Step S3: the samples are classified by a polynomial logistic regression model. Specifically, based on the obtained feature composite kernel, samples in the remote sensing image are classified through a multinomial logistic regression model.
In consideration of the high-dimensional characteristics of the hyperspectral image, the embodiment of the invention processes the high-dimensional features by using a LORSAL algorithm.
In an alternative embodiment, the samples in the remote sensing image are classified by the multinomial logistic regression model.

FIG. 4 is a flowchart illustrating the classification of samples by the multinomial logistic regression model according to an embodiment of the present invention. The specific steps are as follows:

step S301: the input function of the multinomial logistic regression model is calculated on the training sample set, Φ:

Φ = [1, K_F^T]^T

step S302: the logistic regression coefficients ω are calculated:

ω̂ = arg max_ω { l(ω) + log p(ω) }

l(ω) = Σ_{i=1}^{n′} log p(y_i | x_i, ω)

p(y_i = k | x_i, ω) = exp(ω^(k)·h(x_i)) / Σ_{j=1}^{K} exp(ω^(j)·h(x_i))

wherein ω̂ represents the estimate of the logistic regression coefficients ω; l(ω) represents the log-likelihood of the training samples; p(ω) represents the prior on ω; y_i denotes the label corresponding to sample x_i, i = 1, 2, …, n′; h(x_i) denotes the input function evaluated at x_i; ω^(k) represents the k-th block of the logistic regression coefficients ω; K represents the number of categories; x_i is the feature vector corresponding to a sample pixel. Because the densities do not depend on ω^(K), we take ω^(K) = 0.

To control the complexity of the algorithm and its generalization ability, the logistic regression coefficients ω are considered as a random vector with a Laplacian density:

p(ω) ∝ exp(−λ·‖ω‖₁)

where λ is a regularization parameter that controls the degree of sparsity of the logistic regression coefficients ω. According to experiments, λ is set to 0.000001 and the number of iterations is set to 700.
The open structure of the input function of the multinomial logistic regression model makes the model flexible: the input function may have a linear, nonlinear or kernel structure, and the sample weight parameters alleviate, to a certain extent, the problem that the numbers of samples of different ground-object classes differ greatly.
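The l1-regularized multinomial regression of steps S301–S302 can be approximated with off-the-shelf tools. The sketch below substitutes scikit-learn's saga solver for the patent's LORSAL algorithm (an assumption for illustration) and uses random placeholder kernel features in place of Φ:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_train, n_classes = 60, 3
# Phi = [1, K_F^T]^T: a bias row plus composite-kernel columns per pixel;
# random values stand in for the real kernel entries here.
Phi = np.hstack([np.ones((n_train, 1)), rng.random((n_train, 40))])
y = rng.integers(0, n_classes, size=n_train)

# l1 penalty realizes the Laplacian prior p(w) ~ exp(-lambda * ||w||_1);
# sklearn's C is the inverse of the regularization strength lambda.
lam = 1e-6
clf = LogisticRegression(penalty='l1', solver='saga', C=1.0 / lam, max_iter=700)
clf.fit(Phi, y)
pred = clf.predict(Phi)
print(pred.shape)              # (60,)
```

With λ as small as the embodiment's 0.000001 the prior is very weak; larger λ drives more coefficients exactly to zero.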
It will be apparent to those skilled in the art that various modifications and improvements can be made to the embodiments of the present invention without departing from the inventive concept thereof, and these modifications and improvements are intended to be within the scope of the invention.

Claims (6)

1. A remote sensing image classification method based on Gabor characteristics and EMAP characteristics is characterized by comprising the following steps:
step S1: extracting Gabor characteristics from the remote sensing image and constructing a Gabor characteristic kernel, and extracting EMAP characteristics from the remote sensing image and constructing an EMAP characteristic kernel;
step S2: fusing the Gabor characteristic core and the EMAP characteristic core constructed in the step S1 by using a characteristic composite core framework to obtain a characteristic composite core;
step S3: classifying the samples in the remote sensing image through a multinomial logistic regression model based on the feature composite kernel obtained in the step S2.
2. The method according to claim 1, wherein the step S1 further comprises:
extracting the first three principal components of the remote sensing image by a principal component analysis method, extracting Gabor characteristics on the first three principal components by using a Gabor filter, and extracting EMAP characteristics on the first three principal components by using an attribute filter.
3. The method of claim 2, wherein the step of extracting Gabor features using Gabor filters on the first three principal components comprises:
step S101: convolving the first three principal components with a series of Gabor filters of different frequencies and multiple directions to extract Gabor features, the Gabor filter being represented by the following equations:

G(x, y) = (f_u² / (π·δ·γ)) · exp(−f_u²·(x′²/δ² + y′²/γ²)) · exp(i·2π·f_u·x′)

x′ = x·cos θ + y·sin θ

y′ = −x·sin θ + y·cos θ

wherein G(x, y) represents the Gabor filter; x represents the coordinate along the horizontal axis of the image; y represents the coordinate along the vertical axis of the image; i represents the imaginary unit; f_u represents the frequency of the sinusoidal function; θ represents the direction of the Gaussian kernel function; δ and γ are constraints that maintain the ratio between width and wavelength; the scales and directions of the filter bank are given by:

f_u = f_max / (√2)^u, u = 0, 1, …, U − 1

θ_v = v·π / V, v = 0, 1, …, V − 1

wherein w represents the size of the Gabor filter; f_max is the highest peak frequency of the Gabor function; U is the number of scales of the Gabor filter; V is the number of directions of the Gabor filter,

the convolution of component I_3(x, y) of the first three principal components with the Gabor filter is expressed as:

F_Gabor = |I_3(x, y) * G(x, y)|

wherein F_Gabor is a three-dimensional matrix representing the Gabor features, of size M×N×D_1, where D_1 = U×V×3, and M, N denote the length and width of the image respectively;
step S102: obtaining the three-dimensional matrix F_Gabor of Gabor features from the remote sensing image:

F_Gabor = [x_G1, x_G2, …, x_Gn]

wherein x_Gi is the feature vector corresponding to each pixel in the Gabor features; n is the number of Gabor feature vectors, where n = M × N;
step S103: selecting training samples as the training set:

X'_G = [x'_G1, x'_G2, …, x'_Gn']

wherein x'_Gc represents the feature vector corresponding to each training-sample pixel and n' is the number of training samples;
step S104: calculating the distance d_{c,d} between the feature vectors x'_Gc and x'_Gd corresponding to each pair of training-sample pixels, c = 1, 2, …, n'; d = 1, 2, …, n', and forming the distance matrix D_G:

d_{c,d} = √( Σ_{k=1}^{m} (x'_Gck − x'_Gdk)^2 )

D_G = (d_{c,d})_{n'×n'}

wherein m is the feature-vector dimension; x'_Gck and x'_Gdk respectively represent the k-th dimension of the feature vectors x'_Gc and x'_Gd;
step S105: computing Gabor feature Kernel KGabor
Figure FDA0003220910770000029
Wherein the content of the first and second substances,
Figure FDA00032209107700000210
represents a distance matrix DGRepresents the radial extent of the characteristic kernel.
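The Gabor filter bank and radial-basis feature kernel of claim 3 can be sketched as below. This is a hedged illustration, not the patent's implementation: the parameter values (δ = η = √2, f_max = 0.25, 15×15 kernels, U = 2 scales, V = 4 directions) and the median-distance choice of σ are assumptions, and a single band stands in for the three principal components:

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.spatial.distance import cdist

def gabor_kernel(f, theta, delta=np.sqrt(2), eta=np.sqrt(2), size=15):
    """Complex Gabor filter G(x, y) at frequency f and orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)        # x' = x cos(th) + y sin(th)
    yr = -x * np.sin(theta) + y * np.cos(theta)       # y' = -x sin(th) + y cos(th)
    envelope = np.exp(-((f / delta) ** 2 * xr ** 2 + (f / eta) ** 2 * yr ** 2))
    return (f ** 2 / (np.pi * delta * eta)) * envelope * np.exp(1j * 2 * np.pi * f * xr)

def gabor_features(band, U=2, V=4, f_max=0.25):
    """Stack |band * G| over U scales and V orientations -> (M, N, U*V).
    (For three principal components this would be repeated per band.)"""
    feats = []
    for u in range(U):
        f_u = f_max / (np.sqrt(2) ** u)               # f_u = f_max / sqrt(2)^u
        for v in range(V):
            theta = v * np.pi / V                      # theta_v = v*pi / V
            g = gabor_kernel(f_u, theta)
            feats.append(np.abs(fftconvolve(band, g, mode="same")))
    return np.stack(feats, axis=-1)

band = np.random.default_rng(1).normal(size=(16, 16))
F = gabor_features(band)
print(F.shape)  # (16, 16, 8)

# RBF feature kernel on a handful of "training" pixels (sigma = median distance,
# an illustrative heuristic not specified by the claim)
X = F.reshape(-1, F.shape[-1])[:10]
D = cdist(X, X)                                       # distance matrix D_G
sigma = np.median(D[D > 0])
K = np.exp(-D ** 2 / (2 * sigma ** 2))                # K_Gabor
print(K.shape)  # (10, 10)
```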
4. The method of claim 2, wherein the step of extracting EMAP features from the first three principal components using an attribute filter comprises:
step S111: extracting attribute profile features on the first three principal components, and expanding the attribute profile features:

EAP(f_1, f_2, f_3) = {AP(f_1), AP(f_2), AP(f_3)}

AP(f_j) = {T^K_m(f_j), …, T^K_1(f_j), f_j, T^H_1(f_j), …, T^H_m(f_j)}, j = 1, 2, 3

wherein AP(f_j) represents the attribute profile features; T^K_i and T^H_i, i = 1, 2, …, m, represent the thickening and thinning transformations, respectively; f_j represents a feature extracted from the original sample information.

The extended attribute profile features of different attributes generated on the first three principal components, EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3) and EAP_s(f_1, f_2, f_3), are stacked to obtain a three-dimensional matrix F_EMAP of EMAP features of size M × N × D_2, where D_2 is the dimension of the EMAP features:

F_EMAP = {EAP_a(f_1, f_2, f_3), EAP_d(f_1, f_2, f_3), EAP_i(f_1, f_2, f_3), EAP_s(f_1, f_2, f_3)}
step S112: calculating the three-dimensional matrix F_EMAP of the EMAP features of the remote sensing image:

F_EMAP = [x_E1, x_E2, …, x_En]

wherein x_Ei is the feature vector corresponding to each pixel in the EMAP features; n is the number of EMAP feature vectors, where n = M × N;
step S113: selecting training samples as the training set:

X'_E = [x'_E1, x'_E2, …, x'_En']

wherein x'_Ec represents the feature vector corresponding to each training-sample pixel and n' is the number of training samples;
step S114: calculating the distance d_{c,d} between the feature vectors x'_Ec and x'_Ed corresponding to each pair of training-sample pixels, c = 1, 2, …, n'; d = 1, 2, …, n', and forming the distance matrix D_E;
step S115: constructing the EMAP feature kernel K_EMAP:

K_EMAP(c, d) = exp(−d_{c,d}^2 / (2σ^2))

wherein d_{c,d} is an element of the distance matrix D_E and σ represents the radial width of the feature kernel.
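Claim 4's EMAP stacks profiles of four attribute types (subscripts a, d, i, s). As a simplified, hedged sketch covering only the area attribute, scikit-image's `area_opening`/`area_closing` can serve as the thinning/thickening transforms; the thresholds are arbitrary, and a full EMAP would repeat this per attribute and per principal component:

```python
import numpy as np
from skimage.morphology import area_opening, area_closing

def attribute_profile(band, thresholds=(25, 100)):
    """AP(f) = {thickening_m..1, f, thinning_1..m} for the area attribute only.
    Area closings play the role of thickenings, area openings of thinnings."""
    closings = [area_closing(band, t) for t in reversed(thresholds)]
    openings = [area_opening(band, t) for t in thresholds]
    return np.stack(closings + [band] + openings, axis=-1)

band = (np.random.default_rng(2).random((32, 32)) * 255).astype(np.uint8)
ap = attribute_profile(band)
print(ap.shape)  # (32, 32, 5): 2 closings + original + 2 openings
```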
5. The method of claim 1, wherein computing the feature composite kernel comprises:
stacking the Gabor feature kernel and the EMAP feature kernel to obtain the feature composite kernel:

K_F = [K_Gabor^T, K_EMAP^T]^T

wherein K_F is the feature composite kernel and the superscript T denotes the matrix transpose.
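The stacking in claim 5 can be sketched as follows. For symmetric kernel matrices, [K_Gabor^T, K_EMAP^T]^T is simply a vertical concatenation; the toy kernels below are built from random feature matrices purely for illustration:

```python
import numpy as np
from scipy.spatial.distance import cdist

def rbf(X):
    """RBF kernel with the median heuristic for sigma (illustrative choice)."""
    D = cdist(X, X)
    s = np.median(D[D > 0])
    return np.exp(-D ** 2 / (2 * s ** 2))

rng = np.random.default_rng(3)
K_gabor = rbf(rng.normal(size=(6, 10)))   # 6 samples, 10-dim Gabor features
K_emap = rbf(rng.normal(size=(6, 12)))    # 6 samples, 12-dim EMAP features

# K_F = [K_Gabor^T, K_EMAP^T]^T: stack the kernel rows vertically
K_F = np.vstack([K_gabor, K_emap])
print(K_F.shape)  # (12, 6)
```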
6. The method of claim 1, wherein the step of classifying the samples in the remotely sensed image by a multiple logistic regression model comprises:
step S301: calculating the input function of the multinomial logistic regression model over the training sample set as Φ:

Φ = [1, K_F^T]^T
step S302: calculating the logistic regression coefficient ω:

ω̂ = arg max_ω [ l(ω) + log p(ω) ]

l(ω) = Σ_{i=1}^{n'} log p(y_i | x_i, ω)

p(y_i = k | x_i, ω) = exp(ω^(k)T·φ(x_i)) / Σ_{j=1}^{K} exp(ω^(j)T·φ(x_i))

wherein ω̂ is the estimator of the logistic regression coefficient ω; l(ω) is the log-likelihood; p(ω) represents the prior on ω; y_i denotes the label corresponding to sample x_i; φ(x_i) is the input function (the column of Φ) of the feature vector x_i corresponding to a sample pixel; ω^(k) is the k-th component of the logistic regression coefficient ω, i.e. the coefficient vector of class k; K represents the number of categories; i = 1, 2, …, n'.

The logistic regression coefficient ω is regarded as a random vector with a Laplacian density:

p(ω) ∝ exp(−λ·‖ω‖_1)

wherein λ is a regularization parameter that controls the degree of sparsity of the logistic regression coefficient ω.
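The sparse multinomial logistic regression of claim 6, with its Laplacian prior on ω, corresponds to l1-penalized logistic regression. The sketch below uses scikit-learn as a stand-in (its `C` plays the role of 1/λ, and `class_weight="balanced"` echoes the sample-weight remark in the description); the data are random placeholders rather than composite-kernel columns:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# pretend the per-sample input functions phi(x_i) are 12-dim vectors
Phi = rng.normal(size=(60, 12))
y = rng.integers(0, 3, size=60)   # 3 ground-object classes

# l1 penalty ~ Laplacian prior p(w) ∝ exp(-λ||w||_1), with C = 1/λ;
# class_weight="balanced" mitigates unequal class sample counts
clf = LogisticRegression(penalty="l1", solver="saga", C=1.0,
                         class_weight="balanced", max_iter=5000)
clf.fit(Phi, y)
pred = clf.predict(Phi)
print(pred.shape)  # (60,)
```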
CN202110957831.8A 2021-08-20 2021-08-20 Remote sensing image classification method based on Gabor features and EMAP features Active CN113792761B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110957831.8A CN113792761B (en) 2021-08-20 2021-08-20 Remote sensing image classification method based on Gabor features and EMAP features


Publications (2)

Publication Number Publication Date
CN113792761A true CN113792761A (en) 2021-12-14
CN113792761B CN113792761B (en) 2024-04-05

Family

ID=79181994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110957831.8A Active CN113792761B (en) 2021-08-20 2021-08-20 Remote sensing image classification method based on Gabor features and EMAP features

Country Status (1)

Country Link
CN (1) CN113792761B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107633264A (en) * 2017-09-02 2018-01-26 南京理工大学 Linear consensus ensemble-fusion classification method based on spatial-spectral multi-feature extreme learning
CN112115795A (en) * 2020-08-21 2020-12-22 河海大学 Hyperspectral image classification method based on Triple GAN


Non-Patent Citations (3)

Title
刘冰; 左溪冰; 谭熊; 余岸竹; 郭文月: "A deep few-shot learning method for hyperspectral image classification", 测绘学报 (Acta Geodaetica et Cartographica Sinica), no. 10, 15 October 2020 (2020-10-15) *
詹曙; 张启祥; 蒋建国; SHIGERU ANDO: "3D face recognition based on collaborative representation with Gabor feature kernels", 光子学报 (Acta Photonica Sinica), no. 12, 15 December 2013 (2013-12-15) *
赛里克・都曼; 阿斯娅・曼力克; 崔恒心; 张江玲; 靳瑰丽: "Research on automatic extraction of grassland information from satellite remote sensing images", 草食家畜, no. 03, 25 September 2008 (2008-09-25) *


Similar Documents

Publication Publication Date Title
Merkurjev et al. An MBO scheme on graphs for classification and image processing
Wang et al. Decoupling noise and features via weighted ℓ1-analysis compressed sensing
Montazer et al. An improved radial basis function neural network for object image retrieval
Chen et al. An improved method for semantic image inpainting with GANs: Progressive inpainting
CN111126256B (en) Hyperspectral image classification method based on self-adaptive space-spectrum multi-scale network
Barbu Training an active random field for real-time image denoising
Lozes et al. Partial difference operators on weighted graphs for image processing on surfaces and point clouds
US11182644B2 (en) Method and apparatus for pose planar constraining on the basis of planar feature extraction
CN108182449A (en) A kind of hyperspectral image classification method
Strange et al. Open problems in spectral dimensionality reduction
Scharr et al. Image statistics and anisotropic diffusion
Qu et al. Hyperspectral and panchromatic image fusion via adaptive tensor and multi-scale retinex algorithm
CN113095333A (en) Unsupervised feature point detection method and unsupervised feature point detection device
Wei et al. Mesh defiltering via cascaded geometry recovery
Ahmed et al. PIQI: perceptual image quality index based on ensemble of Gaussian process regression
Dinesh et al. Point cloud sampling via graph balancing and Gershgorin disc alignment
Feng et al. Research on infrared and visible image fusion based on tetrolet transform and convolution sparse representation
US9159123B2 (en) Image prior as a shared basis mixture model
Ghorai et al. An image inpainting method using pLSA-based search space estimation
Singh et al. Texture and structure incorporated scatternet hybrid deep learning network (ts-shdl) for brain matter segmentation
Rai et al. Low-light robust face image super-resolution via neuro-fuzzy inferencing based locality constrained representation
Jiang et al. Huber-L1-based non-isometric surface registration
Wang et al. Feature-aware trilateral filter with energy minimization for 3D mesh denoising
CN113792761A (en) Remote sensing image classification method based on Gabor characteristics and EMAP characteristics
Zhu et al. Non-local neighbor embedding for image super-resolution through FoE features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant