CN111160392A - Hyperspectral classification method based on wavelet width learning system - Google Patents

Hyperspectral classification method based on wavelet width learning system

Info

Publication number
CN111160392A
CN111160392A (application CN201911223284.XA)
Authority
CN
China
Prior art keywords
hyperspectral
wavelet
learning system
feature
nodes
Prior art date
Legal status: Pending (assumed; not a legal conclusion)
Application number
CN201911223284.XA
Other languages
Chinese (zh)
Inventor
刘治
黄会芳
林佳泰
章云
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority to CN201911223284.XA
Publication of CN111160392A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/217 - Validation; Performance evaluation; Active pattern learning techniques
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/40 - Extraction of image or video features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The method applies a width learning system (also known as a broad learning system) to training and classification for hyperspectral images. Compared with methods based on traditional deep learning networks, the width learning system can dynamically expand its nodes to improve the recognition rate without rebuilding and retraining the entire model, making it fast and efficient. A hyperspectral image is collected by an imaging spectrometer, the collected image is denoised, feature values in the image are selected and extracted, and the image is classified by wavelet width learning. In addition, kernel principal component analysis is adopted for feature extraction, and a wavelet basis function is used as the excitation function of the feature layer, which improves the nonlinear fitting capability of the network.

Description

Hyperspectral classification method based on wavelet width learning system
Technical Field
The invention relates to the technical fields of machine learning, signal classification and image processing, and in particular to a hyperspectral classification method based on a wavelet width learning system.
Background
Hyperspectral technology combines imaging technology with spectral technology: while imaging the spatial characteristics of a target, dispersion splits the light at each spatial pixel into many narrow, continuous spectral bands, so that the resulting remote sensing data can be intuitively described as an image cube. Hyperspectral classification is therefore of real research significance.
At present, hyperspectral classification methods mainly include classification based on deep convolutional neural networks and classification based on capsule networks. Deep convolutional neural networks use local perception and parameter sharing, but training and fine-tuning the network structure is time-consuming, the large number of parameters makes the model complex, and a large number of training samples is required. Capsule-network-based hyperspectral classification is computationally very expensive; if two detection targets are close to each other, two objects of the same type cannot both be detected, and its performance on large images is inferior to that of convolutional neural networks. Most existing hyperspectral classification methods are based on deep learning.
In recent years, a great deal of research on hyperspectral classification has been carried out at home and abroad, covering hyperspectral image acquisition, preprocessing, feature extraction and computation, and feature-model classification. The classification models in common use are based on classical machine learning algorithms such as BP neural networks, convolutional neural networks and capsule networks. However, these classical network models consume a great deal of time during training.
Disclosure of Invention
The application aims to provide a hyperspectral classification method based on a wavelet width learning system, which is used for improving the classification speed and efficiency on the premise of ensuring the recognition rate.
In order to realize the task, the following technical scheme is adopted in the application:
a hyperspectral classification method based on a wavelet width learning system comprises the following steps:
acquiring a plurality of hyperspectral images to be classified through an imaging spectrometer, and classifying the types of the hyperspectral images; preprocessing the hyperspectral images to be classified to obtain preprocessed hyperspectral images;
and (3) carrying out feature selection and extraction on the preprocessed hyperspectral image, wherein the feature selection and extraction comprise the following steps: selecting color features, texture features and shape features of a hyperspectral image, taking the color features, the texture features and the shape features of the hyperspectral image as input of a kernel principal component analysis method, mapping data in a low-dimensional input space to a high-dimensional feature space, performing space conversion by using a kernel function, calculating feature values and feature vectors by using principal component analysis in the high-dimensional feature space, then sequencing all components, and selecting main feature vectors as input vectors of a wavelet width learning system;
the method comprises the following steps of taking the number of categories of hyperspectral images as the number of output nodes of a wavelet width learning system, generating characteristic nodes of the wavelet width learning system by using characteristic vectors screened by a principal component analysis method, taking the number of the characteristic nodes as the number of input nodes of the wavelet width learning system, and carrying out the following classification processes:
using the features mapped by the input feature vector as feature nodes of the network:
Z_i = ψ_i((x·w_i - a_i) / b_i), i = 1, 2, ..., n
where x represents an input vector, n is the total number of wavelet basis functions, ψ_i(·) is the ith wavelet basis function, and w_i, a_i and b_i are the mapping weight, translation parameter and scaling parameter respectively; these parameters are randomly generated at initialization and updated using the k-means algorithm;
each time the parameters in the wavelet basis function are randomly generated and updated, a group of characteristic node sets Z are obtainedn
using the feature node set Z^n to map incremental nodes, the mapped feature nodes are enhanced into enhancement nodes with randomly generated weights:
H_m = ξ(Z^n W_h + β_h)
where W_h and β_h are the weight and threshold parameter of the incremental nodes respectively, ξ(·) is an excitation function, and m represents the number of enhancement nodes;
training the wavelet width learning system with a training set, and during training obtaining the weight parameters of the output nodes by pseudo-inverse and ridge regression:
W_all = [Z^n | H_m]^+ Y
where Y is the reference output of the training set and [Z^n | H_m]^+ is the pseudo-inverse of [Z^n | H_m];
for a feature vector input into the wavelet width learning system, the output of the system is
Ŷ = [Z^n | H_m] W_all
and specific things in the hyperspectral image are identified according to the output of the system, thereby classifying the hyperspectral image.
Further, the pre-processing comprises:
according to the strong inter-spectrum correlation of hyperspectral images and their integration of image and spectral information, wavelet denoising is performed on each hyperspectral image to be classified using the following soft-threshold formula:
Ŵ = sgn(W)(|W| - λ), |W| ≥ λ; Ŵ = 0, |W| < λ
where λ is the filter threshold, W represents the wavelet coefficient before processing, sgn(·) represents the sign function, and h represents the filter factor.
Further, the color characteristics of the hyperspectral image comprise:
features representing the color mean:
μ_i = (1/N) Σ_{j=1}^{N} P_ji
features representing the color variance:
σ_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)²)^(1/2)
features representing the color asymmetry:
s_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)³)^(1/3)
where P_ji is the ith color component value of the jth pixel of the image, and N represents the number of pixels contained in the image.
A terminal device comprises a memory, a processor and a computer program which is stored in the memory and can run on the processor, wherein the processor realizes the steps of the hyperspectral classification method based on the wavelet width learning system when executing the computer program.
A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the aforementioned hyperspectral classification method based on a wavelet width learning system.
The application has the following technical characteristics:
1. In the method provided by this application, width learning does not use a deep structure; it is built on a single-hidden-layer neural network. It differs from a deep neural network in how accuracy is improved: when precision is insufficient, a deep neural network must add layers or adjust large numbers of parameters, whereas a width learning system expands laterally and does not need to retrain the complete network. The method can therefore complete model training and parameter determination faster than methods based on traditional deep learning networks.
2. Compared with a traditional deep learning network-based method, the method provided by the application can dynamically expand the nodes to improve the recognition rate of the system without completely rebuilding and training the model.
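The node expansion without full retraining rests on updating the pseudo-inverse incrementally when new node columns are appended, as is done in broad learning systems. A sketch of the block Greville update (`A_new` stands for the outputs of hypothetical newly added enhancement nodes; the sizes are illustrative):

```python
import numpy as np

def add_nodes_update_pinv(A, A_pinv, A_new):
    """Incrementally extend the pseudo-inverse when new (enhancement) node
    outputs A_new are appended as columns to the node matrix A.
    Returns the pseudo-inverse of [A | A_new] without recomputing from scratch."""
    D = A_pinv @ A_new
    C = A_new - A @ D                       # component of A_new outside range(A)
    B = np.linalg.pinv(C)                   # assumes C has full column rank here
    return np.vstack([A_pinv - D @ B, B])

rng = np.random.default_rng(5)
A = rng.standard_normal((30, 8))            # existing feature + enhancement nodes
Y = rng.standard_normal((30, 3))            # reference outputs
A_pinv = np.linalg.pinv(A)
A_new = rng.standard_normal((30, 4))        # outputs of newly added nodes
P = add_nodes_update_pinv(A, A_pinv, A_new)
W_all = P @ Y                               # new output weights, no full retrain
```

Because only the correction terms `D`, `C` and `B` are computed, the system can grow nodes dynamically, which is the "lateral expansion" advantage described above.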
Drawings
FIG. 1 is a schematic flow diagram of the process;
fig. 2 is a schematic structural diagram of a wavelet width learning system.
Detailed Description
The method applies a width learning system to training and classification for hyperspectral images, exploiting the advantage that a width learning system is faster and more efficient than a deep learning network. A hyperspectral image is collected with an imaging spectrometer, the collected image is denoised, feature values in the image are selected and extracted, and the image is classified by wavelet width learning. In addition, kernel principal component analysis is adopted for feature extraction, and a wavelet basis function is used as the excitation function of the feature layer, improving the nonlinear fitting capability of the network.
As shown in fig. 1, a hyperspectral classification method based on a wavelet width learning system of the present application includes the following steps:
step 1, obtaining a plurality of hyperspectral images to be classified through an imaging spectrometer, and classifying the types of the hyperspectral images.
In this step, the hyperspectral image types are determined according to the corresponding images; in this embodiment, the number of types is recorded as n. The hyperspectral images to be classified are taken as the input.
And 2, preprocessing each hyperspectral image to be classified.
Due to environmental influences, a large amount of noise is introduced when the imaging spectrometer collects a hyperspectral image, which adversely affects image analysis. In this embodiment, according to the strong inter-spectrum correlation of hyperspectral images and their integration of image and spectral information, wavelet denoising is performed on each hyperspectral image to be classified using the following soft-threshold formula:
Ŵ = sgn(W)(|W| - λ), |W| ≥ λ; Ŵ = 0, |W| < λ
where λ is the filter threshold, W represents the wavelet coefficient before processing, sgn(·) represents the sign function, and h represents the filter factor. Wavelet transformation of the hyperspectral image yields a sparse matrix: most coefficients are very small (or zero), only a few coefficients are large, and the signal energy is concentrated in those large coefficients. Noise, after wavelet transformation, is distributed discretely over the whole wavelet domain, and the noise wavelet coefficients are all small. Extensive experiments show that this denoising method is very effective for hyperspectral images.
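The wavelet soft-threshold denoising described above can be sketched as follows. The one-level Haar transform and the choice to threshold only the detail coefficients are illustrative assumptions, since the embodiment does not fix the wavelet family or decomposition depth:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass (detail)
    return a, d

def haar_idwt(a, d):
    """Inverse one-level Haar DWT (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft_threshold(w, lam):
    """W_hat = sgn(W) * max(|W| - lam, 0): shrink small (noise) coefficients to zero."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def wavelet_denoise(signal, lam):
    a, d = haar_dwt(signal)
    return haar_idwt(a, soft_threshold(d, lam))

# A noisy spectral curve: thresholding the detail coefficients removes jitter
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, np.pi, 64))
noisy = clean + 0.05 * rng.standard_normal(64)
denoised = wavelet_denoise(noisy, lam=0.1)
```

With λ = 0 the transform round-trips exactly, which matches the observation that signal energy survives in the large coefficients while the threshold removes the small, noise-dominated ones.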
And 3, selecting and extracting the characteristics of the preprocessed hyperspectral image.
Because a hyperspectral image integrates data information with its spectrogram, feature extraction from the image can capture the spatial distribution characteristics of a sample, and these features can also be used to reflect the sample information. The image features selected are typically color, texture and shape features.
Wherein the color features reflect the overall features of the image, and are expressed as:
features representing the color mean:
μ_i = (1/N) Σ_{j=1}^{N} P_ji
features representing the color variance:
σ_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)²)^(1/2)
features representing the color asymmetry:
s_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)³)^(1/3)
If the image color distribution is completely symmetric, the asymmetry value is zero.
Here P_ji is the ith color component value of the jth pixel of the image, and N represents the number of pixels contained in the image.
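The three color-moment features above can be computed per channel as follows (a sketch; the channel layout of the image array is an assumption, not the patent's exact specification):

```python
import numpy as np

def color_moments(image):
    """First three color moments per channel: mean, standard deviation
    (root of the variance feature) and cube root of the third moment (asymmetry).
    image: (H, W, C) array; returns a (3*C,) feature vector."""
    P = image.reshape(-1, image.shape[-1]).astype(float)  # N pixels x C channels
    mu = P.mean(axis=0)                                   # color mean
    sigma = np.sqrt(((P - mu) ** 2).mean(axis=0))         # color variance feature
    third = ((P - mu) ** 3).mean(axis=0)
    skew = np.cbrt(third)                                 # zero for a symmetric distribution
    return np.concatenate([mu, sigma, skew])

img = np.zeros((4, 4, 3))
img[..., 0] = 0.5      # constant red channel: mean 0.5, zero variance and asymmetry
feat = color_moments(img)
```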
The texture features of an image are the regular distribution of gray values produced by the repeated arrangement of ground objects in the image, and they are distinct from features such as gray level and color. Texture features reflect the inherent attributes of an image and can represent important image information.
The shape features describe local features of an image, namely the geometric properties of the image in a local area.
In this embodiment, kernel principal component analysis (KPCA) is used for feature extraction from the hyperspectral data. The color, texture and shape features are used as the input of the kernel principal component analysis. Kernel PCA is an effective method for handling nonlinear problems: the data in a low-dimensional input space X (composed of the color, texture and shape features) are mapped to a high-dimensional feature space F through a nonlinear mapping h, the space conversion is performed through a kernel function, eigenvalues and eigenvectors are computed in F by principal component analysis, and all components are then sorted so that the principal eigenvectors can be selected.
The basic idea of kernel principal components is to apply the kernel method to principal component analysis. The mapping from the input space X to the feature space F is realized through a transformation h, and a kernel function K(x_i, x_j) = <h(x_i), h(x_j)> is defined, so that the inner product of two vectors in the feature space can be represented by the kernel function of the two vectors in the input space. The sample points x_1, x_2, ..., x_N of the input space are thereby transformed to sample points h(x_1), h(x_2), ..., h(x_N) of the feature space, where h(·) denotes a sample of the high-dimensional feature space resulting from the mapping; principal component analysis is then applied in the feature space.
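A minimal kernel PCA under these definitions might look like this; the RBF kernel and the eigenvector scaling are assumed choices, since the embodiment does not name a specific kernel function:

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Kernel PCA with an RBF kernel: project X onto the leading principal
    directions of the (implicit) high-dimensional feature space F."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))  # kernel matrix
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one      # center the data in feature space
    vals, vecs = np.linalg.eigh(Kc)                 # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]     # sort components, keep the leading ones
    vals, vecs = vals[idx], vecs[:, idx]
    vecs = vecs / np.sqrt(np.maximum(vals, 1e-12))  # normalise projection directions
    return Kc @ vecs                                # projected samples, (n, n_components)

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 6))    # 20 samples of concatenated color/texture/shape features
Z = kernel_pca(X, n_components=3)
```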
The feature vectors of the selected hyperspectral image are concatenated; that is, as shown in fig. 2, the different groups of feature nodes generated from the system input are connected, and the feature vectors corresponding to these features are then used as the input vectors of the wavelet width learning classification system.
Step 4, designing the input and output nodes of the wavelet width learning system according to the hyperspectral image types defined in step 1 and the number of features obtained in step 3: the number of hyperspectral image categories is used as the number of output nodes of the wavelet width learning system, the feature vectors screened by the kernel principal component analysis are used to generate the feature nodes of the system, and the number of feature nodes is used as the number of input nodes.
The feature vectors corresponding to the features are input into the wavelet width learning system as an alternative to a deep learning network. The specific processing is as follows: first, the features mapped from the input feature vector serve as the feature nodes of the network; second, the mapped feature nodes are enhanced into enhancement nodes with randomly generated weights; finally, all mapping feature nodes and enhancement nodes are connected directly to the output nodes, the corresponding weight parameters are solved by pseudo-inverse (as in fig. 2), and classification is performed.
The specific processing steps of the wavelet width learning system are as follows:
step 5.1, using the features mapped by the input feature vector as feature nodes of the network:
Figure BDA0002301445560000061
wherein x represents an input vector, wherein x refers to all input vectors of the wavelet width learning system, and different groups of feature nodes can be mapped through randomly generated weights, transfer parameters and scaling parameters based on a wavelet function; n is the total number of wavelet basis functions, n is selected according to the complexity of the hyperspectral image, and the number m of the underlying enhancement nodes.
Wherein wi,aiAnd biRespectively, a mapping weight value, a transfer parameter and a scaling parameter, which are randomly generated in initialization and updated by using a k-mean algorithm.
Figure BDA0002301445560000071
Is the ith wavelet basis function, wherein n is the total number of wavelet basis functions.
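The wavelet feature-node mapping of step 5.1 can be sketched as below. The Mexican-hat basis and the exact placement of the translation and scaling parameters are assumptions (the patent's formula image is not readable in this copy), and the k-means refinement of the parameters is omitted:

```python
import numpy as np

def mexican_hat(t):
    """An assumed wavelet basis function psi(t) = (1 - t^2) * exp(-t^2 / 2)."""
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

def wavelet_feature_nodes(X, n_nodes, rng):
    """Map input vectors to n_nodes feature nodes: Z_i = psi((X @ w_i - a_i) / b_i),
    with randomly initialised weight w_i, translation a_i and scale b_i."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_nodes))      # mapping weights w_i
    a = rng.standard_normal(n_nodes)           # translation parameters a_i
    b = rng.uniform(0.5, 2.0, n_nodes)         # scaling parameters b_i (kept positive)
    return mexican_hat((X @ W - a) / b)

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 5))               # 10 input feature vectors
Z = wavelet_feature_nodes(X, n_nodes=8, rng=rng)
```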
Step 5.2, each time the parameters in the wavelet basis functions are randomly generated and updated, a group of characteristic node sets can be obtained:
Z^n = [Z_1, Z_2, ..., Z_n]
step 5.3, utilizing the feature node set ZnMapping incremental nodes, wherein the mapped characteristic nodes are enhanced into enhanced nodes which randomly generate weights:
Hm=ξ(ZnWhh)
wherein WhAnd βhThe weight value and the threshold parameter of the incremental node are respectively, the parameters are randomly generated during initialization and do not need to be updated, ξ (-) is a stimulus function, and the sigmoid function is used in the embodiment.
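The enhancement-node mapping H_m = ξ(Z^n W_h + β_h) of step 5.3, with the sigmoid as ξ(·), can be sketched as follows (node counts are illustrative):

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def enhancement_nodes(Z, m, rng):
    """H_m = xi(Z @ W_h + beta_h): expand the feature nodes Z with m
    enhancement nodes whose weights are fixed after random initialisation."""
    W_h = rng.standard_normal((Z.shape[1], m))   # random enhancement weights
    beta_h = rng.standard_normal(m)              # random threshold parameters
    return sigmoid(Z @ W_h + beta_h)

rng = np.random.default_rng(3)
Z = rng.standard_normal((10, 8))                 # feature-node outputs
H = enhancement_nodes(Z, m=12, rng=rng)
```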
Step 5.4, training the wavelet width learning system with a training set; during training, the output weight parameters are obtained by pseudo-inverse and ridge regression:
W_all = [Z^n | H_m]^+ Y
where Y is the reference output of the training set and [Z^n | H_m]^+ is the pseudo-inverse of [Z^n | H_m]. During testing and in practice, the output nodes can be mapped directly:
Ŷ = [Z^n | H_m] W_all
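The output-weight computation of step 5.4 can be sketched with the ridge-regularised normal equations, which approximate the pseudo-inverse solution W_all = [Z^n | H_m]^+ Y (the regularisation strength `reg` is an assumed hyperparameter):

```python
import numpy as np

def train_output_weights(Z, H, Y, reg=1e-3):
    """W_all = ([Z|H]^T [Z|H] + reg*I)^{-1} [Z|H]^T Y: ridge-regularised
    pseudo-inverse solution for the output weights."""
    A = np.hstack([Z, H])
    return np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)

def predict(Z, H, W_all):
    """Y_hat = [Z|H] @ W_all: direct mapping from the nodes to the outputs."""
    return np.hstack([Z, H]) @ W_all

rng = np.random.default_rng(4)
Z, H = rng.standard_normal((30, 8)), rng.standard_normal((30, 12))
Y = np.eye(3)[rng.integers(0, 3, 30)]            # one-hot class labels
W_all = train_output_weights(Z, H, Y)
Y_hat = predict(Z, H, W_all)
```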
and 5.5, storing the trained wavelet width learning system, preprocessing the hyperspectral images to be classified in the process of classifying the hyperspectral images, selecting and extracting features, inputting the extracted feature vectors into the trained wavelet width learning system, and outputting the output obtained by the system
Figure BDA0002301445560000073
And identifying specific things in the hyperspectral images according to the output of the system so as to achieve the purpose of classifying the hyperspectral images.
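Putting steps 5.1 through 5.5 together, a minimal end-to-end sketch of the wavelet width learning classifier might look like this; the wavelet basis, node counts and synthetic data are illustrative assumptions, and the k-means parameter update is omitted:

```python
import numpy as np

def mexican_hat(t):
    # Assumed wavelet basis function; the patent's own formula image is unreadable.
    return (1.0 - t**2) * np.exp(-t**2 / 2.0)

class WaveletBLS:
    """Minimal sketch of the wavelet width (broad) learning classifier:
    wavelet feature nodes -> sigmoid enhancement nodes -> ridge output layer."""

    def __init__(self, n_feature, n_enhance, reg=1e-3, seed=0):
        self.n, self.m, self.reg = n_feature, n_enhance, reg
        self.rng = np.random.default_rng(seed)

    def _nodes(self, X):
        Z = mexican_hat((X @ self.W - self.a) / self.b)       # feature nodes Z^n
        H = 1.0 / (1.0 + np.exp(-(Z @ self.Wh + self.bh)))    # enhancement nodes H_m
        return np.hstack([Z, H])

    def fit(self, X, Y):
        d = X.shape[1]
        self.W = self.rng.standard_normal((d, self.n))   # mapping weights w_i
        self.a = self.rng.standard_normal(self.n)        # translation parameters a_i
        self.b = self.rng.uniform(0.5, 2.0, self.n)      # scaling parameters b_i
        self.Wh = self.rng.standard_normal((self.n, self.m))
        self.bh = self.rng.standard_normal(self.m)
        A = self._nodes(X)
        # W_all = [Z|H]^+ Y via the ridge-regularised normal equations
        self.W_all = np.linalg.solve(A.T @ A + self.reg * np.eye(A.shape[1]), A.T @ Y)
        return self

    def predict(self, X):
        return np.argmax(self._nodes(X) @ self.W_all, axis=1)

# Two well-separated synthetic "pixel feature" clusters as stand-in data
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-0.5, 0.2, (40, 5)), rng.normal(0.5, 0.2, (40, 5))])
y = np.array([0] * 40 + [1] * 40)
model = WaveletBLS(n_feature=10, n_enhance=20).fit(X, np.eye(2)[y])
acc = (model.predict(X) == y).mean()
```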

Claims (5)

1. A hyperspectral classification method based on a wavelet width learning system is characterized by comprising the following steps:
acquiring a plurality of hyperspectral images to be classified through an imaging spectrometer, and classifying the types of the hyperspectral images; preprocessing the hyperspectral images to be classified to obtain preprocessed hyperspectral images;
and (3) carrying out feature selection and extraction on the preprocessed hyperspectral image, wherein the feature selection and extraction comprise the following steps: selecting color features, texture features and shape features of a hyperspectral image, taking the color features, the texture features and the shape features of the hyperspectral image as input of a kernel principal component analysis method, mapping data in a low-dimensional input space to a high-dimensional feature space, performing space conversion by using a kernel function, calculating feature values and feature vectors by using principal component analysis in the high-dimensional feature space, then sequencing all components, and selecting main feature vectors as input vectors of a wavelet width learning system;
the method comprises the following steps of taking the number of categories of hyperspectral images as the number of output nodes of a wavelet width learning system, generating characteristic nodes of the wavelet width learning system by using characteristic vectors screened by a principal component analysis method, taking the number of the characteristic nodes as the number of input nodes of the wavelet width learning system, and carrying out the following classification processes:
using the features mapped by the input feature vector as feature nodes of the network:
Z_i = ψ_i((x·w_i - a_i) / b_i), i = 1, 2, ..., n
where x represents an input vector, n is the total number of wavelet basis functions, ψ_i(·) is the ith wavelet basis function, and w_i, a_i and b_i are the mapping weight, translation parameter and scaling parameter respectively; these parameters are randomly generated at initialization and updated using the k-means algorithm;
each time the parameters in the wavelet basis function are randomly generated and updated, a group of characteristic node sets Z are obtainedn
using the feature node set Z^n to map incremental nodes, the mapped feature nodes are enhanced into enhancement nodes with randomly generated weights:
H_m = ξ(Z^n W_h + β_h)
where W_h and β_h are the weight and threshold parameter of the incremental nodes respectively, ξ(·) is an excitation function, and m represents the number of enhancement nodes;
training the wavelet width learning system with a training set, and during training obtaining the weight parameters of the output nodes by pseudo-inverse and ridge regression:
W_all = [Z^n | H_m]^+ Y
where Y is the reference output of the training set and [Z^n | H_m]^+ is the pseudo-inverse of [Z^n | H_m];
for a feature vector input into the wavelet width learning system, the output of the system is
Ŷ = [Z^n | H_m] W_all
and specific things in the hyperspectral image are identified according to the output of the system, thereby classifying the hyperspectral image.
2. The hyperspectral classification method based on wavelet width learning system according to claim 1 is characterized in that the preprocessing comprises:
according to the strong inter-spectrum correlation of hyperspectral images and their integration of image and spectral information, wavelet denoising is performed on each hyperspectral image to be classified using the following soft-threshold formula:
Ŵ = sgn(W)(|W| - λ), |W| ≥ λ; Ŵ = 0, |W| < λ
where λ is the filter threshold, W represents the wavelet coefficient before processing, sgn(·) represents the sign function, and h represents the filter factor.
3. The hyperspectral classification method based on the wavelet width learning system according to claim 1, wherein the color features of the hyperspectral image comprise:
features representing the color mean:
μ_i = (1/N) Σ_{j=1}^{N} P_ji
features representing the color variance:
σ_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)²)^(1/2)
features representing the color asymmetry:
s_i = ((1/N) Σ_{j=1}^{N} (P_ji - μ_i)³)^(1/3)
where P_ji is the ith color component value of the jth pixel of the image, and N represents the number of pixels contained in the image.
4. A terminal device comprising a memory, a processor and a computer program stored in said memory and executable on said processor, characterized in that the steps of the method according to any of claims 1 to 3 are implemented when the computer program is executed by the processor.
5. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 3.
CN201911223284.XA 2019-12-03 2019-12-03 Hyperspectral classification method based on wavelet width learning system Pending CN111160392A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911223284.XA CN111160392A (en) 2019-12-03 2019-12-03 Hyperspectral classification method based on wavelet width learning system


Publications (1)

Publication Number Publication Date
CN111160392A true CN111160392A (en) 2020-05-15

Family

ID=70556438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911223284.XA Pending CN111160392A (en) 2019-12-03 2019-12-03 Hyperspectral classification method based on wavelet width learning system

Country Status (1)

Country Link
CN (1) CN111160392A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109615008A (en) * 2018-12-11 2019-04-12 华中师范大学 Hyperspectral image classification method and system based on stack width learning
CN110389663A (en) * 2019-06-24 2019-10-29 广东工业大学 A kind of sEMG gesture identification method based on small wave width learning system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JIATAI LIN et al.: "A wavelet broad learning adaptive filter for forecasting and cancelling the physiological tremor in teleoperation", Neurocomputing *
Li Mingxi et al.: "Comparative experimental study of denoising methods for near-infrared images", Infrared Technology *
Yang Renxin et al.: "Research on feature extraction and feature selection of hyperspectral images", Journal of Guangxi Teachers Education University (Natural Science Edition) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111814545A (en) * 2020-06-01 2020-10-23 北京简巨科技有限公司 Crop identification method and device, electronic equipment and storage medium
CN111982302A (en) * 2020-08-24 2020-11-24 广东工业大学 Temperature measurement method with noise filtering and environment temperature compensation
CN111982302B (en) * 2020-08-24 2023-12-29 广东工业大学 Temperature measurement method with noise filtering and environment temperature compensation
CN113159062A (en) * 2021-03-23 2021-07-23 中国科学院深圳先进技术研究院 Training of classification model, image classification method, electronic device and storage medium
CN113159062B (en) * 2021-03-23 2023-10-03 中国科学院深圳先进技术研究院 Classification model training and image classification method, electronic device and storage medium
CN113657479A (en) * 2021-08-12 2021-11-16 广东省人民医院 Novel multi-scale depth-width combined pathological picture classification method, system and medium
CN113657479B (en) * 2021-08-12 2022-12-06 广东省人民医院 Novel multi-scale depth-width combined pathological picture classification method, system and medium

Similar Documents

Publication Publication Date Title
Wang et al. Locality and structure regularized low rank representation for hyperspectral image classification
Ding et al. Semi-supervised locality preserving dense graph neural network with ARMA filters and context-aware learning for hyperspectral image classification
CN112836773B (en) Hyperspectral image classification method based on global attention residual error network
CN111160392A (en) Hyperspectral classification method based on wavelet width learning system
Lin et al. Hyperspectral image denoising via matrix factorization and deep prior regularization
CN109978041B (en) Hyperspectral image classification method based on alternative updating convolutional neural network
CN107590515B (en) Hyperspectral image classification method of self-encoder based on entropy rate superpixel segmentation
Chaugule et al. Evaluation of texture and shape features for classification of four paddy varieties
CN113673590B (en) Rain removing method, system and medium based on multi-scale hourglass dense connection network
CN110458192B (en) Hyperspectral remote sensing image classification method and system based on visual saliency
Gao et al. Densely connected multiscale attention network for hyperspectral image classification
He et al. Multi-spectral remote sensing land-cover classification based on deep learning methods
CN114937173A (en) Hyperspectral image rapid classification method based on dynamic graph convolution network
CN111639697A (en) Hyperspectral image classification method based on non-repeated sampling and prototype network
CN113421198B (en) Hyperspectral image denoising method based on subspace non-local low-rank tensor decomposition
Hussain et al. Image denoising to enhance character recognition using deep learning
Li et al. Ga-cnn: Convolutional neural network based on geometric algebra for hyperspectral image classification
Luo et al. Wavelet-based extended morphological profile and deep autoencoder for hyperspectral image classification
CN113052130A (en) Hyperspectral image classification method based on depth residual error network and edge protection filtering
CN109460788B (en) Hyperspectral image classification method based on low-rank-sparse information combination network
Kuril et al. Cloud classification for weather information by artificial neural network
CN113887656B (en) Hyperspectral image classification method combining deep learning and sparse representation
CN115375941A (en) Multi-feature fusion hyperspectral image classification method based on GAT and 3D-CNN
Lin et al. 2D/3D face recognition using neural networks based on hybrid taguchi-particle swarm optimization
Muthusamy et al. Deep belief network for solving the image quality assessment in full reference and no reference model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200515)