CN112949416B - Supervised hyperspectral multi-scale graph convolution classification method - Google Patents
Supervised hyperspectral multi-scale graph convolution classification method
- Publication number
- CN112949416B CN112949416B CN202110158621.2A CN202110158621A CN112949416B CN 112949416 B CN112949416 B CN 112949416B CN 202110158621 A CN202110158621 A CN 202110158621A CN 112949416 B CN112949416 B CN 112949416B
- Authority
- CN
- China
- Prior art keywords
- hyperspectral
- data
- scale
- graph
- supervised
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 53
- 239000011159 matrix material Substances 0.000 claims abstract description 26
- 230000003595 spectral effect Effects 0.000 claims abstract description 25
- 238000012549 training Methods 0.000 claims abstract description 15
- 238000013528 artificial neural network Methods 0.000 claims abstract description 10
- 230000008569 process Effects 0.000 claims description 11
- 230000009467 reduction Effects 0.000 claims description 6
- 238000005070 sampling Methods 0.000 claims description 6
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 claims description 6
- 230000004913 activation Effects 0.000 claims description 4
- 238000004364 calculation method Methods 0.000 claims description 4
- 238000005259 measurement Methods 0.000 claims description 4
- 238000012847 principal component analysis method Methods 0.000 claims description 2
- 238000001228 spectrum Methods 0.000 abstract description 11
- 230000010365 information processing Effects 0.000 abstract description 4
- 230000006870 function Effects 0.000 description 10
- 238000000547 structure data Methods 0.000 description 7
- 230000008901 benefit Effects 0.000 description 6
- 238000004422 calculation algorithm Methods 0.000 description 4
- 238000013527 convolutional neural network Methods 0.000 description 4
- 238000000605 extraction Methods 0.000 description 4
- 238000004458 analytical method Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 230000001788 irregular Effects 0.000 description 3
- 230000008447 perception Effects 0.000 description 3
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 2
- 238000000576 coating method Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000002708 enhancing effect Effects 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 238000000513 principal component analysis Methods 0.000 description 2
- 230000005855 radiation Effects 0.000 description 2
- 238000002310 reflectometry Methods 0.000 description 2
- 230000004075 alteration Effects 0.000 description 1
- 238000012512 characterization method Methods 0.000 description 1
- 238000013145 classification model Methods 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000011176 pooling Methods 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/194—Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Astronomy & Astrophysics (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention relates to the field of intelligent information processing of hyperspectral remote sensing, in particular to a supervised hyperspectral multi-scale graph convolution classification method, which adopts a multi-scale graph convolutional neural network to realize fine classification of earth surface cover in a hyperspectral image scene in a supervised mode. Aiming at the high-dimensional nonlinear structure of hyperspectral data, the method constructs global, local and spectral-index adjacency matrices and uses a small amount of known sample information to train a multi-scale graph convolutional neural network in a supervised mode, so that the network can effectively adapt to hyperspectral data for feature learning and label prediction, thereby enhancing the graph representation capability of the nonlinear features of hyperspectral data and improving the accuracy of earth surface cover classification and identification.
Description
Technical Field
The invention relates to the field of intelligent information processing of hyperspectral remote sensing, in particular to a supervised hyperspectral multi-scale graph convolution classification method, which fuses multi-modal feature data and trains a specially designed multi-scale graph convolutional neural network in a standard supervised learning mode to realize hyperspectral image classification.
Background
Hyperspectral data integrate image and spectrum in one cube and contain high-resolution spatial and spectral information together with rich electromagnetic-spectrum and radiation characteristics. They offer high spectral resolution, high sensitivity to ground-element observation and strong advantages for fine earth surface cover classification, which also brings many challenges to hyperspectral intelligent information processing and analysis. The acquisition and processing of hyperspectral images is a high-dimensional signal acquisition and representation process with strong correlation in both the spatial and spectral domains, so a graph feature learning method suited to nonlinear feature expression is needed to improve hyperspectral image classification performance. Convolutional neural networks have excellent feature abstraction capability and are widely used to realize fine classification of earth surface cover in hyperspectral image scenes, but they are mainly designed for the characterization and analysis of regular grid data: they cannot model the inherent topological relations among samples, nor can they describe object distributions and geometric characteristics, such as class boundaries, in a local area. The graph convolutional network classification method, as an extension of the deep convolutional network, generalizes the traditional convolutional neural network to efficiently process irregular structure data represented by a graph, in which the graph structure data contains the attributes of nodes and the topological relations between nodes and their neighbors; this usually involves constructing an adjacency matrix so that high-dimensional hyperspectral data are converted into relational graph-structure data. In particular, traditional graph convolution classification methods mostly take the whole hyperspectral image as the input graph and use only spectral features, which not only incurs a high computational cost but also ignores the local spatial structure information embedded in the hyperspectral data, and the graph features do not satisfy the semi-supervised or supervised learning assumptions about the samples during learning.
Disclosure of Invention
In view of this, the present invention provides a supervised hyperspectral multi-scale graph convolution classification method, which encodes the high-dimensional nonlinear hyperspectral data structure with graphs, migrates hyperspectral data from the regular domain to a low-dimensional irregular domain, and then processes the hyperspectral data effectively with globally and locally aware multi-scale graph convolution filtering; compared with the prior art, the method has lower computational cost and learning complexity.
In order to achieve the above object, the embodiments of the present invention adopt the following technical solutions.
The embodiment of the invention provides a supervised hyperspectral multi-scale graph convolution classification method, which comprises the following steps:
obtaining hyperspectral image data H ∈ R^(w×h×l), wherein R is the domain space, w is the image width, h is the image height, and l is the number of spectral channels;
performing unsupervised feature reduction and sampling on the hyperspectral image data H to obtain a multi-scale hyperspectral cube data set C, wherein K is the total number of samples;
deriving a multi-channel spectral index product set I from the hyperspectral image data H, wherein J is the number of spectral indexes adopted;
generating multi-modal multi-scale derived data {H, C, I} from the hyperspectral image data H, the hyperspectral cube data set C and the multi-channel spectral index product set I;
constructing global and local adjacency graph matrices {A_H, A_C, A_I} from the derived data {H, C, I} by unsupervised clustering based on a distance metric;
combining the derived data {H, C, I} and the adjacency graph matrices {A_H, A_C, A_I} to train the designed multi-scale graph neural network architecture in a supervised mode, obtaining the optimal model weights by minimizing the training loss;
obtaining the weight matrix W and the bias matrix b that minimize the loss function through supervised training of the GCN, predicting the probabilities p_t (t = 1, …, N) that a sample belongs to each of the N classes, and assigning the class corresponding to the maximum prediction probability max(p_t) to the unlabeled sample as its final classification label.
Further, the unsupervised feature reduction method is a principal component analysis method.
Further, the sampling is performed in tile sizes of 3, 5, 7, and 11 pixels.
Further, deriving the multi-channel spectral index product set I from the hyperspectral image data H specifically comprises: performing band algebraic calculation on the hyperspectral image data H according to the normalized difference vegetation index (NDVI), the normalized difference water index (NDWI) and the normalized difference built-up index (NDBI), and then stacking the results to obtain the multi-channel spectral index product set I.
Further, the distance metric refers to the Euclidean (L2-norm) distance d(x_Row, x_Col) = sqrt(Σ_{dim=1}^{Dim} (x_{Row,dim} − x_{Col,dim})²) adopted for the elements of the adjacency graph matrix, where Dim is the dimension of the feature vector x, Row denotes the row and Col denotes the column.
Further, the GCN adopts an exponential linear unit as the activation function and a sparse categorical cross-entropy loss function to process integer-coded (non-one-hot) labels.
The method provided by the embodiment of the invention uses multi-modal feature data to construct multiple multi-scale adjacency graph matrices, trains the multi-scale graph convolutional neural network in a standard supervised mode with a small amount of known sample information, learns the node attributes and node-neighbor topological relations contained in the complex graph structure data, and thus describes the object distribution and spatial texture details in a local area more effectively. Compared with traditional hyperspectral data classification methods based on deep convolutional networks, the method achieves better classification performance and higher-quality classification maps, enhancing the graph representation capability of nonlinear features and improving the accuracy of earth surface cover classification and identification. The method takes into account both the high-dimensional nonlinear structure of hyperspectral data and the excellent graph feature learning capability of graph neural networks; its supervised training based on local spectral filtering can effectively exploit spatial topological feature information, delineate class edges and spatial texture details, strengthen intra-class similarity analysis and provide good noise resistance. The spectral indexes adopted by the invention mainly comprise the commonly used normalized difference vegetation index, normalized difference water index and normalized difference built-up index; these indexes respectively enhance vegetation, water body and man-made features, suppress interference from other surface cover types and reinforce the corresponding ground object information, thereby achieving better classification and extraction effects.
In order to make the aforementioned and other objects, features and advantages of the invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without creative efforts. The above and other objects, features and advantages of the present invention will become more apparent from the accompanying drawings. Like reference numerals refer to like parts throughout the drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
Fig. 1 shows a flowchart of a supervised hyperspectral multi-scale graph convolution classification method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating a supervised hyperspectral multi-scale graph convolution classification method according to an embodiment of the present invention.
Fig. 3 is a general framework diagram illustrating a supervised hyperspectral multi-scale graph convolution classification method according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In view of the urgent need in the prior art for a method that can learn irregular graph structure data in a supervised manner while modeling multi-scale feature topological relations and describing class boundary information, the inventor conceived a supervised hyperspectral multi-scale graph convolution classification method. In this method, multiple multi-scale adjacency graph matrices are constructed from multi-modal reduced features, a multi-scale Graph Convolutional Neural Network (GCN) is trained in a supervised mode with a small amount of known sample information (30-60 samples are selected for each class), and the node attributes and node-neighbor topological relations contained in the complex graph structure data are learned, so that the object distribution and spatial texture details in a local area are described more effectively. Compared with traditional hyperspectral data classification methods based on deep convolutional networks, the disclosed method achieves better classification performance and higher-quality classification maps, enhancing the graph representation capability of nonlinear features and improving the accuracy of earth surface cover classification and identification.
Figs. 1-2 show a flowchart and a schematic diagram of the supervised hyperspectral multi-scale graph convolution classification method provided by an embodiment of the invention. Referring to Figs. 1-2, the method uses the hyperspectral image H ∈ R^(w×h×l), the multi-scale hyperspectral cubes C sampled with an equal number of pixels, and the set of spectral index products I derived from the hyperspectral image, wherein R is the domain space, w is the image width, h is the image height, and l is the number of spectral channels; K is the total number of samples, typically taken as the product of the image width and height; J is the number of spectral indexes adopted, here 3.
Aiming at the multi-modal feature data, a corresponding adjacency graph matrix is first constructed with the K-nearest-neighbor algorithm, yielding a whole-scene graph matrix, a multi-scale cube graph matrix and a stacked-index graph matrix derived from the original hyperspectral data; the graph matrices are then fed in parallel into the multi-scale graph convolutional neural network to extract joint spatial-spectral features, and the feature maps are superimposed at the end of deep feature extraction to obtain a feature extraction result fusing global and local graph node representations; finally, the optimal graph weight parameters are obtained by minimizing the classification loss on a conventional multi-layer perceptron network, realizing label assignment for the unlabeled samples and output of the classification map.
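To make the fusion step concrete, the following minimal sketch (an assumption-laden simplification, not the patented network) concatenates the node features produced by the three parallel graph branches and feeds them to a single fully connected softmax head; the feature matrices F_H, F_C, F_I and all dimensions are hypothetical names introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-branch node features: N graph nodes, d-dimensional each
N, d, n_classes = 1000, 32, 9
F_H = rng.standard_normal((N, d))   # whole-scene branch
F_C = rng.standard_normal((N, d))   # multi-scale cube branch
F_I = rng.standard_normal((N, d))   # spectral-index branch

# Feature-map superposition: fuse global and local node representations
F = np.concatenate([F_H, F_C, F_I], axis=1)          # shape (N, 3 * d)

# Simple fully connected softmax head standing in for the MLP classifier
W = rng.standard_normal((3 * d, n_classes)) * 0.01
b = np.zeros(n_classes)
logits = F @ W + b
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

labels = probs.argmax(axis=1)                        # per-node class assignment
```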
In this embodiment, the spectral indices used mainly include:
(1) Normalized Difference Vegetation Index (NDVI): NDVI = (NIR - R)/(NIR + R);
(2) Normalized Difference Water Index (NDWI): NDWI = (G - NIR)/(G + NIR);
(3) Normalized Difference Built-Up Index (NDBI): NDBI = (MIR - NIR)/(MIR + NIR);
wherein R is the reflectance of the red band, G the reflectance of the green band, NIR the reflectance of the near-infrared band, and MIR the reflectance of the mid-infrared band. These indexes respectively enhance vegetation, water body and built-up land features, suppress interference from other surface cover types and reinforce the information of the corresponding land cover classes, thereby achieving better ground object extraction effects.
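As a concrete illustration of the band algebra above, the sketch below computes NDVI, NDWI and NDBI from 2-D reflectance bands and stacks them into the multi-channel index product; the band variables and the small epsilon guard against division by zero are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def spectral_index_stack(red, green, nir, mir, eps=1e-8):
    """Compute NDVI, NDWI and NDBI from 2-D reflectance bands and stack them."""
    ndvi = (nir - red) / (nir + red + eps)      # vegetation
    ndwi = (green - nir) / (green + nir + eps)  # water bodies
    ndbi = (mir - nir) / (mir + nir + eps)      # built-up areas
    return np.stack([ndvi, ndwi, ndbi], axis=-1)  # shape (w, h, 3), i.e. J = 3 channels

# toy usage with random reflectance values
w, h = 64, 64
red, green, nir, mir = np.random.rand(4, w, h)
I = spectral_index_stack(red, green, nir, mir)
```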
The specific steps of this example are as follows.
S1, acquiring the hyperspectral raw data, performing data standardization, and writing a function to load the data into the algorithm, which facilitates the preprocessing and data enhancement of the hyperspectral data.
In this embodiment, a 3D hyperspectral raw image is first acquired by an imaging spectrometer. The hyperspectral data are then standardized: the image cube is stored in double-precision (Double) format, the labeled sample data are stored in 8-bit unsigned integer (Uint8) format, and both are saved in MAT format. A unified data loading function interface is then written (e.g., Define LoadDatasetFunction(String RawFileName) { return Double DNValues, Uint8 ActualLabels }); further, the image digital numbers (DN values) are converted into surface reflectance of ground object pixels through the MODTRAN (moderate resolution atmospheric radiative transfer) model and the 6S (Second Simulation of the Satellite Signal in the Solar Spectrum) model, and data enhancement is achieved by random rotation and random flipping, which facilitates the construction of a diversified sample data set in subsequent steps.
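The following sketch illustrates the kind of loading interface and data enhancement described above, assuming the cube and labels are stored in MAT files; the file keys ("DNValues", "ActualLabels") and the flip/rotate augmentation details are hypothetical assumptions, and the radiometric correction via MODTRAN/6S is not reproduced here.

```python
import numpy as np
from scipy.io import loadmat

def load_dataset(raw_file, label_file, cube_key="DNValues", label_key="ActualLabels"):
    """Load the hyperspectral cube (double precision) and labels (uint8) from MAT files."""
    cube = loadmat(raw_file)[cube_key].astype(np.float64)
    labels = loadmat(label_file)[label_key].astype(np.uint8)
    # simple per-channel standardization of the (w, h, l) cube
    cube = (cube - cube.mean(axis=(0, 1))) / (cube.std(axis=(0, 1)) + 1e-8)
    return cube, labels

def augment(patch, rng):
    """Data enhancement: random 90-degree rotation plus random vertical/horizontal flips."""
    patch = np.rot90(patch, k=rng.integers(4), axes=(0, 1))
    if rng.random() < 0.5:
        patch = patch[::-1]          # vertical flip
    if rng.random() < 0.5:
        patch = patch[:, ::-1]       # horizontal flip
    return patch
```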
S2, using the hyperspectral raw image data, computing the index maps corresponding to the different spectral indexes by band algebra and band stacking; applying unsupervised feature reduction to obtain the dimension-reduced data, then sampling it to obtain the multi-scale feature cubes C, and finally generating the multi-modal multi-scale derived data {H, C, I}.
In this embodiment, the whole-scene hyperspectral image data are first loaded and unsupervised feature reduction is performed by principal component analysis (PCA); a multi-scale hyperspectral cube data set C is then built with image patches of different pixel sizes (3, 5, 7 and 11); finally, band algebraic calculation is performed on the hyperspectral image according to the definitions of the different spectral indexes (NDVI, NDWI and NDBI), and the results are stacked to obtain the multi-channel spectral index data product set I of the hyperspectral data.
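A minimal sketch of this feature reduction and multi-scale patch sampling follows, assuming scikit-learn PCA, 30 retained components and zero-padding at image borders; all of these choices are illustrative assumptions rather than values fixed by the patent.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_reduce(cube, n_components=30):
    """Unsupervised feature reduction of a (w, h, l) cube to (w, h, n_components)."""
    w, h, l = cube.shape                      # assumes l >= n_components
    flat = cube.reshape(-1, l)
    reduced = PCA(n_components=n_components).fit_transform(flat)
    return reduced.reshape(w, h, n_components)

def sample_cubes(reduced, sizes=(3, 5, 7, 11)):
    """Extract a patch of every scale around each pixel (zero-padded at borders)."""
    w, h, d = reduced.shape
    cubes = {}
    for s in sizes:
        r = s // 2
        padded = np.pad(reduced, ((r, r), (r, r), (0, 0)))
        # memory-heavy but simple: one patch per pixel, K = w * h samples per scale
        cubes[s] = np.stack([padded[i:i + s, j:j + s]
                             for i in range(w) for j in range(h)])
    return cubes  # cubes[s] has shape (w * h, s, s, d)
```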
S3, for the multi-scale multi-modal hyperspectral derived data {H, C, I}, constructing the global and local multi-scale adjacency graph matrices {A_H, A_C, A_I} by unsupervised clustering based on a distance metric, so that the multi-scale GCN can conveniently be trained in parallel in a supervised mode in combination with the multi-modal raw data.
In this embodiment, the K-nearest-neighbor algorithm is used for unsupervised clustering, so that the multi-scale multi-modal feature data are built into adjacency graph matrices A at multiple scales, which represent the edge topological relations among nodes in the complex graph structure data. Here the adjacency graph matrix elements adopt the Euclidean (L2-norm) distance d(x_Row, x_Col) = sqrt(Σ_{dim=1}^{Dim} (x_{Row,dim} − x_{Col,dim})²) as the distance metric, where Dim is the dimension of the feature vector x, Row denotes the row and Col denotes the column. Feature data of different modalities correspond to different adjacency graph matrices, namely the whole-scene adjacency graph matrix A_H, the multi-scale cube adjacency matrix A_C and the spectral index adjacency graph matrix A_I.
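A minimal sketch of this adjacency construction with scikit-learn's k-nearest-neighbour graph under the Euclidean (L2) metric is given below; the neighbourhood size k = 10 and the symmetrization step are assumptions introduced here, not values fixed by the patent.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def build_adjacency(features, k=10):
    """K-nearest-neighbour adjacency graph over row-wise feature vectors (Euclidean metric)."""
    # features: (num_nodes, Dim) matrix, one row per graph node
    A = kneighbors_graph(features, n_neighbors=k, metric="euclidean",
                         mode="connectivity", include_self=False)
    A = A.maximum(A.T)   # symmetrize so the edge topology is undirected
    return A             # scipy sparse matrix, usable as A_H, A_C or A_I

# toy usage: 500 nodes with 30-dimensional features
A = build_adjacency(np.random.rand(500, 30), k=10)
```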
Step S4, combining the multi-modal raw feature data {H, C, I} with the derived adjacency graph matrices {A_H, A_C, A_I} to train the designed multi-scale graph neural network architecture in a supervised mode, and obtaining the optimal model weights by minimizing the training loss.
In this embodiment, the optimal weights are obtained by training multi-scale GCNs in parallel through forward propagation rules.
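For reference, the sketch below gives one common form of the graph-convolution forward propagation rule (renormalized adjacency with self-loops, ELU activation) in plain numpy; it is offered as an assumption about the kind of rule meant here, not as the exact multi-scale architecture of the patent.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Exponential linear unit, evaluated element-wise."""
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))

def gcn_layer(A, X, W):
    """One graph-convolution layer: H' = ELU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # symmetric degree normalisation
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return elu(A_norm @ X @ W)

# toy forward pass: 6 nodes, 4 input features, 3 output features
rng = np.random.default_rng(1)
A = (rng.random((6, 6)) > 0.6).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)
X = rng.standard_normal((6, 4))
W = rng.standard_normal((4, 3)) * 0.1
H1 = gcn_layer(A, X, W)        # shape (6, 3)
```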
Step S5, with a small number of training samples (30-60 samples selected for each class), the optimal model parameters W and b are obtained by supervised training of the GCN, the probabilities p_t (t = 1, …, N) that a sample belongs to each of the N classes are predicted, and the class corresponding to the maximum prediction probability max(p_t) is assigned to the unlabeled sample as its final classification label.
In this embodiment, a GCN architecture comprising graph convolution, activation function, graph pooling layer and fully connected layer is designed. The graph convolution uses multi-scale kernel sizes (3, 5, 7 and 11); an Exponential Linear Unit (ELU) is used as the activation function to accelerate the training process and improve classification accuracy; a sparse categorical cross-entropy loss function is then defined to process the integer-coded (non-one-hot) labels and measure the difference between the predicted and actual labels, yielding the weight matrix W and the bias matrix b that minimize the loss; finally, a digital label map is output and further converted into classification maps in different color spaces.
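For illustration, a minimal numpy sketch of a sparse (integer-label) cross-entropy loss and the final maximum-probability label assignment is given below; it is a generic stand-in under stated assumptions, not the exact loss or prediction code of the patented implementation.

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def sparse_categorical_cross_entropy(logits, labels):
    """Mean negative log-likelihood for integer (non-one-hot) class labels."""
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

# toy example: predictions for 5 labelled training samples and N = 4 classes
rng = np.random.default_rng(2)
logits = rng.standard_normal((5, 4))
labels = np.array([0, 2, 1, 3, 2])
loss = sparse_categorical_cross_entropy(logits, labels)   # minimized over W and b

# label assignment for unlabeled samples: class of the maximum predicted probability
pred = softmax(logits).argmax(axis=1)
```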
On the basis of the traditional convolutional neural network classification model for regular-structure images, the invention provides a supervised hyperspectral multi-scale graph convolution classification method by constructing multi-modal multi-scale derived feature data and building multiple multi-scale graph weight matrices with the K-nearest-neighbor algorithm. It should be noted that the hyperspectral multi-scale graph convolutional network classification method adopted here is based on multi-modal derived feature data and multiple multi-scale adjacency graph matrices, uses standard supervised learning, and considers both global and local perception, but the invention is not limited to this method.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (6)
1. A supervised hyperspectral multi-scale graph convolution classification method, comprising the following steps:
obtaining hyperspectral image data H ∈ R^(w×h×l), wherein R is the domain space, w is the image width, h is the image height, and l is the number of spectral channels;
performing unsupervised feature reduction and sampling on the hyperspectral image data H to obtain a multi-scale hyperspectral cube data set C, wherein K is the total number of samples;
deriving a multi-channel spectral index product set I from the hyperspectral image data H, wherein J is the number of spectral indexes adopted;
generating multi-modal multi-scale derived data {H, C, I} from the hyperspectral image data H, the hyperspectral cube data set C and the multi-channel spectral index product set I;
constructing global and local adjacency graph matrices {A_H, A_C, A_I} from the derived data {H, C, I} by unsupervised clustering based on a distance metric;
combining the derived data {H, C, I} and the adjacency graph matrices {A_H, A_C, A_I} to train the designed multi-scale graph neural network architecture in a supervised mode, obtaining the optimal model weights by minimizing the training loss;
obtaining the weight matrix W and the bias matrix b that minimize the loss function through supervised training of the GCN, predicting the probabilities p_t (t = 1, …, N) that a sample belongs to each of the N classes, and assigning the class corresponding to the maximum prediction probability max(p_t) to the unlabeled sample as its final classification label.
2. The method of claim 1, wherein: the unsupervised feature reduction method is a principal component analysis method.
3. The method of claim 1, wherein: the sampling is done in tile sizes of 3, 5, 7 and 11 pixels.
4. The method of claim 1, wherein: deriving the multi-channel spectral index product set I from the hyperspectral image data H specifically comprises: performing band algebraic calculation on the hyperspectral image data H according to the normalized difference vegetation index, the normalized difference water index and the normalized difference built-up index, and then stacking the results to obtain the multi-channel spectral index product set I.
5. The method of claim 1, wherein: the distance metric refers to the Euclidean (L2-norm) distance d(x_Row, x_Col) = sqrt(Σ_{dim=1}^{Dim} (x_{Row,dim} − x_{Col,dim})²) adopted for the elements of the adjacency graph matrix, wherein Dim is the dimension of the feature vector x, Row denotes the row and Col denotes the column.
6. The method of claim 1, wherein: the GCN adopts an exponential linear unit as the activation function and a sparse categorical cross-entropy loss function to process the integer-coded (non-one-hot) labels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110158621.2A CN112949416B (en) | 2021-02-04 | 2021-02-04 | Supervised hyperspectral multi-scale graph convolution classification method
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110158621.2A CN112949416B (en) | 2021-02-04 | 2021-02-04 | Supervised hyperspectral multi-scale graph convolution classification method
Publications (2)
Publication Number | Publication Date |
---|---|
CN112949416A CN112949416A (en) | 2021-06-11 |
CN112949416B true CN112949416B (en) | 2022-10-04 |
Family
ID=76244005
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110158621.2A Active CN112949416B (en) | 2021-02-04 | 2021-02-04 | Supervised hyperspectral multiscale graph volume integral classification method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112949416B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113673300B (en) * | 2021-06-24 | 2024-07-19 | 核工业北京地质研究院 | Hyperspectral image intelligent unmixing method based on non-supervision training |
CN113762128A (en) * | 2021-08-31 | 2021-12-07 | 中国人民解放军战略支援部队信息工程大学 | Hyperspectral image classification method based on unsupervised learning |
CN113780146B (en) * | 2021-09-06 | 2024-05-10 | 西安电子科技大学 | Hyperspectral image classification method and system based on lightweight neural architecture search |
CN113920442B (en) * | 2021-09-29 | 2024-06-18 | 中国人民解放军火箭军工程大学 | Hyperspectral classification method combining graph structure and convolutional neural network |
CN114743037B (en) * | 2022-04-06 | 2024-08-27 | 华南农业大学 | Deep medical image clustering method based on multi-scale structure learning |
CN116337240B (en) * | 2023-03-29 | 2024-05-17 | 大连海事大学 | Thermal infrared hyperspectral data band selection method based on graph neural network |
CN116030355B (en) * | 2023-03-30 | 2023-08-11 | 武汉城市职业学院 | Ground object classification method and system |
CN118015356B (en) * | 2024-02-04 | 2024-10-11 | 中国矿业大学 | Hyperspectral classification method of variation dynamic fusion network based on Gaussian mixture |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109461207A (en) * | 2018-11-05 | 2019-03-12 | 胡翰 | A kind of point cloud data building singulation method and device |
CN110781775A (en) * | 2019-10-10 | 2020-02-11 | 武汉大学 | Remote sensing image water body information accurate segmentation method supported by multi-scale features |
CN110991483A (en) * | 2019-11-01 | 2020-04-10 | 北京邮电大学 | High-order neighborhood mixed network representation learning method and device |
CN111476287A (en) * | 2020-04-02 | 2020-07-31 | 中国人民解放军战略支援部队信息工程大学 | Hyperspectral image small sample classification method and device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7450761B2 (en) * | 2004-11-02 | 2008-11-11 | The Boeing Company | Spectral geographic information system |
US20140204092A1 (en) * | 2012-04-09 | 2014-07-24 | The Regents Of The University Of California | Classification of high dimensional data |
CN108564588B (en) * | 2018-03-21 | 2020-07-10 | 华中科技大学 | Built-up area automatic extraction method based on depth features and graph segmentation method |
CN109509192B (en) * | 2018-10-18 | 2023-05-30 | 天津大学 | Semantic segmentation network integrating multi-scale feature space and semantic space |
CN110163472B (en) * | 2019-04-11 | 2021-03-26 | 中国水利水电科学研究院 | Large-range extreme drought emergency monitoring and influence evaluation method and system |
CN110321963B (en) * | 2019-07-09 | 2022-03-04 | 西安电子科技大学 | Hyperspectral image classification method based on fusion of multi-scale and multi-dimensional space spectrum features |
CN111144496B (en) * | 2019-12-27 | 2022-11-18 | 齐齐哈尔大学 | Garbage classification method based on hybrid convolutional neural network |
CN111191736B (en) * | 2020-01-05 | 2022-03-04 | 西安电子科技大学 | Hyperspectral image classification method based on depth feature cross fusion |
CN111695636B (en) * | 2020-06-15 | 2023-07-14 | 北京师范大学 | Hyperspectral image classification method based on graph neural network |
- 2021-02-04 CN CN202110158621.2A patent/CN112949416B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109461207A (en) * | 2018-11-05 | 2019-03-12 | 胡翰 | A kind of point cloud data building singulation method and device |
CN110781775A (en) * | 2019-10-10 | 2020-02-11 | 武汉大学 | Remote sensing image water body information accurate segmentation method supported by multi-scale features |
CN110991483A (en) * | 2019-11-01 | 2020-04-10 | 北京邮电大学 | High-order neighborhood mixed network representation learning method and device |
CN111476287A (en) * | 2020-04-02 | 2020-07-31 | 中国人民解放军战略支援部队信息工程大学 | Hyperspectral image small sample classification method and device |
Also Published As
Publication number | Publication date |
---|---|
CN112949416A (en) | 2021-06-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112949416B (en) | Supervised hyperspectral multi-scale graph convolution classification method | |
CN108985238B (en) | Impervious surface extraction method and system combining deep learning and semantic probability | |
CN111914907B (en) | Hyperspectral image classification method based on deep learning space-spectrum combined network | |
CN111368896B (en) | Hyperspectral remote sensing image classification method based on dense residual three-dimensional convolutional neural network | |
CN111259828B (en) | High-resolution remote sensing image multi-feature-based identification method | |
CN107392925B (en) | Remote sensing image ground object classification method based on super-pixel coding and convolutional neural network | |
CN102013017B (en) | Method for roughly sorting high-resolution remote sensing image scene | |
Rouhani et al. | Semantic segmentation of 3D textured meshes for urban scene analysis | |
CN107358260B (en) | Multispectral image classification method based on surface wave CNN | |
CN107451616A (en) | Multi-spectral remote sensing image terrain classification method based on the semi-supervised transfer learning of depth | |
CN107590515B (en) | Hyperspectral image classification method of self-encoder based on entropy rate superpixel segmentation | |
CN111860351B (en) | Remote sensing image fishpond extraction method based on line-row self-attention full convolution neural network | |
CN111639587B (en) | Hyperspectral image classification method based on multi-scale spectrum space convolution neural network | |
CN112950780B (en) | Intelligent network map generation method and system based on remote sensing image | |
CN110309780A (en) | High resolution image houseclearing based on BFD-IGA-SVM model quickly supervises identification | |
CN111783884B (en) | Unsupervised hyperspectral image classification method based on deep learning | |
CN112818920B (en) | Double-temporal hyperspectral image space spectrum joint change detection method | |
CN109145832A (en) | Polarimetric SAR image semisupervised classification method based on DSFNN Yu non local decision | |
CN112329818B (en) | Hyperspectral image non-supervision classification method based on graph convolution network embedded characterization | |
CN115661652A (en) | Object-oriented graph neural network unsupervised remote sensing image change detection method | |
CN114120036A (en) | Lightweight remote sensing image cloud detection method | |
CN115661677A (en) | Light-weight satellite image cloud detection method based on dark channel feature guidance | |
CN116912550A (en) | Land utilization parallel classification method for heterogeneous convolution network remote sensing images based on ground object dependency relationship | |
CN114972885A (en) | Multi-modal remote sensing image classification method based on model compression | |
CN113449603B (en) | High-resolution remote sensing image earth surface element identification method and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20231102 Address after: Room 709, 7th floor, No. 892 Fenglin Avenue (Aerospace Building), Nanchang Economic and Technological Development Zone, Nanchang City, Jiangxi Province, 330052 (1st to 9th floors) Patentee after: Nanchang Qingtuo Intelligent Technology Co.,Ltd. Address before: 344000 No. 56, Xuefu Road, Fuzhou, Jiangxi Patentee before: EAST CHINA INSTITUTE OF TECHNOLOGY |
TR01 | Transfer of patent right |