CN114494762A - Hyperspectral image classification method based on deep migration network - Google Patents
Hyperspectral image classification method based on deep migration network
- Publication number
- CN114494762A (application number CN202111503748.XA)
- Authority
- CN
- China
- Prior art keywords
- domain
- data
- layer
- source domain
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000013508 migration Methods 0.000 title claims abstract description 33
- 230000005012 migration Effects 0.000 title claims abstract description 33
- 238000000034 method Methods 0.000 title claims abstract description 29
- 238000009826 distribution Methods 0.000 claims abstract description 8
- 230000006978 adaptation Effects 0.000 claims description 25
- 230000009467 reduction Effects 0.000 claims description 10
- 238000013528 artificial neural network Methods 0.000 claims description 8
- 238000012549 training Methods 0.000 claims description 8
- 239000011159 matrix material Substances 0.000 claims description 7
- 230000003044 adaptive effect Effects 0.000 claims description 6
- 230000003595 spectral effect Effects 0.000 claims description 4
- 238000013135 deep learning Methods 0.000 description 7
- 238000001514 detection method Methods 0.000 description 3
- 238000012360 testing method Methods 0.000 description 3
- 238000005034 decoration Methods 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000002372 labelling Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000004422 calculation algorithm Methods 0.000 description 1
- 238000003066 decision tree Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 230000007123 defense Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 230000001617 migratory effect Effects 0.000 description 1
- 238000003058 natural language processing Methods 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 238000007637 random forest analysis Methods 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 238000012706 support-vector machine Methods 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 238000013526 transfer learning Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Probability & Statistics with Applications (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a hyperspectral image classification method based on a deep migration network. The second-order statistics of the source and target domains are aligned globally based on CORAL, adapting the global distribution of the two domains, and the first-order statistics of the related subdomains are aligned based on the local maximum mean discrepancy, adapting the local distributions of the two domains. The method can extract deep, discriminative features of the target domain and completes the classification of unlabeled target-domain samples using only labeled source-domain samples.
Description
Technical Field
The invention relates to a method for hyperspectral image classification using a deep migration network, and belongs to the field of pattern recognition.
Background
Hyperspectral images (HSIs) contain rich spectral and spatial information and have broad application prospects in fields such as agriculture, climate detection, and national defense and security. HSI classification is a common task in these applications; it aims to assign each pixel of an image to a category based on the spectral and spatial information obtained from observing the ground objects. Researchers have proposed many methods to improve HSI classification accuracy, including random forests, support vector machines, and decision trees. Although these traditional hyperspectral image classification methods use simple models, most of them cannot guarantee classification accuracy.
In recent years, deep learning has achieved excellent performance on many tasks such as image processing, object detection, and natural language processing. Deep learning can learn features automatically, which allows it to adapt to a variety of task scenarios, and its strong nonlinear representation capability allows it to extract deeper, discriminative features from data. These advantages have enabled deep learning to be applied successfully to hyperspectral image classification. However, the strong feature expression capability of deep learning usually requires a large number of labeled training samples. With the development of new-generation satellite hyperspectral sensors, large amounts of unlabeled HSIs can be acquired rapidly, yet labeling these images takes domain experts a great deal of time. The shortage of labeled samples therefore severely limits the application of deep learning methods to hyperspectral classification. To address this problem, many researchers have combined active learning, data augmentation, and other techniques with deep learning to complete hyperspectral image classification with a small number of labeled samples. Although such methods can partly alleviate the shortage of training samples caused by the difficulty of labeling HSIs, they struggle to achieve satisfactory results when the training data and test data come from similar but different domains, i.e., when their data distributions are similar but not identical. Transfer learning solves this problem well: by exploring domain-invariant structures, it transfers knowledge from a labeled domain (the source domain) to a similar but differently distributed domain (the target domain), thereby completing cross-domain classification. A deep migration learning model obtained by combining a deep network with transfer learning can learn deeper and more transferable features, and has achieved breakthrough results on hyperspectral classification. The reason is that the deep migration learning network can extract discriminative features with invariant factors from the data and group the HSI features effectively according to the correlation of these invariant factors.
Disclosure of Invention
The purpose of the invention is as follows: to overcome the defects of the prior art, the invention provides a hyperspectral image classification method based on a deep migration network, which completes the classification of unlabeled target-domain samples using only labeled source-domain samples.
The technical scheme is as follows: a hyperspectral image classification method based on a deep migration network comprises the following steps:
Step 1, performing dimensionality reduction on an original hyperspectral image by band selection: removing band redundancy to obtain reduced-dimension hyperspectral data X0, the hyperspectral data X0 comprising the reduced-dimension source-domain data and target-domain data;
Step 2, training an auxiliary classifier with the labeled source-domain samples, and obtaining target-domain pseudo labels with the auxiliary classifier;
step 3, constructing a deep migration neural network, adapting a source domain and a target domain by using a domain adaptation layer based on CORAL loss to reduce the second-order statistic difference, and reducing the first-order statistic difference of the source domain and the target related subspace based on LMMD;
and step 4, classifying the target-domain data with the trained auxiliary classifier.
Further, in step 1, the original hyperspectral image is reduced in dimension by band selection, removing band redundancy to obtain the reduced-dimension hyperspectral data X0; the method specifically comprises the following steps:
defining the original HSI wave band number as NbIn the number of intervals ofAndrespectively select a andb wave bands, the number of the wave bands after dimensionality reduction is d, whereinRepresenting a rounding down operation, we can obtain:
defining the hyperspectral data X after dimensionality reduction0∈Rn×dAs input to the model, n represents the number of samples, spectral data X0Including reduced-dimension source domain dataAnd target domain dataWherein,andwherein n issRepresenting the number of source domain samples, ntRepresenting the number of samples in the target domain, YsIs a source domain label.
Further, in step 3, a deep migration neural network is constructed, a domain adaptation layer is used to adapt the source domain and the target domain based on the CORAL method to reduce the second-order statistic difference, and the first-order statistic difference of the related subspaces of the source and target domains is reduced based on LMMD; the method specifically comprises the following steps:
the deep migration network DTN comprises a fully connected layer, a nonlinear layer, a domain adaptation layer and a Softmax layer; the reduced-dimension source-domain data and target-domain data are input into the deep migration network DTN, and features are extracted by a feature extractor consisting of the fully connected layer and the nonlinear layer; the output of the fully connected layer is:
F1 = I × W1 + b1
where I is the fully connected layer input, W1 is the weight matrix, and b1 is the bias; the fully connected layer output is fed into the nonlinear layer, whose output is a nonlinear activation of the nonlinear-layer input IN;
the domain adaptation layer adapts the distribution difference between the two domains, and its output is connected to the Softmax layer; the loss function of the deep migration network DTN is defined as:
L = Lcls + α1·LCORAL + α2·LLMMD
where LCORAL is the covariance domain adaptation term, LLMMD is the subspace adaptation term, Lcls is the source-domain data classification loss, and α1 and α2 are the covariance-adaptation and subspace-adaptation parameters, respectively; the covariance domain adaptation term can be expressed as:
LCORAL = (1/(4·d1²))·||Cs − Ct||_F²
where d1 is the dimension of the domain adaptation layer input, and Cs and Ct denote the source-domain and target-domain data covariance matrices, respectively;
the subspace adaptation term LLMMD is expressed as a weighted discrepancy between the first-order statistics of the class-related source-domain and target-domain subdomains, where c ∈ {1, 2, ..., C} is the class index, the target-domain pseudo labels are obtained by the auxiliary classifier, and the weights indicate the degree to which each source-domain sample and each target-domain sample belongs to class c; the term is a weighted sum over class c, with the weights calculated from the source-domain labels and the target-domain pseudo labels;
the source-domain data classification loss Lcls is computed from the category matrix Y and the prediction result S of the deep migration network DTN model, where C is the number of categories.
Advantageous effects: in the cross-domain hyperspectral image classification method based on the deep migration network, the second-order statistics of the two domains are aligned globally based on CORAL, adapting their global distribution, and the first-order statistics of the related subdomains are aligned based on the local maximum mean discrepancy, adapting their local distributions. The method can extract deep, discriminative features of the target domain and completes the classification of unlabeled target-domain samples using only labeled source-domain samples.
Drawings
FIG. 1 is a schematic diagram of the process of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings.
A hyperspectral image classification method based on a deep migration network completes the classification of unlabeled target-domain samples using only labeled source-domain samples.
The technical scheme is as follows: a hyperspectral image classification method based on a deep migration network comprises the following steps:
Step 1, performing dimensionality reduction on an original hyperspectral image by band selection: removing band redundancy to obtain reduced-dimension hyperspectral data X0, the hyperspectral data X0 comprising the reduced-dimension source-domain data and target-domain data;
The original HSI has many bands and strong inter-band correlation, so a large amount of redundant information exists among the bands. Feeding the original HSI directly into the DTN would increase the number of network parameters and degrade model performance, so band selection is applied to reduce the dimensionality of the original HSI data. Let the number of original HSI bands be Nb; bands are selected at two sampling intervals, yielding a bands and b bands respectively, and the number of bands after dimensionality reduction is d, where ⌊·⌋ denotes the rounding-down operation.
defining the hyperspectral data X after dimensionality reduction0∈Rn×dAs input to the model, n represents the number of samples, spectral data X0Including reduced-dimension source domain dataAnd target domain dataWherein,andwherein n issRepresenting the number of source domain samples, ntRepresenting the number of samples in the target domain, YsIs a source domain label.
Step 2, training an auxiliary classifier with the labeled source-domain samples, and obtaining target-domain pseudo labels with the auxiliary classifier;
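A minimal sketch of step 2 follows, reusing the Xs0 and Xt0 arrays from the previous sketch; the patent does not name the auxiliary classifier, so a support vector machine (one of the classifiers mentioned in the background) and random stand-in labels are used here purely as placeholders.

```python
import numpy as np
from sklearn.svm import SVC

def pseudo_label_target(Xs0, Ys, Xt0):
    """Train an auxiliary classifier on the labeled source-domain samples and
    use it to produce pseudo labels for the unlabeled target-domain samples."""
    clf = SVC(kernel="rbf")
    clf.fit(Xs0, Ys)
    return clf, clf.predict(Xt0)

Ys = np.random.randint(0, 9, size=Xs0.shape[0])        # stand-in source labels
aux_clf, Yt_pseudo = pseudo_label_target(Xs0, Ys, Xt0)  # target-domain pseudo labels
```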
and 3, constructing a deep migration neural network, reducing the second-order statistic difference of the two domains by using a domain adaptation layer based on a CORAL (CORrelation Alignment) algorithm, and reducing the first-order statistic difference of the source domain and the target related subspace based on an LMMD (Local Maximum Mean value metric).
Deep neural networks are widely used for HSI classification because of their powerful deep feature extraction capability. However, when the training set and the test set follow different data distributions, a deep neural network has difficulty learning transferable knowledge, which leaves the model with insufficient classification ability. To solve this problem, a deep migration network DTN is proposed: a domain adaptation layer is added to the DNN, and the global second-order statistics and the first-order statistics of each class-related subspace are aligned simultaneously on the deep, discriminative source-domain and target-domain features extracted by the DNN.
The deep migration network DTN is a feed-forward neural network comprising a fully-connected layer, a nonlinear layer, a domain adaptation layer and a Softmax layer. FC1 in fig. 1 represents a domain adaptation layer, and FC2 represents a Softmax layer.
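The layer stack described above can be sketched as follows; the hidden widths, the choice of ReLU as the nonlinear layer, and the number of classes are assumptions not fixed by the text, and the FC1/FC2 names follow FIG. 1.

```python
import torch
import torch.nn as nn

class DTN(nn.Module):
    """Feed-forward deep migration network: fully connected layer, nonlinear
    layer, domain adaptation layer (FC1) and Softmax layer (FC2)."""
    def __init__(self, d, hidden=128, adapt_dim=64, n_classes=9):
        super().__init__()
        self.fc = nn.Linear(d, hidden)               # F1 = I x W1 + b1
        self.act = nn.ReLU()                         # nonlinear layer (activation assumed)
        self.fc1 = nn.Linear(hidden, adapt_dim)      # domain adaptation layer (FC1)
        self.fc2 = nn.Linear(adapt_dim, n_classes)   # Softmax layer (FC2)

    def forward(self, x):
        a = self.fc1(self.act(self.fc(x)))           # adaptation-layer activations
        return a, torch.softmax(self.fc2(a), dim=1)  # features and class probabilities
```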
The reduced-dimension source-domain data and target-domain data are input into the deep migration network DTN, and features are extracted by a feature extractor composed of the fully connected layer and the nonlinear layer. The output of the fully connected layer is:
F1 = I × W1 + b1
where I is the fully connected layer input, W1 is the weight matrix, and b1 is the bias. The fully connected layer output is fed into the nonlinear layer, whose output is a nonlinear activation of the nonlinear-layer input IN. A domain adaptation layer is added to adapt the distribution difference between the two domains, and its output is connected to the Softmax layer. The loss function of the deep migration network DTN is defined as:
L = Lcls + α1·LCORAL + α2·LLMMD
where LCORAL is the covariance domain adaptation term, LLMMD is the subspace adaptation term, Lcls is the source-domain data classification loss, and α1 and α2 are the covariance-adaptation and subspace-adaptation parameters, respectively.
The covariance domain adaptation term can be expressed as:
LCORAL = (1/(4·d1²))·||Cs − Ct||_F²
where d1 is the dimension of the domain adaptation layer input, and Cs and Ct denote the source-domain and target-domain data covariance matrices, respectively.
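A sketch of this term, computed on the domain-adaptation-layer activations fs and ft, is given below; it follows the standard CORAL definition matching the quantities named above, so the exact normalization is an assumption.

```python
import torch

def coral_loss(fs, ft):
    """CORAL term: ||Cs - Ct||_F^2 / (4 * d1^2), with Cs and Ct the covariance
    matrices of the source and target features of shape (n, d1)."""
    d1 = fs.size(1)
    def cov(f):
        f = f - f.mean(dim=0, keepdim=True)
        return f.t() @ f / (f.size(0) - 1)
    return ((cov(fs) - cov(ft)) ** 2).sum() / (4.0 * d1 * d1)
```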
The subspace adaptation term LLMMD is expressed as a weighted discrepancy between the first-order statistics of the class-related source-domain and target-domain subdomains, where c ∈ {1, 2, ..., C} is the class index, the target-domain pseudo labels are obtained by the auxiliary classifier, and the weights indicate the degree to which each source-domain sample and each target-domain sample belongs to class c; the term is a weighted sum over class c, with the weights calculated from the source-domain labels and the target-domain pseudo labels, as sketched below.
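In the sketch, the weighting wi^c = yic / Σj yjc and the linear, mean-based (first-order) form are assumptions borrowed from the standard local maximum mean discrepancy; the text names the quantities involved but not the exact kernel or weighting.

```python
import torch
import torch.nn.functional as F

def lmmd_loss(fs, ft, ys, yt_pseudo, n_classes):
    """Align, for every class c, the weighted mean of the source features
    (weighted by the true labels) with the weighted mean of the target
    features (weighted by the auxiliary-classifier pseudo labels)."""
    ws = F.one_hot(ys, n_classes).float()
    wt = F.one_hot(yt_pseudo, n_classes).float()
    ws = ws / ws.sum(dim=0, keepdim=True).clamp(min=1e-8)  # w_i^c = y_ic / sum_j y_jc
    wt = wt / wt.sum(dim=0, keepdim=True).clamp(min=1e-8)
    mean_s = ws.t() @ fs                               # (C, d1) per-class source means
    mean_t = wt.t() @ ft                               # (C, d1) per-class target means
    return ((mean_s - mean_t) ** 2).sum(dim=1).mean()  # average over the C classes
```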
the source domain data classification penalty can be expressed as:
wherein C is the number of categories, Y represents a category matrix, and S is a prediction result of a DTN model of the deep migration network.
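A cross-entropy form is sketched below as one plausible reading of this term; the cross-entropy choice and the averaging over the source samples are assumptions, since the text only names Y, S and C.

```python
import torch

def cls_loss(S, Y):
    """Source-domain classification loss: -(1/ns) * sum_i sum_c Y_ic * log(S_ic),
    with S the DTN prediction of shape (ns, C) and Y the one-hot category matrix."""
    return -(Y * torch.log(S.clamp(min=1e-12))).sum(dim=1).mean()
```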
And step 4: classifying the target-domain data with the trained auxiliary classifier.
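Putting the sketches together, a possible training and inference loop is shown below, reusing the names defined in the earlier blocks; the optimizer, learning rate, epoch count, the values of α1 and α2, and reading the final target prediction from the DTN's Softmax output are all illustrative choices rather than settings given in the text.

```python
import torch
import torch.nn.functional as F

n_classes = 9                                    # matches the stand-in labels above
Xs_t = torch.tensor(Xs0, dtype=torch.float32)    # source-domain features
Xt_t = torch.tensor(Xt0, dtype=torch.float32)    # target-domain features
Ys_t = torch.tensor(Ys, dtype=torch.long)        # source labels
Yt_p = torch.tensor(Yt_pseudo, dtype=torch.long) # target pseudo labels
Y_onehot = F.one_hot(Ys_t, n_classes).float()    # category matrix Y

model = DTN(d=Xs_t.size(1), n_classes=n_classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
alpha1, alpha2 = 1.0, 0.5                        # adaptation trade-off parameters (assumed)

for epoch in range(200):
    fs, Ss = model(Xs_t)                         # source features and predictions
    ft, _ = model(Xt_t)                          # target features
    loss = (cls_loss(Ss, Y_onehot)
            + alpha1 * coral_loss(fs, ft)
            + alpha2 * lmmd_loss(fs, ft, Ys_t, Yt_p, n_classes))
    opt.zero_grad()
    loss.backward()
    opt.step()

target_pred = model(Xt_t)[1].argmax(dim=1)       # classify the target-domain data
```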
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements should also be regarded as falling within the protection scope of the present invention.
Claims (3)
1. A hyperspectral image classification method based on a deep migration network, characterized by comprising the following steps:
step 1, performing dimensionality reduction on an original hyperspectral image by band selection: removing band redundancy to obtain reduced-dimension hyperspectral data X0, the hyperspectral data X0 comprising reduced-dimension source-domain data and target-domain data;
Step 2, training an auxiliary classifier with the labeled source-domain samples, and obtaining target-domain pseudo labels with the auxiliary classifier;
step 3, constructing a deep migration neural network, adapting a source domain and a target domain by using a domain adaptation layer based on CORAL loss to reduce the second-order statistic difference, and reducing the first-order statistic difference of the source domain and the target related subspace based on LMMD;
and step 4, classifying the target-domain data with the trained auxiliary classifier.
2. The hyperspectral image classification method based on the deep migration network according to claim 1, characterized in that, in step 1, the original hyperspectral image is reduced in dimension by band selection, removing band redundancy to obtain the reduced-dimension hyperspectral data X0; the method specifically comprises the following steps:
defining the original HSI wave band number as NbIn the number of intervals ofAndrespectively selecting a wave band and b wave bands, and the number of the wave bands after dimensionality reduction is d, whereinRepresenting a rounding down operation, we can obtain:
defining the hyperspectral data X after dimensionality reduction0∈Rn×dAs input to the model, n represents the number of samples, spectral data X0Including reduced-dimension source domain dataAnd target domain dataWherein,andwherein n issRepresenting the number of source domain samples, ntRepresenting the number of samples in the target domain, YsIs a source domain label.
3. The hyperspectral image classification method based on the deep migration network according to claim 1, characterized in that, in step 3, a deep migration neural network is constructed, a domain adaptation layer is used to adapt the source domain and the target domain based on the CORAL method to reduce the second-order statistic difference, and the first-order statistic difference of the related subspaces of the source and target domains is reduced based on LMMD; the method specifically comprises the following steps:
the deep migration network DTN comprises a fully connected layer, a nonlinear layer, a domain adaptation layer and a Softmax layer; the reduced-dimension source-domain data and target-domain data are input into the deep migration network DTN, and features are extracted by a feature extractor consisting of the fully connected layer and the nonlinear layer; the output of the fully connected layer is:
F1 = I × W1 + b1
where I is the fully connected layer input, W1 is the weight matrix, and b1 is the bias; the fully connected layer output is fed into the nonlinear layer, whose output is a nonlinear activation of the nonlinear-layer input IN;
the domain adaptation layer adapts the distribution difference between the two domains, and its output is connected to the Softmax layer; the loss function of the deep migration network DTN is defined as:
L = Lcls + α1·LCORAL + α2·LLMMD
where LCORAL is the covariance domain adaptation term, LLMMD is the subspace adaptation term, Lcls is the source-domain data classification loss, and α1 and α2 are the covariance-adaptation and subspace-adaptation parameters, respectively; the covariance domain adaptation term can be expressed as:
LCORAL = (1/(4·d1²))·||Cs − Ct||_F²
where d1 is the dimension of the domain adaptation layer input, and Cs and Ct denote the source-domain and target-domain data covariance matrices, respectively;
the subspace adaptation term LLMMD is expressed as a weighted discrepancy between the first-order statistics of the class-related source-domain and target-domain subdomains, where c ∈ {1, 2, ..., C} is the class index, the target-domain pseudo labels are obtained by the auxiliary classifier, and the weights indicate the degree to which each source-domain sample and each target-domain sample belongs to class c; the term is a weighted sum over class c, with the weights calculated from the source-domain labels and the target-domain pseudo labels;
the source-domain data classification loss Lcls is computed from the category matrix Y and the prediction result S of the deep migration network DTN model, where C is the number of categories.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111503748.XA CN114494762B (en) | 2021-12-10 | 2021-12-10 | Hyperspectral image classification method based on depth migration network |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111503748.XA CN114494762B (en) | 2021-12-10 | 2021-12-10 | Hyperspectral image classification method based on depth migration network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114494762A true CN114494762A (en) | 2022-05-13 |
CN114494762B CN114494762B (en) | 2024-09-24 |
Family
ID=81492795
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111503748.XA Active CN114494762B (en) | 2021-12-10 | 2021-12-10 | Hyperspectral image classification method based on depth migration network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114494762B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116188830A (en) * | 2022-11-01 | 2023-05-30 | 青岛柯锐思德电子科技有限公司 | Hyperspectral image cross-domain classification method based on multi-level feature alignment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109359623A (en) * | 2018-11-13 | 2019-02-19 | 西北工业大学 | High spectrum image based on depth Joint Distribution adaptation network migrates classification method |
KR102197297B1 (en) * | 2019-09-27 | 2020-12-31 | 서울대학교산학협력단 | Change detection method using recurrent 3-dimensional fully convolutional network for hyperspectral image |
-
2021
- 2021-12-10 CN CN202111503748.XA patent/CN114494762B/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109359623A (en) * | 2018-11-13 | 2019-02-19 | 西北工业大学 | High spectrum image based on depth Joint Distribution adaptation network migrates classification method |
KR102197297B1 (en) * | 2019-09-27 | 2020-12-31 | 서울대학교산학협력단 | Change detection method using recurrent 3-dimensional fully convolutional network for hyperspectral image |
Non-Patent Citations (1)
Title |
---|
毛远宏; 贺占庄; 马钟: "重构迁移学习的红外目标分类" [Infrared target classification based on reconstruction transfer learning], 电子科技大学学报 [Journal of University of Electronic Science and Technology of China], no. 04, 30 July 2020 (2020-07-30) *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116188830A (en) * | 2022-11-01 | 2023-05-30 | 青岛柯锐思德电子科技有限公司 | Hyperspectral image cross-domain classification method based on multi-level feature alignment |
CN116188830B (en) * | 2022-11-01 | 2023-09-29 | 青岛柯锐思德电子科技有限公司 | Hyperspectral image cross-domain classification method based on multi-level feature alignment |
Also Published As
Publication number | Publication date |
---|---|
CN114494762B (en) | 2024-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111414942B (en) | Remote sensing image classification method based on active learning and convolutional neural network | |
CN111695467B (en) | Spatial spectrum full convolution hyperspectral image classification method based on super-pixel sample expansion | |
CN106203523B (en) | The hyperspectral image classification method of the semi-supervised algorithm fusion of decision tree is promoted based on gradient | |
CN111274869B (en) | Method for classifying hyperspectral images based on parallel attention mechanism residual error network | |
CN111598001B (en) | Identification method for apple tree diseases and insect pests based on image processing | |
CN112241762B (en) | Fine-grained identification method for pest and disease damage image classification | |
CN111695456B (en) | Low-resolution face recognition method based on active discriminant cross-domain alignment | |
CN111401426B (en) | Small sample hyperspectral image classification method based on pseudo label learning | |
CN105205449A (en) | Sign language recognition method based on deep learning | |
CN110175248B (en) | Face image retrieval method and device based on deep learning and Hash coding | |
Akhand et al. | Convolutional Neural Network based Handwritten Bengali and Bengali-English Mixed Numeral Recognition. | |
CN113947725B (en) | Hyperspectral image classification method based on convolution width migration network | |
CN112069900A (en) | Bill character recognition method and system based on convolutional neural network | |
Akhand et al. | Convolutional neural network training with artificial pattern for bangla handwritten numeral recognition | |
CN114972904B (en) | Zero sample knowledge distillation method and system based on fighting against triplet loss | |
WO2021128704A1 (en) | Open set classification method based on classification utility | |
Sethy et al. | Off-line Odia handwritten numeral recognition using neural network: a comparative analysis | |
CN107633264B (en) | Linear consensus integrated fusion classification method based on space spectrum multi-feature extreme learning | |
CN114494762A (en) | Hyperspectral image classification method based on deep migration network | |
CN116596891B (en) | Wood floor color classification and defect detection method based on semi-supervised multitasking detection | |
CN114049567B (en) | Adaptive soft label generation method and application in hyperspectral image classification | |
Kumar et al. | Siamese based Neural Network for Offline Writer Identification on word level data | |
CN113723456B (en) | Automatic astronomical image classification method and system based on unsupervised machine learning | |
CN103530658B (en) | A kind of plant leaf blade data recognition methods based on rarefaction representation | |
CN114519787B (en) | Depth feature visualization interpretation method of intelligent handwriting evaluation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |