CN107194437B - Image classification method based on Gist feature extraction and concept machine recurrent neural network - Google Patents


Info

Publication number
CN107194437B
CN107194437B (application CN201710481975.4A)
Authority
CN
China
Prior art keywords
neural network
image
machine
recurrent neural
concept
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710481975.4A
Other languages
Chinese (zh)
Other versions
CN107194437A (en
Inventor
李秀敏 (Li Xiumin)
刘阳阳 (Liu Yangyang)
薛方正 (Xue Fangzheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN201710481975.4A
Publication of CN107194437A
Application granted
Publication of CN107194437B
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/32Digital ink
    • G06V30/333Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/32Digital ink
    • G06V30/36Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image classification method based on Gist feature extraction and a concept machine recurrent neural network, comprising the following steps: 1) extracting Gist features of the image; 2) building a concept machine recurrent neural network; 3) inputting the features generated in step 1) into the concept machine recurrent neural network as a training sample set; 4) exciting the internal state of the reserve pool with the training sample set of the images; 5) computing and storing the concept machine matrix of the images according to the internal state of the reserve pool; 6) inputting the features generated in step 1) into the concept machine recurrent neural network as a test sample set to excite the internal state of the reserve pool, and identifying the images according to the concept machine matrix and the reserve pool states computed in step 5). The invention can better handle the image classification problem, and the concept machine matrices of the images computed through steps 1), 3), 4) and 5) enable the concept machine recurrent neural network to learn and identify new images incrementally.

Description

Image classification method based on Gist feature extraction and concept machine recurrent neural network
Technical Field
The invention relates to the technical field of image processing, in particular to an image classification method.
Background
Image recognition and classification is an important direction in the fields of computer vision and image understanding. At present, many models have achieved notable results on the image classification problem: both the support vector machines of the past and the convolutional neural networks of today reach high accuracy. However, the structure of such a classifier is fixed once learning is complete; it cannot be changed, and new images cannot be learned and recognized incrementally on the original basis. If the classifier must learn to identify new images, its entire structure must be reinitialized, which necessarily incurs significant cost to retrain and relearn the model.
Disclosure of Invention
In view of the above, the present invention is directed to an image classification method based on Gist feature extraction and concept machine recurrent neural network, so as to implement incremental learning to identify new images without repeated learning on the learned images.
The invention relates to an image classification method based on Gist feature extraction and a concept machine recurrent neural network, which comprises the following steps:
1) extracting Gist features of an image
Filter the scene image with a Gabor filter, divide the filtered image into t × t grids, and in each grid extract the global feature information of the image using the discrete Fourier transform and the windowed Fourier transform;
The specific process is as follows: divide a grayscale image f(x, y) of size h × w into t × t grids of equal size, where each grid block has size h' × w', with h' = h/t and w' = w/t; perform convolution filtering on each grid block image with filters of m channels, concatenate the m channels' filtering results to form the features of the grid block, and average the feature values computed in each grid block to obtain the Gist feature of the grid block:
G' = (1/(h'*w')) * Σ_{x=1}^{h'} Σ_{y=1}^{w'} G(x, y)
where G' denotes the average feature value produced after filtering by the m-th channel and G(x, y) denotes the feature values produced after filtering by the m-th channel, each grid block producing m average feature values; concatenating the m average feature values produced in every grid block yields the Gist feature of the whole image, whose dimension is t × t × m;
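The grid-averaging procedure of step 1) can be sketched in a few lines of NumPy/SciPy. This is a minimal illustrative sketch, not the patented implementation: the Gabor kernel construction, the 7 × 7 kernel size, and the choice of 4 orientations × 2 frequencies (m = 8 channels) are assumptions made for brevity; the embodiment below uses m = 32 channels.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(size, theta, freq):
    # Real Gabor kernel: Gaussian envelope times an oriented cosine carrier.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rot = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * (0.5 * size) ** 2))
    return envelope * np.cos(2.0 * np.pi * freq * rot)

def gist_features(img, t=4,
                  thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
                  freqs=(0.2, 0.4)):
    # Filter the image with m = len(thetas) * len(freqs) Gabor channels,
    # split each filter response into t x t grid blocks of size h' x w',
    # and average each block (assumes h and w are divisible by t).
    h, w = img.shape
    hb, wb = h // t, w // t
    feats = []
    for theta in thetas:
        for freq in freqs:
            resp = np.abs(convolve2d(img, gabor_kernel(7, theta, freq), mode="same"))
            for i in range(t):
                for j in range(t):
                    feats.append(resp[i * hb:(i + 1) * hb, j * wb:(j + 1) * wb].mean())
    return np.array(feats)  # concatenated Gist feature, dimension t * t * m
```

With t = 4 and the m = 8 channels assumed here, the feature dimension is 4 × 4 × 8 = 128; the embodiment's 32 channels would give 512.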
2) building concept machine recurrent neural network
The concept machine recurrent neural network consists of input neurons, a reserve pool and output neurons; the connection matrix among the reserve pool neurons conforms to the ESN rule, i.e. the spectral radius β of the weight matrix of the internal reserve pool connections is less than 1, which guarantees that the reserve pool has the echo state property; the reserve pool activation function is the hyperbolic tangent, and the update equations of the network are:
x_j(n+1) = tanh(W*x_j(n) + W_in*p_j(n+1) + b)
y_j(n) = x_j(n)
where W_in is the input weight matrix between the input-layer neurons and the reserve pool neurons, composed of random numbers from the standard normal distribution; W is the connection weight matrix among the reserve pool neurons, generated from standard normal random numbers and then constrained to spectral radius β < 1; b is a bias of value 1; p_j(n+1) is the input of the concept machine recurrent neural network; W_in, W and b are fixed after generation;
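The reserve pool construction and the update equation above can be sketched as follows. This is a hedged NumPy sketch, not the patent's code: the reserve pool size and seed are arbitrary choices, and the spectral radius value 0.8 follows the W = 0.8*W/d rescaling given later in the embodiment.

```python
import numpy as np

def make_reservoir(n_in, n_res, rho=0.8, seed=0):
    # W_in and W are drawn from the standard normal distribution;
    # W is rescaled so its spectral radius equals rho < 1,
    # which gives the reserve pool the echo state property.
    rng = np.random.default_rng(seed)
    W_in = rng.standard_normal((n_res, n_in))
    W = rng.standard_normal((n_res, n_res))
    d = np.max(np.abs(np.linalg.eigvals(W)))  # largest |eigenvalue| of W
    return W_in, rho * W / d

def step(W, W_in, x, p, b=1.0):
    # x(n+1) = tanh(W*x(n) + W_in*p(n+1) + b), the network update equation.
    return np.tanh(W @ x + W_in @ p + b)
```

In the embodiment below, n_in = 512 (the Gist feature dimension) and n_res = 20.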
3) Input the image Gist features generated in step 1) into the concept machine recurrent neural network as the training sample set, i.e. p_j(n+1) is the training sample set produced by step 1);
4) exciting internal states of a reservoir from a training sample set of such images
Input the training sample set p_j of such images into the concept machine recurrent neural network, and compute and record the reserve pool internal state set {x_j} excited by the training sample set p_j;
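Step 4) — driving the reserve pool with the training samples and recording the excited internal states — can be sketched as below. This is an illustrative sketch under the update equation of step 2); the zero initial state is an assumption not stated in the patent.

```python
import numpy as np

def harvest_states(W, W_in, P, b=1.0):
    # Drive the reserve pool with the training sample set P (one sample
    # per column) and record the excited internal state set {x_j}.
    x = np.zeros(W.shape[0])
    states = []
    for n in range(P.shape[1]):
        x = np.tanh(W @ x + W_in @ P[:, n] + b)  # x(n+1) update equation
        states.append(x)
    return np.column_stack(states)  # N x L state matrix X
```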
5) Conceptual machine matrix for computing and storing such images
When the neural network is driven by pattern p_j, the N-dimensional excited neuron states {x_j} lie in a state point cloud whose geometric characteristics are determined by the driving pattern; the basic description of the geometry of {x_j} is an ellipse C_j whose principal axes are the principal components of the state set {x_j}; this ellipse C_j represents the concept machine of the pattern of interest in the recurrent neural network; C_j is obtained from the state set {x_j} by training with the learning rule; for the reserve pool state sequence x(1), …, x(L), the following cost function is constructed:
E_C(α) = (1/L) * Σ_{n=1}^{L} ||x(n) - C*x(n)||^2 + α^(-2) * ||C||_fro^2
where C is the concept machine matrix, which characterizes the reserve pool state space, and α ≥ 0 is a tuning parameter; the concept machine C minimizing this cost is obtained by the stochastic gradient descent method as:
C(R, α) = R*(R + α^(-2)*I)^(-1)
where R = X*X^T/L is the state correlation matrix and X is the matrix of the state set {x_j}; a suitable α is found according to the gradient of the squared Frobenius norm:
∇(α) = d||C(R, α)||_fro^2 / d(log α)

which measures the sensitivity of the concept machine C on an exponential scale; when ∇(α) reaches its maximum, the sensitivity of C to changes in the data is maximal;
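The concept machine of step 5) can be computed directly from the harvested states. A minimal NumPy sketch: it uses the closed form C = R(R + α⁻²I)⁻¹ stated above, which the patent says the stochastic gradient descent converges to.

```python
import numpy as np

def conceptor(X, alpha):
    # X: N x L matrix whose columns are reserve pool states x(n).
    # R = X*X^T / L is the state correlation matrix, and
    # C(R, alpha) = R (R + alpha^-2 I)^-1 minimises the cost
    # (1/L) sum_n ||x(n) - C x(n)||^2 + alpha^-2 ||C||_fro^2.
    N, L = X.shape
    R = X @ X.T / L
    return R @ np.linalg.inv(R + alpha ** -2 * np.eye(N))
```

A useful sanity check on this closed form: C is symmetric positive semi-definite with eigenvalues s/(s + α⁻²) in [0, 1), where s ranges over the eigenvalues of R — the "soft projection" onto the state ellipse.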
6) Identifying such images from the concept machine matrices and the reserve pool internal states excited by the test samples
The concept machines C_j are obtained from the training samples p_j, and the training samples p_1, p_2, … are loaded into the reserve pool; the reserve pool internal state set {x_i} of the test sample set p_i is obtained through steps 1), 3) and 4), and the category of the test picture is obtained through the following formulas:
E_j = X^T*C_j*X

j* = argmax_j tr(E_j)

where j* is the category to which the test image belongs.
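The classification rule of step 6) can be sketched as follows. The sketch is illustrative: reducing the evidence E_j = xᵀC_jx to a scalar per state, and taking the argmax over classes, is an assumption about how the evidence matrix is aggregated.

```python
import numpy as np

def conceptor(X, alpha=5.0):
    # C(R, alpha) = R (R + alpha^-2 I)^-1 with R = X*X^T / L.
    N, L = X.shape
    R = X @ X.T / L
    return R @ np.linalg.inv(R + alpha ** -2 * np.eye(N))

def classify(x, conceptors):
    # Evidence of a reserve pool state x for class j is E_j = x^T C_j x;
    # the test image is assigned to j* = argmax_j E_j.
    return int(np.argmax([x @ C @ x for C in conceptors]))
```

A state lying inside class j's state ellipse yields evidence near ||x||², while a state orthogonal to it yields evidence near zero, so the argmax picks the ellipse that best contains the test state.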
The invention has the beneficial effects that:
the image classification method based on Gist feature extraction and the concept machine recurrent neural network can better deal with the image classification problem, and the concept machine matrix of the image is calculated through the steps 1), 3), 4) and 5), so that the concept machine recurrent neural network can learn and identify new images in an incremental mode.
Drawings
FIG. 1 is a diagram of the concept machine recurrent neural network, in which "K input units" means K input neurons, "N internal units" means N reserve pool neurons, and "L output units" means L output neurons;
FIG. 2 shows the response curve of ∇(α), the gradient of the squared Frobenius norm of the concept machine with respect to log α;
FIG. 3 is a geometric description of a concept machine: the neuron state point cloud excited by the handwritten digit image set "0" pattern p_1, whose shape can be abstracted as an ellipse, namely the concept machine C_1;
FIG. 4 is a flowchart of an image classification method based on Gist feature extraction and concept machine recurrent neural network of the present invention.
Detailed Description
The invention is further described below with reference to the figures and examples.
To clearly illustrate the effectiveness of the present invention in classifying images, this embodiment performs an experiment of incremental learning and recognition on handwritten digit images, using the common MNIST data set. The handwritten digit images have 10 categories: the digits "0" through "9"; the training set has 60000 images, the test set has 10000 images, and each image is of size 28 × 28. The image classification method based on Gist feature extraction and a concept machine recurrent neural network in this embodiment specifically comprises the following steps:
1) Gist features are extracted for the handwritten digit image set "0". The Gist feature extraction method is an image global feature extraction method that forms an overall global description of the external scene to capture the context information of an image. According to this feature extraction method, the scene image is filtered by a Gabor filter, the filtered image is divided into t × t grids, and each grid then extracts the global feature information of the image using the discrete Fourier transform and the windowed Fourier transform.
The specific process is as follows: divide the 28 × 28 grayscale image f(x, y) into 4 × 4 grids of equal size, each grid block of size h' × w', where h' = 7 and w' = 7; perform convolution filtering on each grid block image with 32-channel filters, concatenate the 32 channels' filtering results to form the features of the grid block, and average the feature values computed in each grid block to obtain the Gist feature of the grid block:
G' = (1/(h'*w')) * Σ_{x=1}^{h'} Σ_{y=1}^{w'} G(x, y)
where G' denotes the average feature value produced after filtering by the m-th channel and G(x, y) denotes the feature values produced after filtering by the m-th channel; each grid block thus produces 32 average feature values. Concatenating the 32 average feature values produced in every grid block yields the Gist feature of the whole image, whose dimension is 4 × 4 × 32 = 512; this 512-dimensional Gist feature of the image serves as the training sample set.
2) The concept machine recurrent neural network is built as shown in FIG. 1. It comprises input neurons, a reserve pool and output neurons; the number of input neurons is 512, the number of reserve pool neurons is 20, and the number of output neurons is 10. The connection matrix among the reserve pool neurons conforms to the ESN rule, i.e. the spectral radius β of the weight matrix of the internal reserve pool connections is less than 1, which guarantees that the reserve pool has the echo state property; the update equations of the network are:
x_j(n+1) = tanh(W*x_j(n) + W_in*p_j(n+1) + b)
y_j(n) = x_j(n)
where W_in is the input weight matrix between the input-layer neurons and the reserve pool neurons, composed of random numbers from the standard normal distribution; W is the connection weight matrix among the reserve pool neurons, initialized by first generating standard normal random numbers, then computing the eigenvalue d = max(|eig(W)|) with the largest absolute value, and finally constraining the spectral radius to β < 1 by setting W = 0.8*W/d; b is a bias of value 1; p_j(n+1) is the input of the concept machine recurrent neural network; W_in, W and b are fixed after generation.
3) The features generated in step 1) are input into the concept machine recurrent neural network as the training sample set, i.e. p_1(n+1) is the Gist feature training sample set of the handwritten digit image set "0" generated by step 1).
4) The internal state of the reserve pool is excited by the training sample set of such images. The Gist feature training sample set p_1 of the handwritten digit image set "0" is input into the concept machine recurrent neural network, and the reserve pool internal state set {x_1} excited by the training sample set p_1 is computed and recorded.
5) The concept machine matrix of such images is computed and stored. When the neural network is driven by pattern p_1, the N-dimensional excited neuron states {x_1} lie in a state point cloud whose geometry is determined by the driving pattern. The basic description of the geometry of {x_1} is an ellipse C_1 whose principal axes are the principal components of the state set {x_1}. This ellipse C_1 represents the concept machine of the handwritten digit image set "0" in the recurrent neural network, as shown in FIG. 3. C_1 is obtained from the state set {x_1} by training with a simple learning rule. For the reserve pool state sequence x(1), …, x(L), the following cost function is constructed:
E_C(α) = (1/L) * Σ_{n=1}^{L} ||x(n) - C*x(n)||^2 + α^(-2) * ||C||_fro^2
where C is the concept machine matrix, which characterizes the reserve pool state space, and α ≥ 0 is a tuning parameter called the "aperture"; a balance point minimizing the objective function can be found by tuning α:
C(R, α) = R*(R + α^(-2)*I)^(-1)
where R = X*X^T/L is the state correlation matrix and X is the matrix of the state set {x_1}; a suitable "aperture" α is found according to the gradient of the squared Frobenius norm:
∇(α) = d||C(R, α)||_fro^2 / d(log α)

which measures the sensitivity (2-norm) of the concept machine C on an exponential scale; when ∇(α) reaches its maximum, C is most sensitive to data changes, similar to adjusting a camera aperture to maximize the sensitivity of the image to changes in brightness. As shown in FIG. 2, which depicts ∇(α) for this example, the maximum is reached when α = 5.2.
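The aperture search by the gradient of the squared Frobenius norm can be carried out numerically. A sketch under stated assumptions: the log-spaced grid of candidate apertures and the finite-difference approximation of the gradient are choices of this sketch, not prescribed by the patent.

```python
import numpy as np

def best_aperture(X, alphas):
    # Sweep alpha over a log grid, compute ||C(R, alpha)||_fro^2, and take a
    # finite-difference gradient with respect to log(alpha); the alpha at the
    # gradient's peak is where the concept machine C is most sensitive.
    N, L = X.shape
    R = X @ X.T / L
    norms = [np.sum((R @ np.linalg.inv(R + a ** -2 * np.eye(N))) ** 2)
             for a in alphas]
    grad = np.gradient(np.array(norms), np.log(alphas))
    return alphas[int(np.argmax(grad))], grad
```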
6) Such images are identified from the concept machine matrices and the reserve pool internal states excited by the test samples. Assume that the concept machines C_j have been obtained from the training samples p_j, and that the training samples p_1, p_2, … have all been loaded into the reserve pool. A test sample set p_i can then be identified: its reserve pool internal state set {x_i} is obtained through steps 1), 3) and 4), and the category of the test picture is obtained according to the following formulas:

E_j = X^T*C_j*X

j* = argmax_j tr(E_j)

where j* is the category to which the test image belongs.
By computing, through steps 1), 3), 4) and 5), the concept machine matrices of the handwritten digit image sets "1", "2", "3", "4", "5", "6", "7", "8" and "9" in turn, the concept machine recurrent neural network can learn and identify new images incrementally, without repeated learning on the images already learned; the learned image categories can be identified by step 6). The recognition accuracy on the MNIST data set is 98.6%.
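The incremental property can be illustrated end to end on toy data: a fixed reservoir is driven by each class in turn, and learning a new class only appends one new concept machine matrix without touching the existing ones. Everything in this sketch — the reservoir sizes, the toy "classes" built around random prototypes, and the aperture α = 5 — is an illustrative assumption, not the MNIST experiment of the embodiment.

```python
import numpy as np

rng = np.random.default_rng(6)

def conceptor(X, alpha=5.0):
    # Closed-form concept machine C = R (R + alpha^-2 I)^-1, R = X*X^T / L.
    N, L = X.shape
    R = X @ X.T / L
    return R @ np.linalg.inv(R + alpha ** -2 * np.eye(N))

# Fixed reservoir: generated once, never retrained.
N_IN, N_RES = 16, 20
W_in = rng.standard_normal((N_RES, N_IN))
W = rng.standard_normal((N_RES, N_RES))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.8 < 1

def drive(P):
    # Excite the reserve pool with the sample sequence P and collect states.
    x, xs = np.zeros(N_RES), []
    for n in range(P.shape[1]):
        x = np.tanh(W @ x + W_in @ P[:, n] + 1.0)
        xs.append(x)
    return np.column_stack(xs)

def make_class(seed):
    # Toy "image class": feature vectors near a fixed random prototype.
    proto = np.random.default_rng(seed).standard_normal(N_IN)
    return lambda L: proto[:, None] + 0.1 * rng.standard_normal((N_IN, L))

classes = [make_class(s) for s in range(3)]

# Incremental learning: each new class only appends one concept machine.
conceptors = [conceptor(drive(sample(200))) for sample in classes]

def predict(P):
    # Aggregate per-state evidence x^T C_j x and take the argmax over classes.
    X = drive(P)
    return int(np.argmax([np.trace(X.T @ C @ X) for C in conceptors]))
```

Adding a fourth class would only append `conceptor(drive(new_samples))` to the list; the reservoir weights and the three existing matrices stay untouched.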
Finally, the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, and all such modifications should be covered by the claims of the present invention.

Claims (1)

1. An image classification method based on Gist feature extraction and a concept machine recurrent neural network, characterized by comprising the following steps:
1) extracting Gist features of an image
Filter the scene image with a Gabor filter, divide the filtered image into t × t grids, and in each grid extract the global feature information of the image using the discrete Fourier transform and the windowed Fourier transform;
The specific process is as follows: divide a grayscale image f(x, y) of size h × w into t × t grids of equal size, where each grid block has size h' × w', with h' = h/t and w' = w/t; perform convolution filtering on each grid block image with filters of m channels, concatenate the m channels' filtering results to form the features of the grid block, and average the feature values computed in each grid block to obtain the Gist feature of the grid block:
G' = (1/(h'*w')) * Σ_{x=1}^{h'} Σ_{y=1}^{w'} G(x, y)
where G' denotes the average feature value produced after filtering by the m-th channel and G(x, y) denotes the feature values produced after filtering by the m-th channel, each grid block producing m average feature values; concatenating the m average feature values produced in every grid block yields the Gist feature of the whole image, whose dimension is t × t × m;
2) building concept machine recurrent neural network
The concept machine recurrent neural network consists of input neurons, a reserve pool and output neurons; the connection matrix among the reserve pool neurons conforms to the ESN rule, i.e. the spectral radius β of the weight matrix of the internal reserve pool connections is less than 1, which guarantees that the reserve pool has the echo state property; the reserve pool activation function is the hyperbolic tangent, and the update equations of the network are:
x_j(n+1) = tanh(W*x_j(n) + W_in*p_j(n+1) + b)
y_j(n) = x_j(n)
where W_in is the input weight matrix between the input-layer neurons and the reserve pool neurons, composed of random numbers from the standard normal distribution; W is the connection weight matrix among the reserve pool neurons, generated from standard normal random numbers and then constrained to spectral radius β < 1; b is a bias of value 1; p_j(n+1) is the input of the concept machine recurrent neural network; W_in, W and b are fixed after generation;
3) Input the image Gist features generated in step 1) into the concept machine recurrent neural network as the training sample set, i.e. p_j(n+1) is the training sample set produced by step 1);
4) exciting internal states of a reservoir from a training sample set of such images
Input the training sample set p_j of such images into the concept machine recurrent neural network, and compute and record the reserve pool internal state set {x_j} excited by the training sample set p_j;
5) Conceptual machine matrix for computing and storing such images
When the neural network is driven by pattern p_j, the N-dimensional excited neuron states {x_j} lie in a state point cloud whose geometric characteristics are determined by the driving pattern; the basic description of the geometry of {x_j} is an ellipse C_j whose principal axes are the principal components of the state set {x_j}; this ellipse C_j represents the concept machine of the pattern of interest in the recurrent neural network; C_j is obtained from the state set {x_j} by training with the learning rule; for the reserve pool state sequence x(1), …, x(L), the following cost function is constructed:
E_C(α) = (1/L) * Σ_{n=1}^{L} ||x(n) - C*x(n)||^2 + α^(-2) * ||C||_fro^2
where C is the concept machine matrix, which characterizes the reserve pool state space, and α ≥ 0 is a tuning parameter; the concept machine C minimizing this cost is obtained by the stochastic gradient descent method as:
C(R, α) = R*(R + α^(-2)*I)^(-1)
where R = X*X^T/L is the state correlation matrix and X is the matrix of the state set {x_j}; a suitable α is found according to the gradient of the squared Frobenius norm:
∇(α) = d||C(R, α)||_fro^2 / d(log α)

which measures the sensitivity of the concept machine C on an exponential scale; when ∇(α) reaches its maximum, the sensitivity of C to changes in the data is maximal;
6) identifying such images based on the internal state of the reservoir excited by the conceptual machine matrix and the test sample
The concept machines C_j are obtained from the training samples p_j, and the training samples p_1, p_2, … are loaded into the reserve pool; the reserve pool internal state set {x_i} of the test sample set p_i is obtained through steps 1), 3) and 4), and the category of the test picture is obtained through the following formulas:
E_j = X^T*C_j*X

j* = argmax_j tr(E_j)

where j* is the category to which the test image belongs.
CN201710481975.4A 2017-06-22 2017-06-22 Image classification method based on Gist feature extraction and concept machine recurrent neural network Expired - Fee Related CN107194437B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710481975.4A CN107194437B (en) 2017-06-22 2017-06-22 Image classification method based on Gist feature extraction and concept machine recurrent neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710481975.4A CN107194437B (en) 2017-06-22 2017-06-22 Image classification method based on Gist feature extraction and concept machine recurrent neural network

Publications (2)

Publication Number Publication Date
CN107194437A CN107194437A (en) 2017-09-22
CN107194437B true CN107194437B (en) 2020-04-07

Family

ID=59879552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710481975.4A Expired - Fee Related CN107194437B (en) 2017-06-22 2017-06-22 Image classification method based on Gist feature extraction and concept machine recurrent neural network

Country Status (1)

Country Link
CN (1) CN107194437B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107808139B (en) * 2017-11-01 2021-08-06 电子科技大学 Real-time monitoring threat analysis method and system based on deep learning
CN108256463B (en) * 2018-01-10 2022-01-04 南开大学 Mobile robot scene recognition method based on ESN neural network
CN109102002A (en) * 2018-07-17 2018-12-28 重庆大学 In conjunction with the image classification method of convolutional neural networks and conceptual machine recurrent neural network
CN109190708A (en) * 2018-09-12 2019-01-11 重庆大学 The conceptual machine neural network image classification method of view-based access control model cortex treatment mechanism
CN113469271A (en) * 2021-07-19 2021-10-01 北京邮电大学 Image classification method based on Echo State Network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104572940A (en) * 2014-12-30 2015-04-29 中国人民解放军海军航空工程学院 Automatic image annotation method based on deep learning and canonical correlation analysis
CN104598920A (en) * 2014-12-30 2015-05-06 中国人民解放军国防科学技术大学 Scene classification method based on Gist characteristics and extreme learning machine
CN106778768A (en) * 2016-11-22 2017-05-31 广西师范大学 Image scene classification method based on multi-feature fusion
CN106815601A (en) * 2017-01-10 2017-06-09 西安电子科技大学 Hyperspectral image classification method based on recurrent neural network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170140240A1 (en) * 2015-07-27 2017-05-18 Salesforce.Com, Inc. Neural network combined image and text evaluator and classifier


Also Published As

Publication number Publication date
CN107194437A (en) 2017-09-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200407

Termination date: 20210622