CN113989575A - Small sample image classification method and system based on specific parameter distribution generation - Google Patents

Small sample image classification method and system based on specific parameter distribution generation Download PDF

Info

Publication number
CN113989575A
Authority
CN
China
Prior art keywords
small sample
sample image
image classification
network
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111465249.6A
Other languages
Chinese (zh)
Inventor
刘东博
展华益
文艺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Cric Technology Co ltd
Original Assignee
Sichuan Cric Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Cric Technology Co ltd
Priority to CN202111465249.6A
Publication of CN113989575A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the field of small sample image classification, and in particular to a small sample image classification method and system based on specific parameter distribution generation, which can effectively improve the accuracy of small sample image classification. The technical scheme comprises the following steps: constructing the overall architecture of a small sample image classification neural network; using the parameters of various convolutional neural networks as a parameter training set; training a distribution learning network with the parameter training set; generating the initial parameters of the small sample image classification neural network through the trained distribution learning network; constructing a parameter adjustment network; training the small sample image classification neural network on a target data set under the adjustment of the parameter adjustment network; and inputting an image to be classified, extracting its features through the small sample image classification neural network, and judging its category. The method is suitable for classifying small sample images.

Description

Small sample image classification method and system based on specific parameter distribution generation
Technical Field
The invention relates to the field of small sample image classification, in particular to a small sample image classification method and system based on specific parameter distribution generation.
Background
Training an image classification neural network is highly complex, because the network must learn "knowledge" from large amounts of data through repeated iterative training. In a small sample learning task, however, there are not enough samples for the neural network to learn from. One common remedy is to compensate for the lack of training data by transferring the knowledge of other, adequately trained networks into the small sample learning neural network. This method of transferring knowledge from one network to another is called transfer learning. In recent years, more and more small sample learning work has been combined with transfer learning, which can transfer the knowledge of a neural network adequately trained on a large data set into a new network that processes a new task. The mainstream neural network transfer learning methods include direct parameter transfer and parameter distribution transfer: direct parameter transfer directly takes trained network parameters as part of the parameters of the current task network, whereas parameter distribution transfer takes the distribution of the trained network parameters as a regularization term to guide the training of the current task network.
However, current mainstream neural network transfer learning methods train the network model directly on the existing data, and their classification accuracy is not high.
Disclosure of Invention
The invention aims to provide a small sample image classification method and system based on specific parameter distribution generation, which can effectively improve the accuracy of small sample image classification.
To achieve this purpose, the invention adopts the following technical scheme. The small sample image classification method based on specific parameter distribution generation comprises the following steps:
step 1, constructing the overall architecture of a small sample image classification neural network, wherein the overall architecture comprises the architecture of a distribution learning network, the architecture of a parameter adjustment network and the architecture of the small sample image classification neural network;
step 2, obtaining the parameters of various convolutional neural networks as a parameter training set;
step 3, training the distribution learning network by using the parameter training set;
step 4, generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
step 5, under the adjustment of the parameter adjustment network, training the small sample image classification neural network by using a target data set, wherein the target data set is the set of images to be classified;
and step 6, inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
Further, in step 1, the parameter adjustment network and the distribution learning network have the same structure and parameters.
Further, in step 2, the plurality of convolutional neural networks include VGG, DenseNet, AlexNet and ResNet convolutional neural networks.
Further, in step 3, the distribution learning network is constructed as a generative adversarial network (GAN), and its optimization objective function is as follows:
$$\min_G \max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

where $G$ and $D$ denote the generator and the discriminator, respectively, $x$ is data from the training data set, $z$ is data sampled from the model, $P_{data}$ represents the distribution of the real data, $P_z$ represents the distribution of the simulated data, and $\mathbb{E}_{x \sim P_{data}}[\cdot]$ denotes the expectation with respect to $x$ drawn from $P_{data}$.
Further, according to the strategy of first optimizing D and then optimizing G, the optimization objective function is decomposed into the discriminator step

$$\max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

and the generator step

$$\min_G V(D,G) = \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))].$$
further, in step 4, a specific method for generating the initial parameter includes:
randomly generating random Gaussian noise images with the number of convolution kernels meeting the requirement of the small sample learning network being N and 7 multiplied by 7, and then inputting the Gaussian noise images into the distribution learning network to generate initial parameters of the small sample image classification neural network.
Further, in step 5, the specific method for training the small sample image classification neural network by using the target data set includes:
performing adjustment training on the small sample image classification neural network by using the target data set to obtain new parameters; adjusting the new parameters by using the parameter adjustment network; and then replacing the corresponding original parameters in the small sample image classification neural network with the adjusted parameters, and continuing the adjustment training on the target data set until the loss function of the small sample image classification neural network on the target data set reaches a set condition.
Further, in step 6, the specific method for extracting the features of the image through the small sample image classification neural network and judging its category includes:
performing feature extraction on the input image by using the small sample image classification neural network, wherein the features are the feature maps obtained after convolution by the convolutional layers of the network; pooling the feature maps to reduce their dimensionality to a set value; and classifying them with the final fully connected layer to judge the category of the image.
The small sample image classification system based on specific parameter distribution generation is applied to the above small sample image classification method based on specific parameter distribution generation, and comprises:
a parameter generation module, which is used for constructing the overall architecture of the small sample image classification neural network, wherein the overall architecture comprises the architecture of a distribution learning network, the architecture of a parameter adjustment network and the architecture of the small sample image classification neural network; taking the parameters of various convolutional neural networks as a parameter training set; training the distribution learning network by using the parameter training set; and generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
a network training module, which is used for training the small sample image classification neural network by using a target data set under the adjustment of the parameter adjustment network, wherein the target data set is the set of images to be classified; and
an online classification module, which is used for inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
The invention has the following beneficial effects: a network framework comprising a distribution learning network, a parameter adjustment network and a small sample image classification network is constructed, and the parameters of the small sample image classification network can be generated directly, so that the network is not trained directly on the existing data as in traditional methods. This ensures the rationality of the network framework. Only a small number of labeled samples are needed to achieve high classification accuracy, so the method provided by the invention is extremely low in cost and easy to implement, and can effectively improve the accuracy of small sample image classification.
Drawings
FIG. 1 is a flow chart of the small sample image classification method based on specific parameter distribution generation according to the present invention.
FIG. 2 is a block diagram of the small sample image classification system based on specific parameter distribution generation according to the present invention.
FIG. 3 is a schematic diagram of the general architecture of the present invention.
In the drawings, 101 denotes the transfer source network parameters, 102 the initial parameters, 103 the small sample learning network, and 104 the small sample training data; a denotes the parameter generation method flow, b the network training method flow, and c the online classification method flow.
Detailed Description
The invention is described in more detail below with reference to the figures and examples.
Fig. 1 is a flowchart of the small sample image classification method based on specific parameter distribution generation according to the present invention, which includes the parameter generation method flow a, the network training method flow b and the online classification method flow c. The specific steps are as follows:
step 1, constructing the overall architecture of the small sample image classification neural network;
step 2, obtaining the parameters of various convolutional neural networks as a parameter training set;
step 3, training a distribution learning network by using the parameter training set;
step 4, generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
step 5, constructing a parameter adjustment network, wherein the parameter adjustment network and the distribution learning network have the same parameters;
step 6, under the adjustment of the parameter adjustment network, training the small sample image classification neural network by using a target data set, wherein the target data set is the set of images to be classified;
and step 7, inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
Steps 1 to 4 constitute the parameter generation method flow, steps 5 to 6 the network training method flow, and step 7 the online classification method flow.
In step 1, the overall architecture comprises the distribution learning network, the parameter adjustment network and the small sample image classification neural network.
In step 2, the plurality of convolutional neural networks include VGG, DenseNet, AlexNet and ResNet convolutional neural networks.
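A minimal sketch of how such a parameter training set might be collected, assuming pretrained torchvision backbones (torchvision >= 0.13) serve as the source networks; the specific model variants and the choice to harvest every Conv2d layer are assumptions for illustration, not requirements of the patent:

```python
import torch
import torchvision.models as models

# Assumed source backbones pretrained on ImageNet.
backbones = [
    models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1),
    models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1),
    models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1),
    models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1),
]

# Collect every 2-D convolution kernel as one training sample; kernels of
# different sizes are kept as-is here and normalized to 7x7 later, as
# described in the normalization step below.
kernel_samples = []
for net in backbones:
    for module in net.modules():
        if isinstance(module, torch.nn.Conv2d):
            w = module.weight.detach()            # shape (out_ch, in_ch, kH, kW)
            kernel_samples.extend(w.reshape(-1, w.shape[-2], w.shape[-1]).unbind(0))
```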
In step 3, the distribution learning network is constructed as a generative adversarial network (GAN), and its optimization objective function is as follows:
$$\min_G \max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

where $G$ and $D$ denote the generator and the discriminator, respectively, $x$ is data from the training data set, $z$ is data sampled from the model, $P_{data}$ represents the distribution of the real data, $P_z$ represents the distribution of the simulated data, and $\mathbb{E}_{x \sim P_{data}}[\cdot]$ denotes the expectation with respect to $x$ drawn from $P_{data}$.
According to the strategy of first optimizing D and then optimizing G, the optimization objective function is decomposed into the discriminator step

$$\max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

and the generator step

$$\min_G V(D,G) = \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))].$$
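A hedged PyTorch sketch of this alternating optimization; the generator and discriminator architectures below are illustrative assumptions (the patent does not fix them), and the 7×7 kernels are handled in flattened 49-dimensional form:

```python
import torch
import torch.nn as nn

# Illustrative generator G and discriminator D over flattened 7x7 kernels.
G = nn.Sequential(nn.Linear(49, 128), nn.ReLU(), nn.Linear(128, 49))
D = nn.Sequential(nn.Linear(49, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1), nn.Sigmoid())

opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
bce = nn.BCELoss()

def gan_step(real_kernels):            # real_kernels: (batch, 49) from the parameter training set
    batch = real_kernels.size(0)
    z = torch.randn(batch, 49)         # flattened 7x7 Gaussian noise images, z ~ P_z

    # Step 1: optimize D, i.e. maximize log D(x) + log(1 - D(G(z))).
    opt_d.zero_grad()
    loss_d = bce(D(real_kernels), torch.ones(batch, 1)) + \
             bce(D(G(z).detach()), torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # Step 2: optimize G against the fixed D (shown here in the common
    # non-saturating form of minimizing log(1 - D(G(z)))).
    opt_g.zero_grad()
    loss_g = bce(D(G(z)), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()
```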
in step 4, the specific method for generating the initial parameters includes:
randomly generating random Gaussian noise images with the number of convolution kernels meeting the requirement of the small sample learning network being N and 7 multiplied by 7, and then inputting the Gaussian noise images into the distribution learning network to generate initial parameters of the small sample image classification neural network.
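Under the same assumptions, this generation step might look as follows; `generator` stands for the trained distribution learning network and `N` for the number of convolution kernels required by the small sample learning network, both of which are placeholders:

```python
import torch

def generate_initial_kernels(generator, N):
    """Feed N random 7x7 Gaussian noise images through the trained
    distribution learning network to obtain initial 7x7 kernels."""
    noise = torch.randn(N, 49)             # N flattened 7x7 Gaussian noise images
    with torch.no_grad():
        kernels = generator(noise)          # assumed to return N flattened kernels
    return kernels.view(N, 7, 7)
```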
In step 6, the specific method for training the small sample image classification neural network by using the target data set is as follows:
perform adjustment training on the small sample image classification neural network by using the target data set to obtain new parameters; adjust the new parameters by using the parameter adjustment network; then replace the corresponding original parameters in the small sample image classification neural network with the adjusted parameters, and continue the adjustment training on the target data set until the loss function of the small sample image classification neural network on the target data set reaches a set condition.
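A sketch of this adjust-replace-retrain loop under stated assumptions: `small_net` is the small sample image classification neural network, `adjuster` is the parameter adjustment network operating on flattened 7×7 kernels, `conv_layers` lists the convolutional layers whose parameters are adjusted, and the stopping threshold is a placeholder for the "set condition":

```python
import torch
import torch.nn.functional as F

def train_with_adjustment(small_net, adjuster, loader, conv_layers,
                          loss_threshold=0.05, max_rounds=100, lr=1e-3):
    opt = torch.optim.SGD(small_net.parameters(), lr=lr)
    for _ in range(max_rounds):
        # 1) Adjustment training on the target (small sample) data set.
        running = 0.0
        for images, labels in loader:
            opt.zero_grad()
            loss = F.cross_entropy(small_net(images), labels)
            loss.backward()
            opt.step()
            running += loss.item()
        # 2) Pass the new kernels through the parameter adjustment network and
        #    write the adjusted kernels back into the classification network.
        with torch.no_grad():
            for layer in conv_layers:
                w = layer.weight                                   # (out, in, 7, 7) assumed
                layer.weight.copy_(adjuster(w.view(-1, 49)).view_as(w))
        # 3) Stop once the training loss reaches the set condition.
        if running / len(loader) < loss_threshold:
            break
```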
In step 7, the specific method for extracting the features of the image through the small sample image classification neural network and judging its category is as follows:
perform feature extraction on the input image by using the small sample image classification neural network, where the features are the feature maps obtained after convolution by the convolutional layers of the network; pool the feature maps to reduce their dimensionality to a set value, classify them with the final fully connected layer, and judge the category of the image.
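An illustrative forward pass matching this description; the layer widths, the pooled size and the number of classes are assumptions rather than values specified by the patent:

```python
import torch
import torch.nn as nn

class SmallSampleClassifier(nn.Module):
    def __init__(self, num_classes=5):
        super().__init__()
        # Convolutional layers whose 7x7 kernels are generated and adjusted as above.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool2d((4, 4))      # pool feature maps down to a set size
        self.fc = nn.Linear(64 * 4 * 4, num_classes)  # final fully connected classifier

    def forward(self, x):
        fmaps = self.features(x)                      # feature maps after convolution
        return self.fc(self.pool(fmaps).flatten(1))   # class scores

# Judging the category of a batch of images of shape (B, 3, H, W):
# predicted = SmallSampleClassifier()(images).argmax(dim=1)
```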
Fig. 2 is a block diagram of the small sample image classification system based on specific parameter distribution generation according to the present invention, which includes:
a parameter generation module, which is used for constructing the overall architecture of the small sample image classification neural network, taking the parameters of various convolutional neural networks as a parameter training set, training the distribution learning network by using the parameter training set, and generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
a network training module, which is used for constructing the parameter adjustment network, where the parameter adjustment network and the distribution learning network have the same parameters, and for training the small sample image classification neural network by using a target data set under the adjustment of the parameter adjustment network, where the target data set is the set of images to be classified; and
an online classification module, which is used for inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
When the corresponding parameters are acquired from a convolutional neural network, a normalization operation needs to be performed on the convolution kernels. The specific operation is as follows:
select the convolution kernel with the largest size as the basic template, the selected template size being 7×7, and then expand convolution kernels of other sizes to this uniform size. Convolution kernels smaller than a quarter of 7×7, such as 1×1, 2×2 and 3×3, are expanded by an integer factor and the remaining positions are filled with 0; convolution kernels of other sizes are directly padded with 0 to 7×7. Finally, the mean and variance of all convolution kernels are computed and used to normalize them.
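A sketch of this normalization under stated assumptions: "expanded by an integer factor" is interpreted here as repeating each kernel entry an integer number of times along both axes, kernels larger than 7×7 are assumed absent, and the final step standardizes all kernels with their overall mean and standard deviation:

```python
import torch
import torch.nn.functional as F

def pad_to_template(kernel, template=7):
    """Bring a single 2-D kernel to the 7x7 template size."""
    h, w = kernel.shape
    if h * w < (template * template) / 4:    # 1x1, 2x2 and 3x3 kernels
        factor = template // h               # integer expansion factor
        kernel = kernel.repeat_interleave(factor, 0).repeat_interleave(factor, 1)
        h, w = kernel.shape
    # Fill the remaining positions with zeros.
    return F.pad(kernel, (0, template - w, 0, template - h))

def normalize_kernels(kernels, template=7):
    """kernels: list of 2-D tensors of varying size -> (M, 7, 7), standardized."""
    stacked = torch.stack([pad_to_template(k, template) for k in kernels])
    return (stacked - stacked.mean()) / (stacked.std() + 1e-8)
```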
FIG. 3 is a schematic diagram of the overall architecture of the present invention, which includes the architecture of the distribution learning network, the architecture of the parameter adjustment network and the architecture of the small sample image classification neural network 103. The distribution learning network is trained using the transfer source network parameters 101 and generates the initial parameters 102; under the adjustment of the parameter adjustment network, the small sample image classification neural network is trained using the small sample training data 104, which is the set of images to be classified. After training is finished, an image to be classified is input, its features are extracted by the small sample image classification neural network, and its category is judged.
It is to be understood that the specific embodiments described herein are merely illustrative of some embodiments of the invention and do not represent all possible embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.

Claims (9)

1. A small sample image classification method based on specific parameter distribution generation, characterized by comprising the following steps:
step 1, constructing the overall architecture of a small sample image classification neural network, wherein the overall architecture comprises the architecture of a distribution learning network, the architecture of a parameter adjustment network and the architecture of the small sample image classification neural network;
step 2, obtaining the parameters of various convolutional neural networks as a parameter training set;
step 3, training the distribution learning network by using the parameter training set;
step 4, generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
step 5, under the adjustment of the parameter adjustment network, training the small sample image classification neural network by using a target data set, wherein the target data set is the set of images to be classified;
and step 6, inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
2. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein the parameter adjustment network and the distribution learning network have the same structure and parameters.
3. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein in step 2, the plurality of convolutional neural networks comprise VGG, DenseNet, AlexNet and ResNet convolutional neural networks.
4. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein in step 3, the distribution learning network is constructed as a generative adversarial network (GAN), and its optimization objective function is as follows:
$$\min_G \max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

where $G$ and $D$ denote the generator and the discriminator, respectively, $x$ is data from the training data set, $z$ is data sampled from the model, $P_{data}$ represents the distribution of the real data, $P_z$ represents the distribution of the simulated data, and $\mathbb{E}_{x \sim P_{data}}[\cdot]$ denotes the expectation with respect to $x$ drawn from $P_{data}$.
5. The small sample image classification method based on specific parameter distribution generation according to claim 4, wherein, according to the strategy of first optimizing D and then optimizing G, the optimization objective function is decomposed into the discriminator step

$$\max_D V(D,G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

and the generator step

$$\min_G V(D,G) = \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))].$$
6. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein in step 4, the specific method for generating the initial parameters comprises:
randomly generating N 7×7 Gaussian noise images, where N is the number of convolution kernels required by the small sample learning network, and then inputting the Gaussian noise images into the distribution learning network to generate the initial parameters of the small sample image classification neural network.
7. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein in step 5, the specific method for training the small sample image classification neural network by using the target data set comprises:
performing adjustment training on the small sample image classification neural network by using the target data set to obtain new parameters; adjusting the new parameters by using the parameter adjustment network; and then replacing the corresponding original parameters in the small sample image classification neural network with the adjusted parameters, and continuing the adjustment training on the target data set until the loss function of the small sample image classification neural network on the target data set reaches a set condition.
8. The small sample image classification method based on specific parameter distribution generation according to claim 1, wherein in step 6, the specific method for extracting the features of the image through the small sample image classification neural network and judging its category comprises:
performing feature extraction on the input image by using the small sample image classification neural network, wherein the features are the feature maps obtained after convolution by the convolutional layers of the network; pooling the feature maps to reduce their dimensionality to a set value; and classifying them with the final fully connected layer to judge the category of the image.
9. A small sample image classification system based on specific parameter distribution generation, applied to the small sample image classification method based on specific parameter distribution generation according to any one of claims 1 to 8, characterized by comprising:
a parameter generation module, which is used for constructing the overall architecture of the small sample image classification neural network, wherein the overall architecture comprises the architecture of a distribution learning network, the architecture of a parameter adjustment network and the architecture of the small sample image classification neural network; taking the parameters of various convolutional neural networks as a parameter training set; training the distribution learning network by using the parameter training set; and generating the initial parameters of the small sample image classification neural network through the trained distribution learning network;
a network training module, which is used for training the small sample image classification neural network by using a target data set under the adjustment of the parameter adjustment network, wherein the target data set is the set of images to be classified; and
an online classification module, which is used for inputting an image to be classified, extracting the features of the image through the small sample image classification neural network, and judging the category of the image.
CN202111465249.6A 2021-12-03 2021-12-03 Small sample image classification method and system based on specific parameter distribution generation Pending CN113989575A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111465249.6A CN113989575A (en) 2021-12-03 2021-12-03 Small sample image classification method and system based on specific parameter distribution generation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111465249.6A CN113989575A (en) 2021-12-03 2021-12-03 Small sample image classification method and system based on specific parameter distribution generation

Publications (1)

Publication Number Publication Date
CN113989575A true CN113989575A (en) 2022-01-28

Family

ID=79733131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111465249.6A Pending CN113989575A (en) 2021-12-03 2021-12-03 Small sample image classification method and system based on specific parameter distribution generation

Country Status (1)

Country Link
CN (1) CN113989575A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020200030A1 (en) * 2019-04-02 2020-10-08 京东方科技集团股份有限公司 Neural network training method, image processing method, image processing device, and storage medium
CN112149755A (en) * 2020-10-12 2020-12-29 自然资源部第二海洋研究所 Small sample seabed underwater sound image substrate classification method based on deep learning
CN112348792A (en) * 2020-11-04 2021-02-09 广东工业大学 X-ray chest radiography image classification method based on small sample learning and self-supervision learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CANG MINGJIE: "SAR Target Recognition Method Based on ICNN and IGAN", Radar Science and Technology, vol. 18, no. 3, 30 June 2020 (2020-06-30), pages 287-294 *
ZHENG ZONGSHENG: "Research on Typhoon Grade Classification Based on Transfer Learning and Meteorological Satellite Cloud Images", Remote Sensing Technology and Application, vol. 35, no. 1, 20 February 2020 (2020-02-20), pages 202-210 *

Similar Documents

Publication Publication Date Title
CN109035149B (en) License plate image motion blur removing method based on deep learning
CN110516596B (en) Octave convolution-based spatial spectrum attention hyperspectral image classification method
CN110929603B (en) Weather image recognition method based on lightweight convolutional neural network
CN110188685B (en) Target counting method and system based on double-attention multi-scale cascade network
WO2022160771A1 (en) Method for classifying hyperspectral images on basis of adaptive multi-scale feature extraction model
CN108491765B (en) Vegetable image classification and identification method and system
CN109754017B (en) Hyperspectral image classification method based on separable three-dimensional residual error network and transfer learning
CN111079795B (en) Image classification method based on CNN (content-centric networking) fragment multi-scale feature fusion
CN108256482B (en) Face age estimation method for distributed learning based on convolutional neural network
CN108665005B (en) Method for improving CNN-based image recognition performance by using DCGAN
CN110929602A (en) Foundation cloud picture cloud shape identification method based on convolutional neural network
CN108847223B (en) Voice recognition method based on deep residual error neural network
CN111861906B (en) Pavement crack image virtual augmentation model establishment and image virtual augmentation method
CN109461458B (en) Audio anomaly detection method based on generation countermeasure network
CN114943345B (en) Active learning and model compression-based federal learning global model training method
CN110689039A (en) Trunk texture identification method based on four-channel convolutional neural network
CN112991493A (en) Gray level image coloring method based on VAE-GAN and mixed density network
CN114897884A (en) No-reference screen content image quality evaluation method based on multi-scale edge feature fusion
CN110956201A (en) Image distortion type classification method based on convolutional neural network
CN114580517A (en) Method and device for determining image recognition model
CN112561054B (en) Neural network filter pruning method based on batch characteristic heat map
CN110554667A (en) convolutional Neural Network (CNN) based intermittent industrial process fault diagnosis
CN117911252A (en) Method and system for removing illumination in image based on deep learning
CN112686268A (en) Crop leaf disorder identification method based on SVD-ResNet50 neural network
CN111652238B (en) Multi-model integration method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination