CN112434721B - Image classification method, system, storage medium and terminal based on small sample learning - Google Patents

Image classification method, system, storage medium and terminal based on small sample learning

Info

Publication number
CN112434721B
Authority
CN
China
Prior art keywords
image
features
image data
query
feature
Prior art date
Legal status
Active
Application number
CN202011150086.8A
Other languages
Chinese (zh)
Other versions
CN112434721A (en)
Inventor
赵磊
方红波
廖旻可
Current Assignee
Terminus Technology Group Co Ltd
Original Assignee
Terminus Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Terminus Technology Group Co Ltd filed Critical Terminus Technology Group Co Ltd
Priority to CN202011150086.8A
Publication of CN112434721A
Application granted
Publication of CN112434721B


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/25 - Fusion techniques
    • G06F18/253 - Fusion techniques of extracted features
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G06N3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application discloses an image classification method, system, storage medium and terminal based on small sample learning. The method includes: determining an image data set to be classified; extracting image data samples of a plurality of categories from the image data set and establishing an image support set based on these samples; extracting second image data samples of a plurality of categories from the image data samples remaining in the image data set after the first extraction to construct an image query set; and generalizing a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalized small sample classification network, and outputting the category of each image data sample. By extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.

Description

Image classification method, system, storage medium and terminal based on small sample learning
Technical Field
The invention relates to the technical field of computer deep learning, and in particular to an image classification method, system, storage medium and terminal based on small sample learning.
Background
With the development of deep learning, image classification methods based on deep neural networks have been proposed. Such networks must be trained on a large number of samples to perform well. However, in some practical applications, such as object tracking or object detection, only a limited number of samples may be available, so it is difficult to build a large, valuable labeled sample set. Small sample learning therefore applies meta learning to supervised learning: by constructing different meta tasks it addresses the m-way k-shot problem, and it is suited to improving the generalization ability of a classification model when only a small number of samples is available.
Existing small sample learning methods mainly include model-based, metric-based, and optimization-based methods. Metric-based methods compute a distance metric between samples in a support set and a query set and classify samples with a non-parametric method (nearest neighbor, K-means, etc.), so that samples of the same class are closest in distance and samples of different classes are farthest apart. Metric-based methods have the following problem: they embed the samples of the support set and query set into a feature space but do not fully utilize the extracted sample features, which reduces the generalization ability of the classification model.
Disclosure of Invention
The embodiment of the application provides an image classification method, an image classification system, a storage medium and a terminal based on small sample learning. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended neither to identify key/critical elements nor to delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
In a first aspect, an embodiment of the present application provides an image classification method based on small sample learning, where the method includes:
determining an image dataset to be classified;
extracting first image data samples of a plurality of categories from the image data set, and establishing an image support set based on the first image data samples of the plurality of categories;
re-extracting second image data samples of a plurality of categories from the image data samples remaining after the extraction in the image data set, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
And performing generalization processing on a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization processed small sample classification network, and outputting the category of each image data sample.
Optionally, the inputting each image data sample in the image query set into the generalized small sample classification network, and outputting the category of each image data sample includes:
calculating a plurality of category similarities between each image data sample in the image query set and the image support set based on the generalized small sample classification network, and marking the maximum category similarity in the plurality of category similarities corresponding to each image data sample;
and outputting the category of each image sample data.
Optionally, the small sample classification network includes a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer, and a category output layer;
the small sample classification network after the generalization processing calculates a plurality of category similarities between each image data sample in the image query set and the image support set, marks the maximum category similarity in the plurality of category similarities corresponding to each image data sample, and comprises the following steps:
The feature extraction layer respectively extracts feature image sets corresponding to the image support set and the image query set; the feature map set is a multi-layer feature map, and each layer of feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts weight vectors corresponding to the feature graphs of each layer through a global average pooling network, and mutually weights query features and support features corresponding to the feature graphs to obtain weighted query features and support features;
the attention enhancement layer builds a normalized attention score graph according to the query features and the support features corresponding to the feature graph by a normalization algorithm, secondarily weights the weighted query features and the support features according to the normalized attention score graph, and generates secondarily weighted query enhancement features and support enhancement features;
the feature fusion layer performs feature splicing on the weighted query features, the support features and the secondarily weighted query enhancement features and the support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates similar probability values of the query fusion features and the support fusion features, and generates a plurality of similarity probability values of the query fusion features and the support fusion features;
And the class output layer arranges the plurality of similarity probability values in descending order, extracts the similarity probability value ranked first for class marking, and outputs the class label of the sample.
Optionally, training the small sample classification network according to the following method includes:
collecting a training image data sample;
constructing a plurality of small sample network training tasks aiming at the training image data samples;
respectively extracting an image support set and an image query set aiming at each small sample network training task in the plurality of small sample network training tasks;
creating a small sample classification network, and training the small sample classification network based on the image support set and the image query set;
and when the loss value of the trained small sample classification network is smaller than a preset value, generating a pre-trained small sample classification network.
Optionally, the creating the small sample classification network, training the small sample classification network based on the image support set and the image query set, includes:
creating a small sample classification network, wherein the created small sample classification network comprises a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer and a category output layer;
The feature extraction layer respectively extracts feature image sets corresponding to the image support set and the image query set; the feature map set is a multi-layer feature map, and each layer of feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts weight vectors corresponding to the feature graphs of each layer through a global average pooling network, and mutually weights query features and support features corresponding to the feature graphs to obtain weighted query features and support features;
the attention enhancement layer builds a normalized attention score graph according to the query features and the support features corresponding to the feature graph by a normalization algorithm, secondarily weights the weighted query features and the support features according to the normalized attention score graph, and generates secondarily weighted query enhancement features and support enhancement features;
the feature fusion layer performs feature splicing on the weighted query features, the support features and the secondarily weighted query enhancement features and the support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates similar probability values of the query fusion features and the support fusion features, and generates a plurality of similarity probability values of the query fusion features and the support fusion features;
And the class output layer performs descending order arrangement on the plurality of similarity probability values, extracts the similarity probability value arranged at the first position to perform class marking, and outputs class marking of the sample.
Optionally, when the loss value of the trained small sample classification network is smaller than a preset value, generating a pre-trained small sample classification network includes:
and when the loss value of the trained small sample classification network is not yet smaller than the preset value, continuing to execute the step of respectively extracting, by the feature extraction layer, the feature map sets corresponding to the image support set and the image query set, until the loss value is smaller than the preset value.
Optionally, the feature map is obtained by convolution with a convolutional neural network, and the convolution kernel used in the convolution is 1×1.
In a second aspect, an embodiment of the present application provides an image classification system based on small sample learning, the system including:
the data set determining module is used for determining an image data set to be classified;
the image support set establishing module is used for extracting a plurality of types of image data samples in the image data set and establishing an image support set based on the plurality of types of image data samples;
the image query set establishing module is used for extracting second image data samples of a plurality of categories again from the image data samples remaining after the extraction in the image data set, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
The class output module is used for carrying out generalization processing on the pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization processed small sample classification network and outputting the class of each image data sample.
In a third aspect, embodiments of the present application provide a computer storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor and to perform the above-described method steps.
In a fourth aspect, an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps described above.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
In the embodiment of the application, the image processing system first determines an image data set to be classified, extracts image data samples of a plurality of categories from the image data set, and establishes an image support set based on these samples; it then extracts data samples of a plurality of categories again from the image data samples remaining in the image data set after the extraction to establish an image query set, generalizes a pre-trained small sample classification network according to the image support set, inputs each image data sample in the image query set into the generalized small sample classification network, and finally outputs the category of each image data sample. Therefore, by extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of an image classification method based on small sample learning according to an embodiment of the present application;
FIG. 2 is a general flow chart of a small sample classification network process used in the small sample learning-based image classification process provided by an embodiment of the present application;
FIG. 3 is a block diagram of an attention enhancement module and feature fusion module provided by an embodiment of the present application;
FIG. 4 is a flow chart of another image classification method based on small sample learning according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an image classification system based on small sample learning according to an embodiment of the present application;
fig. 6 is a schematic diagram of a terminal according to an embodiment of the present application.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the application to enable those skilled in the art to practice them.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the invention. Rather, they are merely examples of systems and methods that are consistent with aspects of the invention as detailed in the accompanying claims.
In the description of the present invention, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art according to the specific case. Furthermore, in the description of the present invention, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
To date, existing small sample learning methods mainly include model-based, metric-based, and optimization-based methods. Metric-based methods compute a distance metric between samples in a support set and a query set and classify samples with a non-parametric method (nearest neighbor, K-means, etc.), so that samples of the same class are closest in distance and samples of different classes are farthest apart. Metric-based methods have the following problem: they embed the samples of the support set and query set into a feature space but do not fully utilize the extracted sample features, which reduces the generalization ability of the classification model. The application therefore provides an image classification method, system, storage medium and terminal based on small sample learning to solve the above problems in the related art. In the technical scheme provided by the application, feature fusion is performed on the sample features extracted from the support set and the query set, so that the two sets mutually enhance the related features between samples and highlight the features of interest, thereby improving the accuracy of image classification. The method is described in detail below using exemplary embodiments.
The image classification method based on small sample learning according to the embodiment of the present application will be described in detail below with reference to fig. 1 to fig. 4. The method may be implemented by a computer program and may run on an image classification system based on small sample learning that is built on a von Neumann architecture. The computer program may be integrated in an application or may run as a stand-alone tool-class application.
Referring to fig. 1, a flow chart of an image classification method based on small sample learning is provided in an embodiment of the present application. As shown in fig. 1, the method according to the embodiment of the present application may include the following steps:
s101, determining an image data set to be classified;
Small sample learning (few-shot learning) refers to learning from very few examples. Humans are very good at recognizing a new object from a small number of samples; for example, a child only needs a few pictures in a book to know what a "zebra" or a "rhinoceros" is. The application hopes that, after a machine learning model has learned a large amount of data of certain categories, it can learn a new category quickly from only a small number of samples without retraining the existing model, so that the trained network model has the ability to identify similar samples. The image data set is the set of image data samples to be classified.
In general, few-shot learning addresses a problem called m-way k-shot: the image data samples to be classified belong to one of m classes, but only k labeled samples are available for each class, i.e., m×k labeled samples in total.
In one possible implementation, when performing small sample learning based image classification, it is first necessary to determine the image dataset to be classified. The image dataset may be regarded as test image data of the model.
Further, before the test, the system also trains the small sample classification network. When training the small sample classification network, training image data samples are first collected; a plurality of small sample network training tasks are constructed for the training image data samples; an image support set and an image query set are extracted for each of the plurality of small sample network training tasks; a small sample classification network is then created and trained based on the image support set and the image query set; and finally, when the loss value of the trained small sample classification network is smaller than a preset value, a pre-trained small sample classification network is generated.
Specifically, when a small sample classification network is created and trained based on the image support set and the image query set, the small sample classification network is created first. The created small sample classification network comprises a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer and a category output layer. The feature extraction layer extracts the feature map sets corresponding to the image support set and the image query set respectively; each feature map set is a multi-layer feature map, each layer of feature map contains a plurality of channels, and each channel corresponds to one feature map. The attention enhancement layer extracts, through a global average pooling network, the weight vector corresponding to each layer of feature map, and mutually weights the query features and support features corresponding to the feature map to obtain weighted query features and support features. The attention enhancement layer then constructs a normalized attention score map from the query features and support features corresponding to the feature map by a normalization algorithm, and secondarily weights the weighted query features and support features according to the normalized attention score map to generate secondarily weighted query enhancement features and support enhancement features. The feature fusion layer performs feature splicing on the weighted query features and support features and the secondarily weighted query enhancement features and support enhancement features to generate query fusion features and support fusion features. The feature classification layer calculates the similarity probability values between the query fusion features and the support fusion features. Finally, the category output layer sorts the plurality of similarity probability values in descending order, extracts the similarity probability value ranked first for class marking, and outputs the class label of the sample.
When the loss value of the trained small sample classification network is not yet smaller than the preset value, the step of respectively extracting, by the feature extraction layer, the feature map sets corresponding to the image support set and the image query set is executed again, until the loss value is smaller than the preset value.
For example, when training the small sample classification network on a training image sample data set (for example, 64 classes with 600 samples per class), 5 classes are randomly sampled from the training image sample data set in each training episode, and 5 samples from each of these classes form a support set used to learn the network model; a query set is then sampled from the remaining samples of the same 5 classes, with 15 samples per class, and is used to obtain the loss value of the network model so as to learn the small sample network model.
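For illustration only, the following sketch shows one way such 5-way 5-shot training episodes could be sampled; the dictionary layout of the data set, the function name sample_episode and the dummy samples are assumptions made for this example and are not prescribed by the application.

```python
import random

def sample_episode(dataset, n_way=5, k_shot=5, n_query=15):
    """Sample one n_way k_shot episode: a support set and a query set."""
    classes = random.sample(list(dataset.keys()), n_way)
    support, query = [], []
    for label in classes:
        samples = random.sample(dataset[label], k_shot + n_query)
        support += [(s, label) for s in samples[:k_shot]]   # k labeled shots per class
        query += [(s, label) for s in samples[k_shot:]]     # disjoint query samples
    return support, query

# Toy layout: 64 classes with 600 (dummy integer) samples each, as in the example above.
toy_dataset = {c: list(range(600)) for c in range(64)}
support_set, query_set = sample_episode(toy_dataset)
print(len(support_set), len(query_set))   # 25 support samples, 75 query samples
```

Each sampled episode thus yields 25 support samples and 75 query samples, matching the 5-way 5-shot setting with 15 query samples per class described above.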
The feature map is obtained by performing a convolution operation on the image data with a convolutional neural network, and the convolution kernel used is 1×1.
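As a minimal sketch of such a feature extraction layer (assuming a PyTorch implementation; the application does not fix a specific backbone depth, input size or channel count, and the trailing 1×1 convolution only mirrors the 1×1 convolution mentioned above):

```python
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Minimal CNN backbone; depths and kernel sizes are illustrative assumptions."""
    def __init__(self, out_channels=64):
        super().__init__()
        self.blocks = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64),
            nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64),
            nn.ReLU(), nn.MaxPool2d(2),
        )
        # 1x1 convolution, as mentioned above, producing the per-channel
        # feature maps without changing the spatial resolution.
        self.proj = nn.Conv2d(64, out_channels, kernel_size=1)

    def forward(self, x):
        return self.proj(self.blocks(x))   # (batch, channels, H, W) feature maps

features = FeatureExtractor()(torch.randn(25, 3, 84, 84))
print(features.shape)   # torch.Size([25, 64, 21, 21])
```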
S102, extracting first image data samples of a plurality of categories from the image data set, and establishing an image support set based on the first image data samples of the plurality of categories;
The support set is a small number of image data samples with classification labels; the pre-trained small sample learning network (model) is generalized with these few samples so that it gains the ability to identify similar image data samples.
In one possible implementation, when the image data set to be classified is classified, A categories are first randomly selected from the image data set as the classification targets of the small sample classification task, and N samples are randomly extracted from each of the selected A categories to form the support set; data samples of a plurality of categories are then extracted again from the image data samples remaining in the image data set after this extraction, and the image query set is established based on the extracted data samples. The number of categories of the data samples extracted the second time is smaller than the number of categories of the data samples extracted the first time.
S103, establishing an image query set based on the image data samples remaining in the image data set after the extraction;
For example, m classes are randomly selected from the classes of the image data set (for example, 659 classes), and each class provides k samples as a support set to generalize the model; this is called m-way k-shot. Taking a twin network as an example: if the support set contains a single sample for each of the m classes (i.e., m samples in total), each sample in the query set (also covering the m classes) is paired with the support set in turn to measure similarity, and a query sample is assigned to the class whose support sample it is most similar to.
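A minimal sketch of this nearest-support matching, assuming cosine similarity as the metric (the application's classification layer instead learns a similarity probability, so this is only an analogy to the twin-network example):

```python
import torch
import torch.nn.functional as F

def classify_by_similarity(query_emb, support_embs, support_labels):
    """Assign each query to the class of its most similar support embedding."""
    sims = F.cosine_similarity(
        query_emb.unsqueeze(1), support_embs.unsqueeze(0), dim=-1
    )                                   # (num_query, num_support) similarity matrix
    best = sims.argmax(dim=1)           # index of the most similar support sample
    return support_labels[best]

support = torch.randn(5, 128)                       # one embedding per class (m = 5)
labels = torch.tensor([0, 1, 2, 3, 4])
queries = torch.randn(10, 128)
print(classify_by_similarity(queries, support, labels))
```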
S104, performing generalization processing on a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization processed small sample classification network, and outputting the category of each image data sample.
In one possible implementation, a plurality of category similarities between each image data sample in the image query set and the image support set are calculated based on the generalized small sample classification network, the maximum category similarity among the plurality of category similarities corresponding to each image data sample is marked, and finally the category of each image data sample is output.
For example, as shown in fig. 2, fig. 2 is a general flow chart of the image classification process based on small sample learning provided by the embodiment of the application. The image data set to be classified is first divided into a support set and a query set; the pre-trained small sample classification network then extracts the support features and query features from the feature maps of the support set and the query set through the feature extraction layer (feature extraction module); the support features and query features are input in turn into the attention enhancement module (layer), the feature fusion module (layer), the classification module (layer) and the output module (layer) for processing, and the classification result is finally output. The structure of the attention enhancement module and the feature fusion module is shown in fig. 3.
In the embodiment of the application, the image processing system first determines an image data set to be classified, extracts image data samples of a plurality of categories from the image data set, and establishes an image support set based on these samples; it then extracts data samples of a plurality of categories again from the image data samples remaining in the image data set after the extraction to establish an image query set, generalizes a pre-trained small sample classification network according to the image support set, inputs each image data sample in the image query set into the generalized small sample classification network, and finally outputs the category of each image data sample. Therefore, by extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.
Referring to fig. 4, a flowchart of another image classification method based on small sample learning according to an embodiment of the present application is shown. The small sample learning-based image classification method may include the steps of:
S201, determining an image data set to be classified;
s202, extracting a plurality of types of image data samples in the image data set, and establishing an image support set based on the plurality of types of image data samples;
s203, extracting second image data samples of a plurality of categories from the image data samples remained after the extraction in the image data set, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
s204, the feature extraction layer extracts feature image sets corresponding to the image support set and the image query set respectively; the feature map set is a multi-layer feature map, and each layer of feature map comprises a plurality of channels, and each channel corresponds to one feature map;
In one possible implementation, C-way K-shot tasks are constructed for the training tasks on small sample data. For each C-way K-shot task, the image support set and the image query set are taken out respectively, and the feature extraction layer uses a convolutional neural network (CNN) to extract a series of feature maps for the image support set and the image query set; each layer of feature map contains a plurality of channels, and each channel corresponds to one feature map.
S205, the attention enhancement layer respectively extracts weight vectors corresponding to the feature graphs of each layer through a global average pooling network, and mutually weights the query features and the support features corresponding to the feature graphs to obtain weighted query features and weighted support features;
In one possible implementation, the attention enhancement layer performs enhancement twice. In the first enhancement, the weight vector corresponding to a given layer of feature map is extracted through a global average pooling network, and the query features and support features corresponding to that feature map are mutually weighted to obtain the query features and support features weighted by the global average pooling network. In the second enhancement, the query features and support features obtained above are normalized, a normalized attention score map is obtained through a normalization function, and the query features and support features weighted by the global average pooling network are weighted with each other again to obtain the query enhancement features and support enhancement features.
S206, the attention enhancement layer builds a normalized attention score graph according to the query features and the support features corresponding to the feature graph by a normalization algorithm, secondarily weights the weighted query features and the support features according to the normalized attention score graph, and generates secondarily weighted query enhancement features and support enhancement features;
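For illustration only, one possible reading of the two enhancement stages of S205 and S206 is sketched below for a single query/support feature-map pair; the sigmoid gating, the dot-product attention score map and its softmax normalization are assumptions, since the application does not specify the exact weighting and normalization operations.

```python
import torch
import torch.nn.functional as F

def attention_enhance(q_feat, s_feat):
    """Two-stage mutual attention enhancement (illustrative sketch).

    q_feat, s_feat: (C, H, W) feature maps of a query sample and a support sample.
    Stage 1: channel weight vectors from global average pooling, applied mutually.
    Stage 2: a softmax-normalized attention score map re-weights the stage-1 features.
    """
    C, H, W = q_feat.shape
    # Stage 1: global average pooling -> per-channel weight vectors, applied crosswise.
    q_w = torch.sigmoid(q_feat.mean(dim=(1, 2)))            # (C,)
    s_w = torch.sigmoid(s_feat.mean(dim=(1, 2)))            # (C,)
    q_weighted = q_feat * s_w.view(C, 1, 1)                  # query weighted by support vector
    s_weighted = s_feat * q_w.view(C, 1, 1)                  # support weighted by query vector

    # Stage 2: normalized attention score map between spatial positions.
    q_flat = q_feat.reshape(C, H * W)                         # (C, HW)
    s_flat = s_feat.reshape(C, H * W)
    scores = F.softmax(q_flat.t() @ s_flat / C ** 0.5, dim=-1)        # (HW, HW), normalized
    q_enhanced = (s_flat @ scores.t()).reshape(C, H, W) * q_weighted  # secondary weighting
    s_enhanced = (q_flat @ scores).reshape(C, H, W) * s_weighted
    return q_weighted, s_weighted, q_enhanced, s_enhanced

qw, sw, qe, se = attention_enhance(torch.randn(64, 21, 21), torch.randn(64, 21, 21))
print(qe.shape)   # torch.Size([64, 21, 21])
```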
S207, the feature fusion layer performs feature fusion on the weighted query features and the support features and the secondarily weighted query enhancement features and the support enhancement features to generate query fusion features and support fusion features;
in one possible implementation, the feature fusion layer splices the query feature and the support feature after the global average pooling network weighting processing with the support enhancement feature and the query enhancement feature, respectively, so as to obtain the query fusion feature and the support fusion feature.
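A minimal sketch of this splicing step, assuming channel-wise concatenation (the application states that the features are spliced but does not name the concatenation axis):

```python
import torch

def fuse(weighted, enhanced):
    """Splice the stage-1 weighted features with the enhanced features along channels."""
    return torch.cat([weighted, enhanced], dim=0)   # (2C, H, W) fusion feature

query_fusion = fuse(torch.randn(64, 21, 21), torch.randn(64, 21, 21))
print(query_fusion.shape)   # torch.Size([128, 21, 21])
```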
S208, the feature classification layer calculates similar probability values of the query fusion feature and the support fusion feature, and generates a plurality of similarity probability values of the query fusion feature and the support fusion feature;
in one possible implementation, the feature classification layer inputs the query fusion feature and the support fusion feature to the classification module to obtain a probability value representing a degree of similarity of the query fusion feature and the support fusion feature.
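For illustration only, the sketch below shows one way such a classification layer could map a pair of fusion features to a similarity probability; the convolutional relation-style module, its layer sizes and the sigmoid output are assumptions and do not appear in the application.

```python
import torch
import torch.nn as nn

class SimilarityClassifier(nn.Module):
    """Maps a (query, support) fusion-feature pair to a similarity probability."""
    def __init__(self, channels=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2 * channels, 64, kernel_size=3, padding=1),
            nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Sequential(nn.Flatten(), nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, q_fused, s_fused):
        pair = torch.cat([q_fused, s_fused], dim=1)     # concatenate along channels
        return self.fc(self.conv(pair)).squeeze(-1)     # similarity probability in (0, 1)

clf = SimilarityClassifier()
q = torch.randn(1, 128, 21, 21)                 # one query fusion feature
supports = torch.randn(5, 128, 21, 21)          # one fusion feature per class
probs = clf(q.expand(5, -1, -1, -1), supports)  # (5,) similarity probabilities
print(int(probs.argmax()))                      # index of the predicted class
```

Sorting the resulting probabilities in descending order and taking the first entry, as described in S209 below, corresponds to the argmax in the last line of the sketch.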
S209, the class output layer arranges the plurality of similarity probability values in descending order, extracts the similarity probability value ranked first for class marking, and outputs the class label of the sample;
In one possible implementation, the class output layer performs descending order of the plurality of similarity probability values, extracts the similarity probability value arranged at the first position to perform class marking, and outputs the class marking of the sample.
S210, generating the category of each image sample data.
In the embodiment of the application, the image processing system first determines an image data set to be classified, extracts image data samples of a plurality of categories from the image data set, and establishes an image support set based on these samples; it then extracts data samples of a plurality of categories again from the image data samples remaining in the image data set after the extraction to establish an image query set, generalizes a pre-trained small sample classification network according to the image support set, inputs each image data sample in the image query set into the generalized small sample classification network, and finally outputs the category of each image data sample. Therefore, by extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.
The following are system embodiments of the present application that may be used to perform method embodiments of the present application. For details not disclosed in the system embodiments of the present application, please refer to the method embodiments of the present application.
Referring to fig. 5, a schematic diagram of an image classification system based on small sample learning according to an exemplary embodiment of the present invention is shown. The small sample learning based image classification system may be implemented as all or part of an intelligent robot by software, hardware, or a combination of both. The system 1 comprises a data set determination module 10, an image support set creation module 20, an image query set creation module 30, a category output module 40.
A dataset determination module 10 for determining an image dataset to be classified;
an image support set creation module 20 for extracting a plurality of types of image data samples in the image data set, and creating an image support set based on the plurality of types of image data samples;
an image query set creation module 30, configured to re-extract second image data samples of a plurality of categories from the image data samples remaining after the extraction in the image data set, and create an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
the class output module 40 is configured to generalize the pre-trained small sample classification network according to the image support set, input each image data sample in the image query set into the generalized small sample classification network, and output a class of each image data sample.
It should be noted that, when the image classification system based on the small sample learning provided in the foregoing embodiment performs the image classification method based on the small sample learning, only the division of the foregoing functional modules is used for illustration, in practical application, the foregoing functional allocation may be completed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the image classification system based on the small sample learning provided in the above embodiment belongs to the same concept as the image classification method embodiment based on the small sample learning, and the implementation process is detailed in the method embodiment, which is not described herein again.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the embodiment of the application, the image processing system first determines an image data set to be classified, extracts image data samples of a plurality of categories from the image data set, and establishes an image support set based on these samples; it then extracts data samples of a plurality of categories again from the image data samples remaining in the image data set after the extraction to establish an image query set, generalizes a pre-trained small sample classification network according to the image support set, inputs each image data sample in the image query set into the generalized small sample classification network, and finally outputs the category of each image data sample. Therefore, by extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.
The present application also provides a computer readable medium having stored thereon program instructions which, when executed by a processor, implement the small sample learning based image classification method provided by the above-described method embodiments. The application also provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the small sample learning based image classification method of the various method embodiments described above.
Referring to fig. 6, a schematic structural diagram of a terminal is provided in an embodiment of the present application. As shown in fig. 6, terminal 1000 can include: at least one processor 1001, at least one network interface 1004, a user interface 1003, a memory 1005, at least one communication bus 1002.
Wherein the communication bus 1002 is used to enable connected communication between these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 1003 may further include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Wherein the processor 1001 may include one or more processing cores. The processor 1001 connects various parts within the entire electronic device 1000 using various interfaces and lines, and performs various functions of the electronic device 1000 and processes data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 1005, and invoking data stored in the memory 1005. Alternatively, the processor 1001 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), programmable logic array (Programmable Logic Array, PLA). The processor 1001 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), an image processor (Graphics Processing Unit, GPU), and a modem, etc. The CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the display screen; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 1001 and may be implemented by a single chip.
The Memory 1005 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory). Optionally, the memory 1005 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). The memory 1005 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 1005 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described respective method embodiments, etc.; the storage data area may store data or the like referred to in the above respective method embodiments. The memory 1005 may also optionally be at least one storage system located remotely from the processor 1001. As shown in fig. 6, an operating system, a network communication module, a user interface module, and an image classification application based on small sample learning may be included in a memory 1005 as one type of computer storage medium.
In terminal 1000 shown in fig. 6, user interface 1003 is mainly used for providing an input interface for a user, and acquiring data input by the user; and the processor 1001 may be configured to call the small sample learning-based image classification application stored in the memory 1005, and specifically perform the following operations:
Determining an image dataset to be classified;
extracting a plurality of types of image data samples in the image data set, and establishing an image support set based on the plurality of types of image data samples;
re-extracting second image data samples of a plurality of categories from the image data samples remaining after the extraction in the image data set, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
and performing generalization processing on a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization processed small sample classification network, and outputting the category of each image data sample.
In one embodiment, the processor 1001, when executing the input of each image data sample in the image query set into the generalized small sample classification network and outputting the category of each image data sample, specifically performs the following operations:
calculating a plurality of category similarities between each image data sample in the image query set and the image support set based on the generalized small sample classification network, and marking the maximum category similarity in the plurality of category similarities corresponding to each image data sample;
And outputting the category of each image sample data.
In one embodiment, when performing the calculation of the plurality of class similarities between each image data sample in the image query set and the image support set based on the generalized small sample classification network, the processor 1001 specifically performs the following operations:
the feature extraction layer respectively extracts feature image sets corresponding to the image support set and the image query set; the feature map set is a multi-layer feature map, and each layer of feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts weight vectors corresponding to the feature graphs of each layer through a global average pooling network, and mutually weights query features and support features corresponding to the feature graphs to obtain weighted query features and support features;
the attention enhancement layer builds a normalized attention score graph according to the query features and the support features corresponding to the feature graph by a normalization algorithm, secondarily weights the weighted query features and the support features according to the normalized attention score graph, and generates secondarily weighted query enhancement features and support enhancement features;
The feature fusion layer performs feature splicing on the weighted query features, the support features and the secondarily weighted query enhancement features and the support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates similar probability values of the query fusion features and the support fusion features, and generates a plurality of similarity probability values of the query fusion features and the support fusion features;
and the class output layer is used for arranging the similarity probability values in a descending order, extracting the similarity probability value arranged at the first position for class marking and outputting the class marking of the sample.
In the embodiment of the application, the image processing system first determines an image data set to be classified, extracts image data samples of a plurality of categories from the image data set, and establishes an image support set based on these samples; it then extracts data samples of a plurality of categories again from the image data samples remaining in the image data set after the extraction to establish an image query set, generalizes a pre-trained small sample classification network according to the image support set, inputs each image data sample in the image query set into the generalized small sample classification network, and finally outputs the category of each image data sample. Therefore, by extracting the sample features of the support set and the query set and performing feature fusion, related features between samples are mutually enhanced and the features of interest are highlighted, so the accuracy of image classification is improved.
Those skilled in the art will appreciate that all or part of the flows of the above embodiment methods may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
The foregoing disclosure is illustrative of the present application and is not to be construed as limiting the scope of the application, which is defined by the appended claims.

Claims (8)

1. An image classification method based on small sample learning, the method comprising:
determining an image dataset to be classified;
extracting first image data samples of a plurality of categories in the image data set, and establishing an image support set based on the image data samples of the plurality of categories;
re-extracting second image data samples of a plurality of categories from the image data samples remaining after the extraction in the image data set, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data sample is less than the number of categories of the first image data sample;
Performing generalization processing on a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization processed small sample classification network, and outputting the category of each image data sample; wherein,
inputting each image data sample in the image query set into the generalized small sample classification network, and outputting the category of each image data sample, wherein the method comprises the following steps:
calculating a plurality of category similarities between each image data sample in the image query set and the image support set based on the generalized small sample classification network, and marking the maximum category similarity in the plurality of category similarities corresponding to each image data sample;
outputting the category of each image data sample; wherein,
the small sample classification network comprises a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer and a category output layer;
the calculating, by the generalization-processed small sample classification network, a plurality of category similarities between each image data sample in the image query set and the image support set, and marking the maximum category similarity among the plurality of category similarities corresponding to each image data sample comprises the following steps:
the feature extraction layer respectively extracts feature map sets corresponding to the image support set and the image query set; each feature map set is a multi-layer feature map, each layer of the feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts, through a global average pooling network, a weight vector corresponding to each layer of the feature maps, and mutually weights the query features and the support features corresponding to the feature maps to obtain weighted query features and weighted support features;
the attention enhancement layer constructs a normalized attention score map from the query features and the support features corresponding to the feature maps by a normalization algorithm, secondarily weights the weighted query features and the weighted support features according to the normalized attention score map, and generates secondarily weighted query enhancement features and support enhancement features;
the feature fusion layer concatenates the weighted query features and support features with the secondarily weighted query enhancement features and support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates similarity probability values between the query fusion features and the support fusion features, generating a plurality of similarity probability values of the query fusion features and the support fusion features;
and the category output layer arranges the plurality of similarity probability values in descending order, extracts the first-ranked similarity probability value for category marking, and outputs the category mark of the sample.
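For readers who prefer code, the attention enhancement and feature fusion steps of claim 1 can be read as the following minimal PyTorch sketch. The sigmoid gating of the mutually exchanged global-average-pooling vectors and the softmax-normalized spatial score map are one plausible interpretation of "mutually weights" and "normalization algorithm"; every name here is illustrative rather than the claimed implementation.

```python
import torch
import torch.nn.functional as F

def cross_attention_enhance(q_feat, s_feat):
    """q_feat, s_feat: query / support feature maps of shape (B, C, H, W)."""
    # Weight vectors from global average pooling, each applied to the *other*
    # branch (mutual weighting of query and support features).
    q_gap = q_feat.mean(dim=(2, 3), keepdim=True)           # (B, C, 1, 1)
    s_gap = s_feat.mean(dim=(2, 3), keepdim=True)
    q_weighted = q_feat * torch.sigmoid(s_gap)               # query weighted by support vector
    s_weighted = s_feat * torch.sigmoid(q_gap)               # support weighted by query vector

    # Normalized attention score map from the spatial cross-correlation of the
    # two branches, followed by the secondary weighting.
    B, C, H, W = q_feat.shape
    q_flat = q_feat.flatten(2)                                # (B, C, H*W)
    s_flat = s_feat.flatten(2)
    scores = torch.bmm(q_flat.transpose(1, 2), s_flat)        # (B, HW_q, HW_s)
    q_attn = F.softmax(scores.sum(dim=2), dim=1).view(B, 1, H, W)
    s_attn = F.softmax(scores.sum(dim=1), dim=1).view(B, 1, H, W)
    q_enhanced = q_weighted * q_attn                          # secondarily weighted query features
    s_enhanced = s_weighted * s_attn

    # Feature fusion by concatenation along the channel dimension.
    q_fused = torch.cat([q_weighted, q_enhanced], dim=1)      # (B, 2C, H, W)
    s_fused = torch.cat([s_weighted, s_enhanced], dim=1)
    return q_fused, s_fused
```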
2. The method of claim 1, wherein training the small sample classification network comprises:
collecting training image data samples;
constructing a plurality of small sample network training tasks for the training image data samples;
extracting an image support set and an image query set for each of the plurality of small sample network training tasks;
creating a small sample classification network, and training the small sample classification network based on the image support set and the image query set;
and when the loss value of the trained small sample classification network is smaller than a preset value, generating a pre-trained small sample classification network.
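A minimal sketch of the episodic training procedure of claim 2 is given below, assuming a model that takes a support set and query images and returns per-category similarity scores; the optimizer, loss function, and stopping threshold are illustrative assumptions rather than values fixed by the claim.

```python
import torch

def train_few_shot(model, tasks, loss_threshold=0.05, lr=1e-3, max_rounds=100):
    """tasks: list of (support_images, support_labels, query_images, query_labels) tensors."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(max_rounds):
        epoch_loss = 0.0
        for s_x, s_y, q_x, q_y in tasks:
            optimizer.zero_grad()
            logits = model(s_x, s_y, q_x)      # similarity scores per support category (assumed interface)
            loss = torch.nn.functional.cross_entropy(logits, q_y)
            loss.backward()
            optimizer.step()
            epoch_loss += loss.item()
        if epoch_loss / len(tasks) < loss_threshold:
            break                               # loss below the preset value: network is pre-trained
    return model
```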
3. The method of claim 2, wherein creating the small sample classification network, training the small sample classification network based on the image support set and the image query set, comprises:
creating a small sample classification network, wherein the created small sample classification network comprises a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer and a category output layer;
the feature extraction layer respectively extracts feature map sets corresponding to the image support set and the image query set; each feature map set is a multi-layer feature map, each layer of the feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts, through a global average pooling network, a weight vector corresponding to each layer of the feature maps, and mutually weights the query features and the support features corresponding to the feature maps to obtain weighted query features and weighted support features;
the attention enhancement layer constructs a normalized attention score map from the query features and the support features corresponding to the feature maps by a normalization algorithm, secondarily weights the weighted query features and the weighted support features according to the normalized attention score map, and generates secondarily weighted query enhancement features and support enhancement features;
the feature fusion layer concatenates the weighted query features and support features with the secondarily weighted query enhancement features and support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates a similarity probability value between the query fusion feature and the support fusion feature, and generates a probability value of the degree of similarity between the query fusion feature and the support fusion feature;
and the category output layer arranges the similarity probability values in descending order, extracts the first-ranked similarity probability value for category marking, and outputs the category mark of the sample.
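The feature classification and category output layers of claim 3 can be sketched as follows; cosine similarity followed by a softmax is used here as one plausible similarity measure and probability normalization, neither of which is fixed by the claim, and all names are illustrative.

```python
import torch
import torch.nn.functional as F

def classify_query(query_fused, support_fused_by_class):
    """query_fused: (D,) fusion feature; support_fused_by_class: dict {category: (D,) fusion feature}."""
    categories = list(support_fused_by_class)
    # Similarity probability value per support category.
    sims = torch.stack([
        F.cosine_similarity(query_fused, support_fused_by_class[c], dim=0)
        for c in categories
    ])
    probs = F.softmax(sims, dim=0)
    # Descending arrangement; the first-ranked value gives the category mark.
    order = torch.argsort(probs, descending=True)
    top = order[0].item()
    return categories[top], probs[top].item()
```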
4. A method according to claim 3, wherein generating a pre-trained small sample classification network when the trained small sample classification network has a loss value less than a preset value comprises:
and when the loss value of the trained small sample classification network is not less than the preset value, continuing to execute the step in which the feature extraction layer respectively extracts the feature map sets corresponding to the image support set and the image query set.
5. A method according to claim 3, wherein the feature maps are extracted by convolution using a convolutional neural network, and the convolution parameter used in the convolution is a 1x1 convolution parameter.
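A small sketch of a feature extraction layer in the spirit of claim 5 is shown below; the claim only fixes the 1x1 convolution parameter, so the surrounding backbone structure and all names are assumptions for illustration.

```python
import torch.nn as nn

class FeatureExtractor(nn.Module):
    def __init__(self, in_channels=3, hidden=64, out_channels=64):
        super().__init__()
        # Illustrative backbone producing intermediate feature maps.
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.BatchNorm2d(hidden),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # 1x1 convolution as the convolution parameter named in the claim.
        self.pointwise = nn.Conv2d(hidden, out_channels, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.backbone(x))
```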
6. An image classification system based on small sample learning, the system comprising:
the data set determining module is used for determining an image data set to be classified;
the image support set establishing module is used for extracting first image data samples of a plurality of categories from the image data set and establishing an image support set based on the first image data samples of the plurality of categories;
the image query set establishing module is used for re-extracting second image data samples of a plurality of categories from the image data samples remaining in the image data set after the extraction, and establishing an image query set based on the second image data samples of the plurality of categories; wherein the number of categories of the second image data samples is smaller than the number of categories of the first image data samples;
the category output module is used for performing generalization processing on a pre-trained small sample classification network according to the image support set, inputting each image data sample in the image query set into the generalization-processed small sample classification network, and outputting the category of each image data sample; wherein
the inputting each image data sample in the image query set into the generalization-processed small sample classification network and outputting the category of each image data sample comprises:
calculating, based on the generalization-processed small sample classification network, a plurality of category similarities between each image data sample in the image query set and the image support set, and marking the maximum category similarity among the plurality of category similarities corresponding to each image data sample;
outputting the category of each image data sample; wherein
the small sample classification network comprises a feature extraction layer, an attention enhancement layer, a feature fusion layer, a feature classification layer and a category output layer;
the calculating, by the generalization-processed small sample classification network, a plurality of category similarities between each image data sample in the image query set and the image support set, and marking the maximum category similarity among the plurality of category similarities corresponding to each image data sample comprises the following steps:
the feature extraction layer respectively extracts feature map sets corresponding to the image support set and the image query set; each feature map set is a multi-layer feature map, each layer of the feature map comprises a plurality of channels, and each channel corresponds to one feature map;
the attention enhancement layer respectively extracts, through a global average pooling network, a weight vector corresponding to each layer of the feature maps, and mutually weights the query features and the support features corresponding to the feature maps to obtain weighted query features and weighted support features;
the attention enhancement layer constructs a normalized attention score map from the query features and the support features corresponding to the feature maps by a normalization algorithm, secondarily weights the weighted query features and the weighted support features according to the normalized attention score map, and generates secondarily weighted query enhancement features and support enhancement features;
the feature fusion layer concatenates the weighted query features and support features with the secondarily weighted query enhancement features and support enhancement features to generate query fusion features and support fusion features;
the feature classification layer calculates similarity probability values between the query fusion features and the support fusion features, generating a plurality of similarity probability values of the query fusion features and the support fusion features;
and the category output layer arranges the plurality of similarity probability values in descending order, extracts the first-ranked similarity probability value for category marking, and outputs the category mark of the sample.
7. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the method of any one of claims 1-5.
8. A terminal, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method according to any of claims 1-5.
CN202011150086.8A 2020-10-23 2020-10-23 Image classification method, system, storage medium and terminal based on small sample learning Active CN112434721B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011150086.8A CN112434721B (en) 2020-10-23 2020-10-23 Image classification method, system, storage medium and terminal based on small sample learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011150086.8A CN112434721B (en) 2020-10-23 2020-10-23 Image classification method, system, storage medium and terminal based on small sample learning

Publications (2)

Publication Number Publication Date
CN112434721A CN112434721A (en) 2021-03-02
CN112434721B true CN112434721B (en) 2023-09-01

Family

ID=74696002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011150086.8A Active CN112434721B (en) 2020-10-23 2020-10-23 Image classification method, system, storage medium and terminal based on small sample learning

Country Status (1)

Country Link
CN (1) CN112434721B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112819075B (en) * 2021-02-02 2021-10-22 电子科技大学 Balanced small sample task classification method based on transduction type reasoning
CN113052802B (en) * 2021-03-11 2024-04-09 南京大学 Small sample image classification method, device and equipment based on medical image
CN113052185A (en) * 2021-03-12 2021-06-29 电子科技大学 Small sample target detection method based on fast R-CNN
CN112784929B (en) * 2021-03-14 2023-03-28 西北工业大学 Small sample image classification method and device based on double-element group expansion
CN113111205B (en) * 2021-04-13 2022-06-14 复旦大学 Image characteristic dynamic alignment method and device based on meta-filter kernel
CN113221964B (en) * 2021-04-22 2022-06-24 华南师范大学 Single sample image classification method, system, computer device and storage medium
CN113313669B (en) * 2021-04-23 2022-06-03 石家庄铁道大学 Method for enhancing semantic features of top layer of surface defect image of subway tunnel
CN113392876B (en) * 2021-05-24 2022-07-05 电子科技大学 Small sample image classification method based on graph neural network
CN113177533B (en) * 2021-05-28 2022-09-06 济南博观智能科技有限公司 Face recognition method and device and electronic equipment
CN113537088A (en) * 2021-07-20 2021-10-22 杭州人云数字科技有限公司 Digital image sensor hardware fingerprint matching method and system based on small sample learning
CN113780335B (en) * 2021-07-26 2023-09-29 华南师范大学 Small sample commodity image classification method, device, equipment and storage medium
CN113673583A (en) * 2021-07-30 2021-11-19 浙江大华技术股份有限公司 Image recognition method, recognition network training method and related device
CN113592008B (en) * 2021-08-05 2022-05-31 哈尔滨理工大学 System, method, device and storage medium for classifying small sample images
CN114549894A (en) * 2022-01-20 2022-05-27 北京邮电大学 Small sample image increment classification method and device based on embedded enhancement and self-adaptation
CN114494195B (en) * 2022-01-26 2024-06-04 南通大学 Small sample attention mechanism parallel twin method for fundus image classification
CN114529765A (en) * 2022-02-16 2022-05-24 腾讯科技(深圳)有限公司 Data processing method, data processing equipment and computer readable storage medium
CN114612708B (en) * 2022-02-23 2022-12-09 广州市玄武无线科技股份有限公司 Commodity identification method and device, terminal equipment and computer readable medium
CN114818963B (en) * 2022-05-10 2023-05-09 电子科技大学 Small sample detection method based on cross-image feature fusion
CN115115825B (en) * 2022-05-27 2024-05-03 腾讯科技(深圳)有限公司 Method, device, computer equipment and storage medium for detecting object in image
CN115424053B (en) * 2022-07-25 2023-05-02 北京邮电大学 Small sample image recognition method, device, equipment and storage medium
CN115131580B (en) * 2022-08-31 2022-11-22 中国科学院空天信息创新研究院 Space target small sample identification method based on attention mechanism
CN115775340B (en) * 2023-02-13 2023-05-16 北京科技大学 Self-adaptive small sample image classification method and device based on feature modulation
CN116403071B (en) * 2023-03-23 2024-03-26 河海大学 Method and device for detecting few-sample concrete defects based on feature reconstruction


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106021364B (en) * 2016-05-10 2017-12-12 百度在线网络技术(北京)有限公司 Foundation, image searching method and the device of picture searching dependency prediction model

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109961089A (en) * 2019-02-26 2019-07-02 中山大学 Small sample and zero sample image classification method based on metric learning and meta learning
CN110569886A (en) * 2019-08-20 2019-12-13 天津大学 Image classification method for bidirectional channel attention element learning
CN110580500A (en) * 2019-08-20 2019-12-17 天津大学 Character interaction-oriented network weight generation few-sample image classification method
CN110717554A (en) * 2019-12-05 2020-01-21 广东虚拟现实科技有限公司 Image recognition method, electronic device, and storage medium

Also Published As

Publication number Publication date
CN112434721A (en) 2021-03-02

Similar Documents

Publication Publication Date Title
CN112434721B (en) Image classification method, system, storage medium and terminal based on small sample learning
CN111104962B (en) Semantic segmentation method and device for image, electronic equipment and readable storage medium
CN111797893B (en) Neural network training method, image classification system and related equipment
CN109522942B (en) Image classification method and device, terminal equipment and storage medium
CN111126258A (en) Image recognition method and related device
CN111414915B (en) Character recognition method and related equipment
CN111368656A (en) Video content description method and video content description device
CN113408570A (en) Image category identification method and device based on model distillation, storage medium and terminal
CN112488999B (en) Small target detection method, small target detection system, storage medium and terminal
CN115204183A (en) Knowledge enhancement based dual-channel emotion analysis method, device and equipment
CN113112497B (en) Industrial appearance defect detection method based on zero sample learning, electronic equipment and storage medium
CN111950702A (en) Neural network structure determining method and device
CN115131698A (en) Video attribute determination method, device, equipment and storage medium
CN116151263A (en) Multi-mode named entity recognition method, device, equipment and storage medium
WO2022100607A1 (en) Method for determining neural network structure and apparatus thereof
CN113434722B (en) Image classification method, device, equipment and computer readable storage medium
CN113435531B (en) Zero sample image classification method and system, electronic equipment and storage medium
CN111797862A (en) Task processing method and device, storage medium and electronic equipment
CN113705293A (en) Image scene recognition method, device, equipment and readable storage medium
CN112257840A (en) Neural network processing method and related equipment
CN115439726A (en) Image detection method, device, equipment and storage medium
CN115033700A (en) Cross-domain emotion analysis method, device and equipment based on mutual learning network
CN114924876A (en) Voiceprint recognition method and device based on distributed heterogeneous operation and storage medium
Bırant et al. Classification of scatter plot images using deep learning
CN113408571A (en) Image classification method and device based on model distillation, storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant