CN116168257B - Small sample image classification method, device and storage medium based on sample generation - Google Patents

Small sample image classification method, device and storage medium based on sample generation Download PDF

Info

Publication number
CN116168257B
CN116168257B CN202310436361.XA CN202310436361A CN116168257B CN 116168257 B CN116168257 B CN 116168257B CN 202310436361 A CN202310436361 A CN 202310436361A CN 116168257 B CN116168257 B CN 116168257B
Authority
CN
China
Prior art keywords
sample
class
samples
query
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310436361.XA
Other languages
Chinese (zh)
Other versions
CN116168257A (en
Inventor
赵鹏
赵旭阳
韩莉
叶子龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui University
Original Assignee
Anhui University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui University filed Critical Anhui University
Priority to CN202310436361.XA priority Critical patent/CN116168257B/en
Publication of CN116168257A publication Critical patent/CN116168257A/en
Application granted granted Critical
Publication of CN116168257B publication Critical patent/CN116168257B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/55Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a small sample image classification method, equipment and a storage medium based on sample generation, which comprise the steps of classifying image data by utilizing a small sample image classification model constructed in advance, wherein the small sample image classification model construction step comprises the steps of dividing a data set into a basic class data set and a new class data set; removing the influence of interference samples in each category in the basic category data, and finding out the most representative sample data training feature generator in each category; generating more samples by using the trained feature generator, and updating each class prototype in the new class; calculating the weight of each local feature of the query sample in the query set in each class judgment, and obtaining the weighted feature representation of the query sample in each class judgment; classification is performed by the metric class prototype and the weighted query samples. The invention can improve the attention degree of the target object area and reduce the attention of other irrelevant areas, thereby improving the classification performance of the small sample image.

Description

Small sample image classification method, device and storage medium based on sample generation
Technical Field
The invention relates to the technical field of small sample image classification, in particular to a small sample image classification method based on sample generation.
Background
With the development of the whole social informatization, the generation speed of the digitized data has been advanced at an unprecedented speed. Such data exists in text, images, sound, video, etc., and is stored in structured or unstructured form. The vast amount of data generated has prompted the prosperous development of artificial intelligence in recent years, particularly in the deep learning direction. Deep learning has enjoyed tremendous success due to the large amount of data, efficient algorithms, and high performance hardware devices. However, deep learning techniques that rely on large data still face significant challenges. Conventional deep learning requires a large number of samples with labels for training, and when the sample size of the labels is insufficient, the problem of over-fitting is caused, and the performance of the model is seriously reduced. In reality, building a large-scale standard dataset requires a lot of human and material resources, and not all tasks can get a lot of labeled samples. Thus, the small sample image classification method becomes a hot spot research problem.
The main problem in the task of classifying small sample images is that the number of samples of each class is too small to better express the distribution of the samples of each class. Therefore, a method of performing data expansion by using a small number of labeled support set samples of each class or performing sample expansion by predicting pseudo labels for unlabeled query set samples is an effective way to solve the problem. For example, wang et al (Wang Y X, girshick R, hebert M, et al, low-shot learning from imaginary data [ C ]// Proceedings of the IEEE conference on computer vision and pattern recognment, 2018:7278-7286.) authors propose a small sample image classification method that first generates new samples by adding noise to support set samples to expand the samples, then performs end-to-end optimization on models that include a generator and a classifier, and synchronously updates the generator and classifier parameters so that the generator generates samples that are easy to classify; the authors of Zhang et al (Zhang H, zhang J, koniusz p. Few-shot learning via saliency-guided hallucination of samples [ C ]// Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern recogntion 2019:2770-2779.) utilized a saliency object detection algorithm on the basis of a relational network to segment images into foreground and background, and then combine the foreground and background of different pictures to form more composite images, thereby realizing expansion of the dataset, and the prototype network was expanded by Mengye et al (Mengye Ren, sachin Ravi, eleni Triantafillou, jake Snell, kevin swesky, josh b, tenenbaum, hugo Larochelle, and Richard s. Zemel. Meta-learning for semi-supervised few-shot classification [ C ]// In International Conference on Learning Representations, 2018.). The author firstly distributes a probability belonging to each category for each unlabeled query sample data by using an initial category prototype, namely, each query sample takes different probabilities as an expansion sample of each category, and the category prototype of each category is updated; hariharan et al (Hariharan B, girsheck, R. Low-shot visual recognition by shrinking and hallucinating features [ C ]/Proceedings of the IEEE International Conference on ComputerVision, venice, italy, 2017:3018-3027.) model differences between different samples of the same class in a base class using an automatic encoder, then migrate such differences into a small sample class, fuse the differences with samples in the small sample class, generate new samples in more small sample classes, and implement expansion of a dataset; li et al (Li K, zhangY, li K, fu Y, adversarial feature hallucination networks for few-shot learning [ C ]// Proceedings ofthe 33rd IEEE Conference on Computer Vision and Pattern Recognition.Piscataway,NJ,USA:IEEE Press,2020:2020:13467-13476.) propose a small sample learning algorithm for generating an countermeasure network based on a condition Wasserstein, and a new sample conforming to the real distribution of the small sample category is generated by mutually gaming a generator and a discriminator, so that a training data set is expanded; jian (Jian Y, torreani L. Label hallucination for few-shot classification [ C ]// Proceedings of the AAAI Conference on Artificial Intelligent.2022, 36 (6): 7005-7014.) et al labels data across a base class dataset with pseudo tags using a linear classifier trained on the new class (i.e., the small sample class). For each new class, a large amount of pseudo-tag sample data is derived from the base class data. The entire model is then trimmed using the distillation loss on the pseudo tag base class dataset and the standard cross entropy loss on the new class dataset.
However, the method often adopts a method of directly adding noise or generating a countermeasures network generated sample to realize sample expansion of a support set, but in the problem of small sample classification, the sample generation directly according to the support set sample can cause the generated sample to have no diversity, and different parts of objects possibly exist in the support set sample and the query set sample, so that the difference of visual characteristic distribution is obvious.
Disclosure of Invention
The small sample image classification method based on sample generation provided by the invention can at least solve one of the technical problems.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
a small sample image classification method based on sample generation comprises classifying image data by using a pre-constructed image classification model; the image classification model construction steps are as follows,
s1, dividing a data set into a basic class data set and a new class data set, wherein the basic class data set comprises a sample with a mark and is used for training a feature generator; the new class data set is used for simulating and constructing small sample classification tasks, and each small sample task comprises a support set and a query set;
s2, eliminating the influence of interference samples in each category in the base category data set, and finding out the most representative sample data;
s3, training a feature generator by using sample data most representative of each category in the base class data set;
s4, calculating a support set class prototype in the new class data set, predicting a query set sample label, and taking the sample label as a pseudo label of the query set sample; generating a set number of samples by using a trained feature generator according to the support set samples and the pseudo tags with the credibility meeting the requirements, expanding the support set together with the pseudo tag samples with the credibility meeting the requirements, and then recalculating to obtain updated class prototypes;
s5, calculating the weight of each local feature of the query sample in the query set in each class of judgment, and obtaining the weighted query sample feature in each class of judgment;
and S6, classifying the query samples by measuring the feature similarity of the class prototypes and the weighted query samples.
Further, the step S1 specifically includes:
partitioning a data set into a base class data set to be processed
Figure SMS_1
And new class data set->
Figure SMS_2
Wherein the base class data set contains class-labeled samples for training the feature generator; the new class is used for simulating a small sample classification task; the categories of the base class dataset and the new class dataset are disjoint, i.e. +.>
Figure SMS_3
In order to train the small sample image classification model, small sample image classification tasks are constructed on the new class data set, and each small sample image classification task comprises
Figure SMS_10
Each category comprising +.>
Figure SMS_5
Individual class-marked samples forming a support set
Figure SMS_16
,/>
Figure SMS_8
Wherein->
Figure SMS_17
Is->
Figure SMS_7
Sample number->
Figure SMS_13
For the class label to which the sample corresponds,
Figure SMS_15
to support the total number of set samples +.>
Figure SMS_21
At the same time, each category is alternatively different +.>
Figure SMS_4
The individual samples form a query set->
Figure SMS_11
,/>
Figure SMS_12
Wherein->
Figure SMS_20
Is->
Figure SMS_19
Sample number->
Figure SMS_22
For the class label corresponding to the sample, +.>
Figure SMS_6
For the total number of query set samples, +.>
Figure SMS_14
Is marked as->
Figure SMS_9
A small sample image classification task; the purpose of the small sample image classification task is to learn knowledge from support set samples in the basic class and the new class to construct a model, correctly classify the query set samples, and the query set sample class label>
Figure SMS_18
Only for the calculation of the loss function in model training.
Further, step S2, excluding the influence of the interference sample in each category in the base class data set, and finding the most representative sample data; in particular to the preparation method of the composite material,
s21, assuming that the characteristics of the base classes follow Gaussian distribution, estimating parameters of Gaussian distribution of each class, and calculating the base classes
Figure SMS_23
Is taken as the mean value->
Figure SMS_24
The calculation formula is as follows:
Figure SMS_25
wherein the method comprises the steps of
Figure SMS_26
Is->
Figure SMS_27
No. 4 of the base class>
Figure SMS_28
Sample number->
Figure SMS_29
Is->
Figure SMS_30
The total number of samples of the individual base class;
s22, then the first
Figure SMS_31
Covariance matrix of sample-like data distribution>
Figure SMS_32
The calculation mode of (a) is as follows:
Figure SMS_33
s23, calculating the first step according to the calculated Gaussian distribution parameters
Figure SMS_34
Class->
Figure SMS_35
Sample->
Figure SMS_36
Belongs to the category->
Figure SMS_37
The probability of (2) is:
Figure SMS_38
wherein the method comprises the steps of
Figure SMS_39
Is the dimension of the feature;
s24, setting a threshold value
Figure SMS_40
Filtering out samples with smaller probability values to obtain the first ∈>
Figure SMS_41
A representative sample feature set of class +.>
Figure SMS_42
Figure SMS_43
Further, the step S3 specifically comprises learning training of a feature generator of the self-encoder structure based on the condition variation;
conditional variation self-encoder architecture with an encoder
Figure SMS_44
And a feature generator->
Figure SMS_47
Constructing; encoder->
Figure SMS_50
Sample characterization->
Figure SMS_45
And class label as condition->
Figure SMS_49
As input, generating the mean and variance of the sample, generating the hidden variable by randomly sampling the normal distribution and then re-parameterizing the mean and variance of the sample>
Figure SMS_51
Then the hidden variable ++>
Figure SMS_52
And corresponding category label->
Figure SMS_46
Input to the feature generator->
Figure SMS_48
Thereby generating a new sample;
for classes
Figure SMS_53
Middle->
Figure SMS_54
Sample characteristics->
Figure SMS_55
Loss function trained by feature generator>
Figure SMS_56
The method comprises the following steps:
Figure SMS_57
posterior distribution where the first term is a hidden variable
Figure SMS_58
And a priori distribution->
Figure SMS_59
Between->
Figure SMS_60
Divergence, second term is reconstruction error of feature generator, +.>
Figure SMS_61
Is->
Figure SMS_62
Semantics of classA feature representation;
total loss function for feature generator training
Figure SMS_63
The method comprises the following steps:
Figure SMS_64
wherein the method comprises the steps of
Figure SMS_65
Is the number of categories of the base class.
Further, the step S4 includes initializing each class prototype in the new class by using the feature of the support set sample, then predicting the pseudo tag of the query set sample by using each class prototype, selecting the pseudo tag sample with high reliability and the support set sample to form a seed sample set, generating more samples by using the trained feature generator, and updating each class prototype in the new class;
sample characteristics are expressed as
Figure SMS_66
Wherein->
Figure SMS_67
High, ∈representing sample profile>
Figure SMS_68
Width of representative sample feature map, ++>
Figure SMS_69
The number of channels representing the sample feature map;
the method specifically comprises the following steps:
first, for a support set
Figure SMS_70
Averaging the samples of each class to obtain the original class prototype ++>
Figure SMS_71
The calculation method is as follows:
Figure SMS_72
wherein the method comprises the steps of
Figure SMS_73
Is->
Figure SMS_74
Class prototype of individual class,/>
Figure SMS_75
For the number of samples of each class, +.>
Figure SMS_76
To indicate the function, when the sample +.>
Figure SMS_77
Label->
Figure SMS_78
Is->
Figure SMS_79
When the function value is 1, otherwise, the function value is 0;
then, the sample of the query set is predicted by using the class prototype obtained by calculation
Figure SMS_80
,/>
Figure SMS_81
Category labels of (2) to give a sample of the query set +.>
Figure SMS_82
Marking pseudo tag->
Figure SMS_83
The calculation formula is as follows:
Figure SMS_84
wherein the method comprises the steps of
Figure SMS_85
Is->
Figure SMS_86
Predicted pseudo tag->
Figure SMS_87
Is a similarity measurement function;
then, calculating the credibility score between the pseudo tag sample and the corresponding class prototype by using the obtained pseudo tag query set sample
Figure SMS_88
Figure SMS_89
Finally, each category selects the front with the highest confidence
Figure SMS_90
Pseudo tag samples, and->
Figure SMS_91
The support set samples form a seed sample set of the category;
and then generating from each seed sample using a trained feature generator
Figure SMS_92
Samples, thus obtaining an extended set of support sets +.>
Figure SMS_93
And updating the class prototype to obtain a new class prototype +.>
Figure SMS_94
The specific process is as follows:
Figure SMS_95
further, the step S5 calculates the weight of each local feature of the query sample in the query set in each class of determination, and obtains the weighted query sample feature in each class of determination, which specifically includes,
first, the first is
Figure SMS_98
A class prototype of a class is denoted->
Figure SMS_100
,/>
Figure SMS_103
Wherein each row vector is +.>
Figure SMS_97
,/>
Figure SMS_101
For a local feature representation of the class prototype +.>
Figure SMS_102
The individual query set samples are represented as
Figure SMS_104
Wherein each row vector is +.>
Figure SMS_96
,/>
Figure SMS_99
A local feature representation for the query sample;
because some interference information irrelevant to the category exists in the query sample and the importance exerted by the same local feature in the judgment of different categories is not the same, the cosine similarity between the local feature of the query sample and each local feature of the category prototype is calculated so as to obtain the weight of the local feature
Figure SMS_105
Figure SMS_106
Wherein the method comprises the steps of
Figure SMS_107
,/>
Figure SMS_111
For inquiring sample->
Figure SMS_115
Middle->
Figure SMS_109
Local features->
Figure SMS_112
In calculating and->
Figure SMS_117
Prototype of class->
Figure SMS_119
Weight at similarity; thereby obtaining category->
Figure SMS_108
Weighted feature representation of the query sample in the decision, i.e. in the calculation and +.>
Figure SMS_113
Prototype of class->
Figure SMS_116
In similarity, query sample->
Figure SMS_118
Weight matrix->
Figure SMS_110
And obtains a weighted query sample feature representation +.>
Figure SMS_114
Figure SMS_120
=/>
Figure SMS_121
Figure SMS_122
Representing the hadamard product.
Further, the step 6 specifically includes obtaining the calculated prototypes of each class
Figure SMS_123
,1≤/>
Figure SMS_124
≤/>
Figure SMS_125
And the weighted feature representation of the query sample in the class decision +.>
Figure SMS_126
,1≤/>
Figure SMS_127
≤/>
Figure SMS_128
, 1≤u≤/>
Figure SMS_129
Afterwards, classifying the query set samples, wherein the classification loss is defined as:
Figure SMS_130
according to the technical scheme, according to the small sample image classification method based on sample generation, under the condition that the number of small sample class support set samples is insufficient, the pseudo tag of the predicted query sample is calculated, the pseudo tag sample with high reliability and the support set sample are selected to form a seed sample set, more new samples are generated for each small sample class by utilizing the feature generator trained by base class data, and an expanded support set is constructed together with the pseudo tag sample with high reliability, and meanwhile, class prototypes of all small sample classes are updated. Because of the distribution difference between the support set and the query set samples, classification deviation of the query set samples can be caused, and classification performance is reduced. To alleviate this problem, the present invention obtains a weighted feature representation of a query sample in different category decisions by calculating the weights of the local features of the query sample in the different category decisions. The classification is then performed by measuring the distance between the class prototype and the feature representation of the query sample weighted over the class. Under the condition that a small number of labeled support set samples are only available, a large number of new labeled sample data which accords with the distribution of the support set samples and has diversity can be generated, the small sample data expansion is realized, meanwhile, the importance of local features of the unlabeled query sample in each class judgment is calculated, the feature representation of the query sample in each class judgment is weighted and corrected, and the classification deviation caused by the distribution difference between the support set and the query set is effectively relieved, so that the classification performance of small sample images is improved.
Specifically, the invention has the following beneficial effects:
1. because in the small sample image classification problem, the core problem is that available data samples with labels are very rare, the phenomenon of over fitting of the model can be caused, so that the classification capability of a network model on new type samples is poor.
2. The invention uses the high-credibility pseudo tag sample and the initial support set sample as seed sample set to generate samples, instead of directly adopting a large number of pseudo tag samples to expand the data set, because the pseudo tag samples contain a few low-credibility pseudo tag samples, thereby inevitably introducing noise samples. The sample generation by the feature generator in the invention greatly reduces the possibility of the problem.
3. In the small sample image classification task, each class in the support set contains only a small number of images, and the model needs to be trained according to the support set samples, and then the query set images are classified. However, the distribution difference exists between the support set and the query set sample, for example, the positions of the target object on the support set and the query set are often inconsistent, a complete dog exists on the left side in the image of the support set, and the head of the dog exists on the right side in the image of the query set.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention;
fig. 2 is a block diagram of an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention.
As shown in fig. 1 and 2, the small sample image classification method based on sample generation according to the present embodiment includes the steps of,
s1: partitioning a data set into a base class data set to be processed
Figure SMS_131
And new class data set->
Figure SMS_132
Wherein the base class data set contains a plurality of samples with class labels for training the feature generator; the new class is used for simulating a small sample classification task; the categories of the base class dataset and the new class dataset are disjoint, i.e. +.>
Figure SMS_133
In order to train the small sample image classification model, small sample image classification tasks are constructed on the new class data set, and each small sample image classification task comprises
Figure SMS_152
Each category comprising +.>
Figure SMS_137
Individual (/ ->
Figure SMS_149
The number of (2) is generally small, e.g.>
Figure SMS_141
=1 or->
Figure SMS_146
=5) samples with class labels, constituting the support set +.>
Figure SMS_151
(/>
Figure SMS_154
Wherein->
Figure SMS_138
Is->
Figure SMS_144
Sample number->
Figure SMS_134
For the class label corresponding to the sample, +.>
Figure SMS_143
To support the total number of set samples +.>
Figure SMS_136
) At the same time, each category is alternatively different +.>
Figure SMS_145
The individual samples form a query set->
Figure SMS_139
(/>
Figure SMS_148
Wherein->
Figure SMS_140
Is the first
Figure SMS_147
Sample number->
Figure SMS_153
For the class label corresponding to the sample, +.>
Figure SMS_155
For the total number of query set samples, +.>
Figure SMS_135
) Is marked as->
Figure SMS_142
Small sample image classification tasks. The purpose of the small sample image classification task is to support knowledge construction models from the basic class and the new class in the set samples and to correctly classify the query set samples. Query set sample category label
Figure SMS_150
Only for the calculation of the loss function in model training.
S2: the invention trains a feature generator on the basic class data set, and utilizes the trained feature generator to generate more sample data for new classes, in order to train the feature generator better, eliminates the influence of interference samples in each class in the basic class data set, finds the most representative class sample data for training, and comprises the following steps,
(1) Assuming that the features of the base classes follow a Gaussian distribution, estimating parameters of the Gaussian distribution of each class, and calculating the base classes
Figure SMS_156
Is taken as the mean value->
Figure SMS_157
The calculation formula is as follows:
Figure SMS_158
wherein the method comprises the steps of
Figure SMS_159
Is->
Figure SMS_160
No. 4 of the base class>
Figure SMS_161
Sample number->
Figure SMS_162
Is->
Figure SMS_163
The total number of samples of the individual base class;
(2) Then the first
Figure SMS_164
Covariance matrix of sample-like data distribution>
Figure SMS_165
The calculation mode of (a) is as follows:
Figure SMS_166
(3) Calculating the first based on the calculated Gaussian distribution parameters
Figure SMS_167
Class->
Figure SMS_168
Sample->
Figure SMS_169
Belongs to the category->
Figure SMS_170
The probability of (2) is:
Figure SMS_171
wherein the method comprises the steps of
Figure SMS_172
Is the dimension of the feature;
(4) Setting a threshold value
Figure SMS_173
Filtering out samples with smaller probability values to obtain the first ∈>
Figure SMS_174
A representative sample feature set of class +.>
Figure SMS_175
Figure SMS_176
S3: to generate more samples, the present invention trains a feature generator based on a conditional variation self-encoder structure on a base class containing a large number of marked samples. Conditional variation self-encoder architecture with an encoder
Figure SMS_178
And a feature generator->
Figure SMS_181
The composition is formed. Encoder->
Figure SMS_185
Sample characterization->
Figure SMS_179
And class label as condition->
Figure SMS_182
As input, generating the mean and variance of the sample, generating the hidden variable by randomly sampling the normal distribution and then re-parameterizing the mean and variance of the sample>
Figure SMS_187
Then the hidden variable ++>
Figure SMS_189
And corresponding category label->
Figure SMS_177
Input to the feature generator->
Figure SMS_184
Thereby generating a new sample. For class->
Figure SMS_186
Middle->
Figure SMS_188
Sample characteristics->
Figure SMS_180
Loss function trained by feature generator>
Figure SMS_183
The method comprises the following steps:
Figure SMS_190
posterior distribution where the first term is a hidden variable
Figure SMS_191
And a priori distribution->
Figure SMS_192
Between->
Figure SMS_193
Divergence, second term is reconstruction error of feature generator, +.>
Figure SMS_194
Is->
Figure SMS_195
Semantic feature representation of the class.
Total loss function for feature generator training
Figure SMS_196
The method comprises the following steps:
Figure SMS_197
wherein the method comprises the steps of
Figure SMS_198
Is the number of categories of the base class.
S4: initializing each class prototype in the new class by using the characteristics of the support set sample, then predicting pseudo labels of the query set sample by using each class prototype, selecting the pseudo label sample with high reliability and the support set sample to form a seed sample set, generating more samples by using the trained characteristic generator, and updating each class prototype in the new class. Sample characteristics may be expressed as
Figure SMS_199
Wherein->
Figure SMS_200
High, ∈representing sample profile>
Figure SMS_201
Width of representative sample feature map, ++>
Figure SMS_202
The number of channels representing the sample signature. The method specifically comprises the following steps:
(1) For support set
Figure SMS_203
Averaging the samples of each class to obtain the original class prototype ++>
Figure SMS_204
The calculation method is as follows:
Figure SMS_205
wherein the method comprises the steps of
Figure SMS_206
Is->
Figure SMS_207
Class prototype of individual class,/>
Figure SMS_208
For the number of samples of each class, +.>
Figure SMS_209
To indicate the function, when the sample +.>
Figure SMS_210
Label->
Figure SMS_211
Is->
Figure SMS_212
The function value is 1 when the value is found, otherwise, the function value is 0.
(2) Predicting a sample of a query set using computationally derived class prototypes
Figure SMS_213
(/>
Figure SMS_214
) Category labels of (2) to give a sample of the query set +.>
Figure SMS_215
Marking pseudo tag->
Figure SMS_216
The calculation formula is as follows:
Figure SMS_217
wherein the method comprises the steps of
Figure SMS_218
Is->
Figure SMS_219
Predicted pseudo tag->
Figure SMS_220
Is a similarity measurement function;
(3) Calculating a credibility score between the pseudo tag sample and the corresponding class prototype by using the obtained pseudo tag query set sample
Figure SMS_221
Figure SMS_222
(4) Each category selects the front with the highest confidence
Figure SMS_223
Pseudo tag samples, and->
Figure SMS_224
The support set samples constitute a seed sample set for the category. Then generating +.A. from each seed sample using a trained feature generator>
Figure SMS_225
Samples, thus obtaining an extended set of support sets +.>
Figure SMS_226
And updating the class prototype to obtain a new class prototype +.>
Figure SMS_227
Specifically crossThe process is as follows:
Figure SMS_228
s5: in the small sample image classification task, each class in the support set contains only a small number of images, and the model needs to be trained according to the support set samples, and then the query set images are classified. However, there is a distribution difference between the support set and the query set samples, resulting in classification bias. In order to alleviate the problem, the invention provides the method for acquiring the weighted query set sample of the given category by calculating the weight of each local feature of the query set sample in the judgment of the given category, thereby improving the attention degree of the target object area and reducing the attention to other irrelevant areas, and further improving the classification performance of the small sample image. In particular to the preparation method of the composite material,
first, the first is
Figure SMS_230
A class prototype of a class is denoted->
Figure SMS_234
,/>
Figure SMS_235
Wherein each row vector is +.>
Figure SMS_229
(/>
Figure SMS_233
) For a local feature representation of the class prototype +.>
Figure SMS_237
The individual query set samples are represented as
Figure SMS_238
Wherein each row vector is +.>
Figure SMS_231
(/>
Figure SMS_232
) Is a representation of a local feature of the query sample. Because some interference information irrelevant to the category exists in the query sample and the importance exerted by the same local feature in the judgment of different categories is not the same, the cosine similarity between the local feature of the query sample and each local feature of the class prototype is calculated to obtain the weight of the local feature>
Figure SMS_236
Figure SMS_239
Wherein the method comprises the steps of
Figure SMS_242
(/>
Figure SMS_244
) For inquiring sample->
Figure SMS_249
Middle->
Figure SMS_241
Local features->
Figure SMS_245
In calculating and->
Figure SMS_248
Prototype of class->
Figure SMS_251
Weight at similarity; thereby obtaining category->
Figure SMS_240
Weighted feature representation of the query sample in the decision, i.e. in the calculation and +.>
Figure SMS_246
Prototype of class->
Figure SMS_250
In similarity, query sample->
Figure SMS_252
Weight matrix->
Figure SMS_243
And obtains a weighted query sample feature representation +.>
Figure SMS_247
Figure SMS_253
=/>
Figure SMS_254
Figure SMS_255
Representing the hadamard product.
7. Obtaining calculated prototypes of each class
Figure SMS_256
(1≤/>
Figure SMS_257
≤/>
Figure SMS_258
) And the weighted feature representation of the query sample in the class decision +.>
Figure SMS_259
(1≤/>
Figure SMS_260
≤/>
Figure SMS_261
, 1≤u≤/>
Figure SMS_262
) Afterwards, classifying the query set samples, wherein the classification loss is defined as:
Figure SMS_263
the following examples illustrate the technical effects of embodiments of the present invention:
table 1: comparative experiment results (%)
Figure SMS_264
Table 1 shows the experimental results of the present invention (FSL-BPSG) in comparison with other mainstream methods. As can be seen from the table, on both the CIFAR-FS and CUB mainstream data sets, on
Figure SMS_265
=1 and->
Figure SMS_266
Under the experimental setup of =5 two standards, the best results were obtained for the examples of the present invention.
In yet another aspect, the invention also discloses a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of any of the methods described above.
In yet another aspect, the invention also discloses a computer device comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of any of the methods described above.
In yet another embodiment provided herein, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of any of the methods of the above embodiments.
It may be understood that the system provided by the embodiment of the present invention corresponds to the method provided by the embodiment of the present invention, and explanation, examples and beneficial effects of the related content may refer to corresponding parts in the above method.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program for instructing relevant hardware, where the program may be stored in a non-volatile computer readable storage medium, and where the program, when executed, may include processes in the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. The small sample image classifying method based on sample generation utilizes pre-constructed image classifying model to classify image data, and features that the image classifying model is constructed through the following steps,
s1, dividing a data set into a basic class data set and a new class data set, wherein the basic class data set comprises a sample with a mark and is used for training a feature generator; the new class data set is used for simulating and constructing small sample classification tasks, and each small sample task comprises a support set and a query set;
s2, eliminating the influence of interference samples in each category in the base category data set, and finding out the most representative sample data;
s3, training a feature generator by using sample data most representative of each category in the base class data set;
s4, calculating a support set class prototype in the new class data set, predicting a query set sample label, and taking the sample label as a pseudo label of the query set sample; generating a set number of samples by using a trained feature generator according to the support set samples and the pseudo tags with the credibility meeting the requirements, expanding the support set together with the pseudo tag samples with the credibility meeting the requirements, and then recalculating to obtain updated class prototypes;
s5, calculating the weight of each local feature of the query sample in the query set in each class of judgment, and obtaining the weighted query sample feature in each class of judgment;
s6, classifying the query samples by measuring the feature similarity of the class prototypes and the weighted query samples;
wherein, the step S5 calculates the weight of each local feature of the query sample in the query set in each class of judgment, and obtains the weighted query sample feature in each class of judgment, specifically comprising,
first, the first is
Figure QLYQS_2
A class prototype of a class is denoted->
Figure QLYQS_5
,/>
Figure QLYQS_7
Wherein each row vector is +.>
Figure QLYQS_3
,/>
Figure QLYQS_4
For a local feature representation of the class prototype +.>
Figure QLYQS_8
The individual query set samples are represented as
Figure QLYQS_9
Wherein each row vector is +.>
Figure QLYQS_1
,/>
Figure QLYQS_6
A local feature representation for the query sample;
computing cosine similarity between local features of query sample and each local feature of class prototype to obtain weight of the local feature
Figure QLYQS_10
Figure QLYQS_11
Wherein the method comprises the steps of
Figure QLYQS_15
,/>
Figure QLYQS_19
For inquiring sample->
Figure QLYQS_22
Middle->
Figure QLYQS_14
Local features->
Figure QLYQS_17
In calculating and->
Figure QLYQS_21
Prototype of class->
Figure QLYQS_24
Weight at similarity; thereby obtaining category->
Figure QLYQS_12
Weighted feature representation of the query sample in the decision, i.e. in the calculation and +.>
Figure QLYQS_18
Prototype of class->
Figure QLYQS_20
In similarity, query sample->
Figure QLYQS_23
Weight matrix->
Figure QLYQS_13
And obtains a weighted query sample feature representation +.>
Figure QLYQS_16
Figure QLYQS_25
=/>
Figure QLYQS_26
Figure QLYQS_27
Representing Hadamard product。
2. The sample-based small sample image classification method of claim 1, wherein: the step S1 specifically includes:
partitioning a data set into a base class data set to be processed
Figure QLYQS_28
And new class data set->
Figure QLYQS_29
Wherein the base class data set contains class-labeled samples for training the feature generator; the new class is used for simulating a small sample classification task; the categories of the base class dataset and the new class dataset are disjoint, i.e. +.>
Figure QLYQS_30
Constructing small sample image classification tasks on the new class data set, wherein each small sample image classification task comprises
Figure QLYQS_38
Each category comprising +.>
Figure QLYQS_34
Samples with class marks constitute a support set +.>
Figure QLYQS_46
,/>
Figure QLYQS_33
Wherein->
Figure QLYQS_45
Is->
Figure QLYQS_41
Sample number->
Figure QLYQS_47
For the class label corresponding to the sample, +.>
Figure QLYQS_36
In order to support the total number of set samples,
Figure QLYQS_42
at the same time, each category is alternatively different +.>
Figure QLYQS_31
The individual samples form a query set->
Figure QLYQS_43
Figure QLYQS_32
Wherein->
Figure QLYQS_39
Is->
Figure QLYQS_37
Sample number->
Figure QLYQS_44
For the class label corresponding to the sample, +.>
Figure QLYQS_35
For the total number of query set samples, +.>
Figure QLYQS_40
The purpose of the small sample image classification task is to learn knowledge from support set samples in the basic class and the new class to construct a model, correctly classify the query set samples, and label the query set sample types
Figure QLYQS_48
Only for the calculation of the loss function in model training.
3. The sample-based small sample image classification method of claim 2, wherein: step S2, eliminating the influence of interference samples in each category in the base category data set, and finding out the most representative sample data; in particular to the preparation method of the composite material,
s21, assuming that the characteristics of the base classes follow Gaussian distribution, estimating parameters of Gaussian distribution of each class, and calculating the base classes
Figure QLYQS_49
Is taken as the mean value->
Figure QLYQS_50
The calculation formula is as follows:
Figure QLYQS_51
wherein the method comprises the steps of
Figure QLYQS_52
Is->
Figure QLYQS_53
No. 4 of the base class>
Figure QLYQS_54
Sample number->
Figure QLYQS_55
Is->
Figure QLYQS_56
The total number of samples of the individual base class;
s22, then the first
Figure QLYQS_57
Covariance matrix of sample-like data distribution>
Figure QLYQS_58
The calculation mode of (a) is as follows:
Figure QLYQS_59
s23, calculating the first step according to the calculated Gaussian distribution parameters
Figure QLYQS_60
Class->
Figure QLYQS_61
Sample->
Figure QLYQS_62
Belongs to the category->
Figure QLYQS_63
The probability of (2) is:
Figure QLYQS_64
wherein the method comprises the steps of
Figure QLYQS_65
Is the dimension of the feature;
s24, setting a threshold value
Figure QLYQS_66
Filtering out samples with smaller probability values to obtain the first ∈>
Figure QLYQS_67
A representative sample feature set of classes
Figure QLYQS_68
Figure QLYQS_69
4. A sample-based generated small sample image classification method in accordance with claim 3, wherein: step S3 comprises learning training of a feature generator of the self-encoder structure based on the condition variation;
conditional variation self-encoder architecture with an encoder
Figure QLYQS_71
And a feature generator->
Figure QLYQS_73
Constructing; encoder with a plurality of sensors
Figure QLYQS_75
Sample characterization->
Figure QLYQS_72
And class label as condition->
Figure QLYQS_76
As input, generating the mean and variance of the sample, generating the hidden variable by randomly sampling the normal distribution and then re-parameterizing the mean and variance of the sample>
Figure QLYQS_77
Then the hidden variable ++>
Figure QLYQS_78
And corresponding category label->
Figure QLYQS_70
Input to the feature generator->
Figure QLYQS_74
Thereby generating a new sample;
for classes
Figure QLYQS_79
Middle->
Figure QLYQS_80
Sample characteristics->
Figure QLYQS_81
Loss function trained by feature generator>
Figure QLYQS_82
The method comprises the following steps:
Figure QLYQS_83
posterior distribution where the first term is a hidden variable
Figure QLYQS_84
And a priori distribution->
Figure QLYQS_85
Between->
Figure QLYQS_86
Divergence, second term is reconstruction error of feature generator, +.>
Figure QLYQS_87
Is->
Figure QLYQS_88
Semantic feature representation of the class;
total loss function for feature generator training
Figure QLYQS_89
The method comprises the following steps:
Figure QLYQS_90
wherein the method comprises the steps of
Figure QLYQS_91
Is the number of categories of the base class.
5. The sample-based generated small sample image classification method of claim 4, wherein: step S4 comprises initializing each class prototype in the new class by using the sample characteristics of the support set, then predicting pseudo labels of the query set samples by using each class prototype, selecting the pseudo label samples with high reliability and the support set samples to form a seed sample set together, generating more samples by using a trained characteristic generator, and updating each class prototype in the new class;
sample characteristics are expressed as
Figure QLYQS_92
Wherein->
Figure QLYQS_93
High, ∈representing sample profile>
Figure QLYQS_94
Width of representative sample feature map, ++>
Figure QLYQS_95
The number of channels representing the sample feature map;
the method specifically comprises the following steps:
first, for a support set
Figure QLYQS_96
Averaging the samples of each class to obtain the original class prototype ++>
Figure QLYQS_97
The calculation method is as follows:
Figure QLYQS_98
wherein the method comprises the steps of
Figure QLYQS_101
Is->
Figure QLYQS_104
Class prototype of individual class,/>
Figure QLYQS_106
,/>
Figure QLYQS_100
For the number of samples of each class, +.>
Figure QLYQS_102
To indicate the function, when the sample +.>
Figure QLYQS_103
Label->
Figure QLYQS_105
Is->
Figure QLYQS_99
When the function value is 1, otherwise, the function value is 0;
then, the sample of the query set is predicted by using the class prototype obtained by calculation
Figure QLYQS_107
,/>
Figure QLYQS_108
Category labels of (2) to give a sample of the query set +.>
Figure QLYQS_109
Marking pseudo tag->
Figure QLYQS_110
The calculation formula is as follows:
Figure QLYQS_111
wherein the method comprises the steps of
Figure QLYQS_112
Is->
Figure QLYQS_113
Predicted pseudo tag->
Figure QLYQS_114
Is a similarity measurement function;
then, calculating the credibility score between the pseudo tag sample and the corresponding class prototype by using the obtained pseudo tag query set sample
Figure QLYQS_115
Figure QLYQS_116
Finally, each category selects the front with the highest confidence
Figure QLYQS_117
Pseudo tag samples, and->
Figure QLYQS_118
The support set samples form a seed sample set of the category;
and then generating from each seed sample using a trained feature generator
Figure QLYQS_119
Samples, thus obtaining an extended set of support sets +.>
Figure QLYQS_120
And updating the class prototype to obtain a new class prototype +.>
Figure QLYQS_121
The specific process is as follows:
Figure QLYQS_122
6. the sample-based generated small sample image classification method of claim 5, wherein: the step S6 specifically includes: obtaining the calculated prototypes of each class
Figure QLYQS_123
,1≤/>
Figure QLYQS_124
≤/>
Figure QLYQS_125
And the weighted feature representation of the query sample in the class decision +.>
Figure QLYQS_126
,1≤/>
Figure QLYQS_127
≤/>
Figure QLYQS_128
, 1≤u≤/>
Figure QLYQS_129
Afterwards, classifying the query set samples, wherein the classification loss is defined as:
Figure QLYQS_130
7. a computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method of any of claims 1 to 6.
8. A computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the method of any one of claims 1 to 6.
CN202310436361.XA 2023-04-23 2023-04-23 Small sample image classification method, device and storage medium based on sample generation Active CN116168257B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310436361.XA CN116168257B (en) 2023-04-23 2023-04-23 Small sample image classification method, device and storage medium based on sample generation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310436361.XA CN116168257B (en) 2023-04-23 2023-04-23 Small sample image classification method, device and storage medium based on sample generation

Publications (2)

Publication Number Publication Date
CN116168257A CN116168257A (en) 2023-05-26
CN116168257B true CN116168257B (en) 2023-07-04

Family

ID=86422173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310436361.XA Active CN116168257B (en) 2023-04-23 2023-04-23 Small sample image classification method, device and storage medium based on sample generation

Country Status (1)

Country Link
CN (1) CN116168257B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112598091A (en) * 2021-03-08 2021-04-02 北京三快在线科技有限公司 Training model and small sample classification method and device
CN113592008A (en) * 2021-08-05 2021-11-02 哈尔滨理工大学 System, method, equipment and storage medium for solving small sample image classification based on graph neural network mechanism of self-encoder
CN114444600A (en) * 2022-01-28 2022-05-06 南通大学 Small sample image classification method based on memory enhanced prototype network
CN115731411A (en) * 2022-10-27 2023-03-03 西北工业大学 Small sample image classification method based on prototype generation

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2657857A1 (en) * 2012-04-27 2013-10-30 ATG Advanced Swiss Technology Group AG Method for binary classification of a query image
US11087174B2 (en) * 2018-09-25 2021-08-10 Nec Corporation Deep group disentangled embedding and network weight generation for visual inspection
CN112287954A (en) * 2019-07-24 2021-01-29 华为技术有限公司 Image classification method, training method of image classification model and device thereof
US11263488B2 (en) * 2020-04-13 2022-03-01 International Business Machines Corporation System and method for augmenting few-shot object classification with semantic information from multiple sources
US11816188B2 (en) * 2020-08-31 2023-11-14 Sap Se Weakly supervised one-shot image segmentation
US20220300823A1 (en) * 2021-03-17 2022-09-22 Hanwen LIANG Methods and systems for cross-domain few-shot classification
CN113076437B (en) * 2021-04-13 2023-02-14 华南理工大学 Small sample image classification method and system based on label redistribution
CN114155397B (en) * 2021-11-29 2023-01-03 中国船舶重工集团公司第七0九研究所 Small sample image classification method and system
CN114169442B (en) * 2021-12-08 2022-12-09 中国电子科技集团公司第五十四研究所 Remote sensing image small sample scene classification method based on double prototype network
CN114299362A (en) * 2021-12-27 2022-04-08 南京邮电大学 Small sample image classification method based on k-means clustering
CN114387473A (en) * 2022-01-12 2022-04-22 南通大学 Small sample image classification method based on base class sample characteristic synthesis
CN114387474A (en) * 2022-01-12 2022-04-22 南通大学 Small sample image classification method based on Gaussian prototype classifier
CN114580566A (en) * 2022-03-22 2022-06-03 南通大学 Small sample image classification method based on interval supervision contrast loss
CN114882267A (en) * 2022-03-31 2022-08-09 中国科学院信息工程研究所 Small sample image classification method and system based on relevant region
CN114758320A (en) * 2022-04-18 2022-07-15 中汽创智科技有限公司 Small sample image classification method, device, equipment and medium
CN114943859B (en) * 2022-05-05 2023-06-20 兰州理工大学 Task related metric learning method and device for small sample image classification
CN114782752B (en) * 2022-05-06 2023-09-05 兰州理工大学 Small sample image integrated classification method and device based on self-training
CN115393666A (en) * 2022-08-02 2022-11-25 广东工业大学 Small sample expansion method and system based on prototype completion in image classification

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112598091A (en) * 2021-03-08 2021-04-02 北京三快在线科技有限公司 Training model and small sample classification method and device
CN113592008A (en) * 2021-08-05 2021-11-02 哈尔滨理工大学 System, method, equipment and storage medium for solving small sample image classification based on graph neural network mechanism of self-encoder
CN114444600A (en) * 2022-01-28 2022-05-06 南通大学 Small sample image classification method based on memory enhanced prototype network
CN115731411A (en) * 2022-10-27 2023-03-03 西北工业大学 Small sample image classification method based on prototype generation

Also Published As

Publication number Publication date
CN116168257A (en) 2023-05-26

Similar Documents

Publication Publication Date Title
Liu et al. Learning deep multi-level similarity for thermal infrared object tracking
Liang et al. Efficient adversarial attacks for visual object tracking
CN106599836B (en) Multi-face tracking method and tracking system
CN113221905B (en) Semantic segmentation unsupervised domain adaptation method, device and system based on uniform clustering and storage medium
Lerch-Hostalot et al. Unsupervised steganalysis based on artificial training sets
CN112348849B (en) Twin network video target tracking method and device
CN108491766B (en) End-to-end crowd counting method based on depth decision forest
Ye et al. Learning with noisy labels for robust point cloud segmentation
WO2021243947A1 (en) Object re-identification method and apparatus, and terminal and storage medium
CN112633061A (en) Lightweight FIRE-DET flame detection method and system
CN109818971B (en) Network data anomaly detection method and system based on high-order association mining
CN113920170B (en) Pedestrian track prediction method, system and storage medium combining scene context and pedestrian social relationship
CN107358172B (en) Human face feature point initialization method based on human face orientation classification
CN112766170B (en) Self-adaptive segmentation detection method and device based on cluster unmanned aerial vehicle image
Pang et al. Federated learning for crowd counting in smart surveillance systems
CN114565106A (en) Defense method for federal learning poisoning attack based on isolated forest
KR102122168B1 (en) Selecting apparatus and selecting method for sea fog removing prediction model learning method and prediction apparatus and prediction method for sea fog removing
CN116168257B (en) Small sample image classification method, device and storage medium based on sample generation
CN117424754A (en) Defense method, terminal and storage medium for cluster federal learning attack
Yin et al. Adversarial attack, defense, and applications with deep learning frameworks
CN116740495A (en) Training method and defect detection method for defect detection model of road and bridge tunnel
CN116340869A (en) Distributed CatB body detection method and equipment based on red fox optimization algorithm
CN116541592A (en) Vector generation method, information recommendation method, device, equipment and medium
CN111160077A (en) Large-scale dynamic face clustering method
KR20230061925A (en) Apparatus and Method for Training Network Intrusion Detection Model Based on Extended Training Data

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant