CN111753863A - Image classification method and device, electronic equipment and storage medium


Info

Publication number
CN111753863A
CN111753863A (application CN201910295876.6A)
Authority
CN
China
Prior art keywords
sample
image
type
determining
classification model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910295876.6A
Other languages
Chinese (zh)
Inventor
潘滢炜 (Yingwei Pan)
姚霆 (Ting Yao)
梅涛 (Tao Mei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201910295876.6A
Publication of CN111753863A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/24 — Classification techniques
    • G06F18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/0002 — Inspection of images, e.g. flaw detection
    • G06T7/0012 — Biomedical image inspection
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20081 — Training; Learning
    • G06T2207/20084 — Artificial neural networks [ANN]
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30004 — Biomedical image processing

Abstract

The embodiment of the invention discloses an image classification method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring an image to be processed and inputting the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training, in an unsupervised learning manner, a first sample with a standard type label and a second sample without the standard type label; and determining the image type of the image to be processed according to the output result of the target image classification model. The image classification model is iteratively trained in an unsupervised manner on a large number of second samples without standard type labels and a small number of first samples with standard type labels, so manual classification and labeling of every sample image are not needed. This reduces the processing difficulty and processing time of the sample data, and further reduces the total training time of the target image classification model.

Description

Image classification method and device, electronic equipment and storage medium
Technical Field
Embodiments of the present invention relate to image processing technologies, and in particular, to an image classification method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of electronic technologies and the internet, particularly the mobile internet, electronic devices, especially intelligent mobile terminals, have become increasingly powerful, and users can install various applications on an intelligent mobile terminal according to their own needs to handle a variety of tasks. Image recognition, for example, is implemented by an application installed on the electronic device.
At present, in order to recognize an image and identify the category to which it belongs, the related art generally uses a machine learning model, which is typically obtained by supervised training on a large amount of sample data. Supervised training of such a machine learning model requires a large number of data samples, and each piece of sample data must be manually classified; the workload of this sampling process is therefore huge, which increases the training difficulty of the machine learning model. If, on the other hand, the amount of sample data is reduced, the image classification accuracy of the machine learning model becomes poor.
Disclosure of Invention
The embodiment of the invention provides an image classification method and device, electronic equipment and a storage medium, so as to reduce training difficulty while ensuring the accuracy of the machine learning model.
In a first aspect, an embodiment of the present invention provides an image classification method, including:
acquiring an image to be processed, and inputting the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training a first sample with a standard type label and a second sample without the standard type label in an unsupervised learning manner;
and determining the image type of the image to be processed according to the output result of the target image classification model.
In a second aspect, an embodiment of the present invention further provides an image classification apparatus, where the apparatus includes:
a to-be-processed image acquisition module, configured to acquire an image to be processed and input the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training, in an unsupervised learning manner, a first sample with a standard type label and a second sample without the standard type label;
and the image type determining module is used for determining the image type of the image to be processed according to the output result of the target image classification model.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the image classification method according to any embodiment of the present application.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the image classification method according to any embodiment of the present application.
According to the technical scheme provided by the embodiment of the invention, the image classification model is iteratively trained in an unsupervised manner on a large number of second samples without standard type labels and a small number of first samples with standard type labels to obtain a target image classification model with an image classification function. Each sample image does not need to be manually classified and labeled, which reduces the processing difficulty and processing time of the sample data and further reduces the total training time of the target image classification model. Meanwhile, a small number of first samples with standard type labels are used for auxiliary training and for correction during the unsupervised training process, which guarantees the training precision of the target image classification model and improves image classification accuracy. A target image classification model trained in this way classifies images to be processed with high speed and high precision.
Drawings
Fig. 1 is a schematic flowchart of an image classification method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an image classification method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of sample data and class center enhanced according to the second embodiment of the present invention;
fig. 4 is a flowchart illustrating an image classification method according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of an image classification apparatus according to a fourth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flow chart of an image classification method according to an embodiment of the present invention. The method is applicable to the case where an image classification model obtained by unsupervised training is used to classify an image, can be executed by the image classification apparatus provided by an embodiment of the present invention, and specifically includes the following steps:
s110, obtaining an image to be processed, and inputting the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training in an unsupervised learning mode according to a first sample with a standard type label and a second sample without the standard type label.
And S120, determining the image type of the image to be processed according to the output result of the target image classification model.
The image to be processed may be an image stored locally on the electronic device, or an image acquired in real time by an image acquisition module of the electronic device. Electronic devices may include, but are not limited to, cell phones, tablets, computers, smart cameras, and servers. In this embodiment, images to be processed are classified and their image type is determined, which makes it convenient to process each image according to its type; optionally, the images to be processed may be classified and stored according to image type. The image type may include, but is not limited to, an animal type, a person type, a landscape type, a document type, and so on. In some embodiments, the image to be processed may be a medical inspection image; the medical inspection images are classified, the image type of each medical inspection image is determined, and the images are stored by type to facilitate management, retrieval and comparison of medical inspection images of the same type. For example, the image type of a medical inspection image may include, but is not limited to, a head inspection image, a leg inspection image, a breast inspection image, and the like. Optionally, the target object corresponding to the image to be processed may be classified according to the image type, where the target object is the shooting target of the image to be processed. Illustratively, the target object may be a logistics package to be sorted: by classifying images of logistics packages, the type of each package is determined from the image type of its image, and the packages are sorted according to their types, so that logistics packages are sorted quickly and the process of sorting them manually is replaced.
In this embodiment, the images to be processed are classified by a pre-trained target image classification model. The target image classification model may be a strong classifier composed of a plurality of weak classifiers, such as a regression tree model or a boosting tree model, or may be a deep learning neural network model, such as a convolutional neural network model. Optionally, the target image classification model is a prototype network model (prototypical networks). A prototype network model maps sample data into a space and determines the prototype (class center point) of each type by taking the mean of the sample data of the same type; the image type of an image to be processed is then determined by the distance between the image and the prototype of each type, where the distance to the prototype of the same type is small and the distance to the prototypes of different types is large.
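As a minimal sketch of the prototype-network classification rule described above, the following assumes samples are already embedded as vectors (the function and variable names here are illustrative, not from the patent): each class center point is the mean of the embedded samples of that type, and a query image is assigned to the type whose center point is nearest.

```python
import numpy as np

def class_prototypes(embeddings, labels):
    """Return {image type: mean embedding} — the class center point per type."""
    return {c: embeddings[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(query, prototypes):
    """Assign the query to the image type with the nearest class center point."""
    return min(prototypes, key=lambda c: np.linalg.norm(query - prototypes[c]))

# Toy 2-D "embeddings" for two image types
emb = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
lab = np.array([0, 0, 1, 1])
protos = class_prototypes(emb, lab)
print(classify(np.array([4.8, 5.2]), protos))  # nearest to the type-1 prototype
```

A query close to the type-1 cluster is assigned type 1, matching the "small distance to the same-type prototype" property described above.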
In this embodiment, the target image classification model is obtained by unsupervised learning training on a first sample with a standard type label and a second sample without the standard type label, where the number of sample data in the first sample is small and the number of sample data in the second sample is large. Training the image classification model with a large number of unlabeled second samples and a small number of labeled first samples reduces the workload of manually classifying sample data and setting labels before training, shortens the total time consumed in training the target image classification model, and improves its training efficiency. The standard type label of the first sample may be set according to the image type of the first sample after that type has been identified. For example, the first sample and the second sample may be sample data (sample images) downloaded from a network; the image type of the first sample is identified, and the standard type label of the first sample is set according to the correspondence between image types and type labels. For example, if the image type of the first sample is landscape, the standard type label of the first sample is set to a landscape label. The image type of the first sample may be determined by manual identification.
For example, taking a prototype network model as an example, the unsupervised training process of the target image classification model may be as follows. The first sample contains at least two pieces of sample data for each image type, and a standard type label is set for each piece of sample data in the first sample; the second sample contains a plurality of pieces of sample data, which may be obtained by downloading over a network. A first class center point of each image type is determined according to the sample data in the first sample. The pieces of sample data in the second sample are predicted by an initially established image classification model to obtain predicted image types, from which a second class center point of each image type is determined. A loss function is determined according to the difference between the first class center point and the second class center point of the same image type, and the initially established image classification model is adjusted according to the loss function, yielding an updated image classification model. The sample data in the second sample is then predicted again by the updated image classification model, a new second class center point is determined based on the prediction result, a new loss function is determined accordingly, and the updated image classification model is iteratively adjusted according to the new loss function. When the first class center point and the second class center point of the same image type coincide, or their difference is smaller than a preset error, the training of the target image classification model is determined to be complete.
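The stopping criterion of the iterative process above can be sketched as follows. This is a toy illustration with assumed names: the re-prediction of the second sample is replaced by a stand-in update that moves the second class center point toward the first, and training stops once the two centers differ by less than a preset error.

```python
import numpy as np

def centers_converged(first_center, second_center, eps=1e-3):
    """True when the first and second class center points (nearly) coincide."""
    return np.linalg.norm(first_center - second_center) < eps

first = np.array([1.0, 2.0])
for step in range(100):
    # stand-in for "predict the second sample and recompute its class center":
    # the gap to the first class center halves on every iteration
    second = first + np.array([0.5, -0.5]) * (0.5 ** step)
    if centers_converged(first, second, eps=1e-3):
        break
print(step)
```

With the gap halving each iteration, the preset error of 1e-3 is reached after a handful of steps, mirroring the "difference smaller than a preset error" termination condition.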
It should be noted that, during model training, a loss function layer may be connected to the output layer of the target image classification model to calculate the loss function. Correspondingly, after the training of the target image classification model is finished, the loss function layer is removed.
In this embodiment, the image to be processed is input into the target image classification model to obtain its output result, where the output result may be the probability that the image to be processed belongs to each image type, and the image type with the highest probability is determined as the image type of the image to be processed.
According to the technical scheme of this embodiment, unsupervised iterative training is performed on the image classification model with a large number of second samples without standard type labels and a small number of first samples with standard type labels, so as to obtain a target image classification model with an image classification function. Manual classification and labeling of every sample image are not needed, which reduces the processing difficulty and processing time of the sample data and further reduces the total training time of the target image classification model. Meanwhile, a small number of first samples with standard type labels are used for auxiliary training and for correction during the unsupervised training process, which guarantees the training precision of the target image classification model and improves image classification accuracy. A target image classification model trained in this way classifies images to be processed with high speed and high precision.
Example two
Fig. 2 is a schematic flow chart of an image classification method provided in the second embodiment of the present invention. On the basis of the first embodiment, an unsupervised training method for the target image classification model is provided, and the method specifically includes:
s210, establishing a first image classification model according to the first sample with the standard type label.
S220, predicting the second sample without the standard type label according to the first image classification model, and generating a prediction type label of the second sample according to a prediction result.
S230, determining a first loss function according to the first sample, the standard type label of the first sample, the second sample and the prediction type label of the second sample.
S240, adjusting parameters of the first image classification model according to the first loss function, and generating the target image classification model.
S250, obtaining an image to be processed, inputting the image into a pre-trained target image classification model, and determining the image type of the image to be processed according to the output result of the target image classification model.
In step S210, the first image classification model established based on the first sample with the standard type label may be built and trained in a supervised manner on a small number of first samples. Because the quantity of sample data in the first sample is small, the first image classification model is generated quickly, which further reduces the total training time of the target image classification model.
The sample data in the second sample is input into the first image classification model, and a predicted image type of the sample data is determined according to the output result of the first image classification model: the output result is the probability that the sample data belongs to each image type, the image type with the highest probability is determined as the predicted image type of the sample data, and a prediction type label is set for the sample data according to the predicted image type. The standard type label of the first sample and the prediction type label of the second sample may each be a number, a character, a symbol, or a character string composed of a combination of at least one of these, and are used to represent the image type of the sample data.
In this embodiment, the prediction accuracy of the prediction type labels of the second sample is evaluated with the aid of the first sample and its standard type labels: when the prediction accuracy is high, the first loss function is small, and conversely, when the prediction accuracy is low, the first loss function is large. Optionally, determining the first loss function according to the first sample, the standard type label of the first sample, the second sample and the prediction type label of the second sample includes: determining a first class center point of each image type according to the first sample and the standard type label of the first sample; determining a second class center point of each image type according to the second sample and the prediction type label of the second sample; determining a third class center point of each image type according to the first sample, the standard type label of the first sample, the second sample and the prediction type label of the second sample; and determining the first loss function according to at least two of the first class center point, the second class center point and the third class center point of each image type. For example, referring to fig. 3, fig. 3 is a schematic diagram of sample data and class center points provided in the second embodiment of the present invention. Fig. 3 includes three image types, each comprising a plurality of pieces of sample data and a class center point, where the class center point of each image type is determined according to the sample data of that image type.
Optionally, taking the first class center point as an example, the method for determining the first class center point includes: determining the image type of the first sample according to the standard type label; for the first sample of any image type, extracting a feature vector of the first sample, and determining the position of the first sample according to the feature vector and a preset function; and determining the first class center point according to the position of the first sample. The electronic equipment is preset with a correspondence between standard type labels and image types; the image type of sample data in the first sample is determined according to its standard type label, and the first class center point of an image type is determined according to the sample data of that same image type. Since the first image classification model is a multi-layer network model, the output feature vector of a preset level in the first image classification model may be used as the feature vector of the sample data. The sample data is projected into a metric space by the first image classification model; for example, the embedding $f_\theta(x_i)$ determines the position of sample data $x_i$ in the metric space, and the first class center point of the c-th image type is then determined according to the following formula (1), where the c-th image type is any one of all image types:

$$\mu_c^{l} = \frac{1}{|S_c^{l}|} \sum_{x_i \in S_c^{l}} f_\theta(x_i) \qquad (1)$$

where $\mu_c^{l}$ is the first class center point of the c-th image type, determined from the first sample with standard type labels belonging to the c-th image type; $S_c^{l}$ is the set of first-sample data with standard type labels belonging to the c-th image type; $|S_c^{l}|$ is the number of sample data in that set; $x_i$ is sample data; and $\theta$ is the parameter that needs to be learned, which determines the positions of the sample data.
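Formula (1) has a direct numpy rendering. In this sketch, `f_theta` is a stand-in embedding (an assumption; in the patent it is the output of a preset level of the first image classification model), and the class center point is the mean of the embedded labeled samples of type c.

```python
import numpy as np

def f_theta(x, theta=1.0):
    # Stand-in embedding parameterized by theta (a real model would be a
    # multi-layer network; here a simple scaling suffices for illustration)
    return theta * x

def first_class_center(samples, labels, c, theta=1.0):
    """Formula (1): mean of the embedded samples whose standard label is c."""
    s_c = samples[labels == c]                 # first-sample data of type c
    return f_theta(s_c, theta).sum(axis=0) / len(s_c)

x = np.array([[1.0, 3.0], [3.0, 1.0], [10.0, 10.0]])
y = np.array([0, 0, 1])
print(first_class_center(x, y, c=0))  # mean of the two type-0 samples
```

The second and third class center points of formulas (2) and (3) follow the same pattern, using the prediction type labels and the union of both samples respectively.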
Correspondingly, the method for determining the second class center point includes: determining the image type of the second sample according to the prediction type label; for the second sample of any image type, extracting a feature vector of the second sample, and determining the position of the second sample according to the feature vector and a preset function; and determining the second class center point according to the position of the second sample. Similarly, the embedding $f_\theta(x_j)$ determines the position of the sample data in the metric space, and the second class center point of the c-th image type is determined according to the following formula (2):

$$\mu_c^{u} = \frac{1}{|S_c^{u}|} \sum_{x_j \in S_c^{u}} f_\theta(x_j) \qquad (2)$$

where $\mu_c^{u}$ is the second class center point of the c-th image type, determined from the second sample belonging to the c-th image type; $S_c^{u}$ is the set of second-sample data belonging to the c-th image type; $|S_c^{u}|$ is the number of sample data in that set; and $x_j$ is sample data.
Correspondingly, the method for determining the third class center point includes: determining the image type of the first sample according to the standard type label and the image type of the second sample according to the prediction type label; for the first sample and the second sample of any image type, extracting the feature vectors of the first sample and the second sample, and determining the positions of the sample data in the first sample and the second sample according to the feature vectors and a preset function; and determining the third class center point according to the positions of the sample data in the first sample and the second sample. Similarly, the third class center point of the c-th image type is determined according to the following formula (3):

$$\mu_c^{lu} = \frac{1}{|S_c^{l}| + |S_c^{u}|} \sum_{x_i \in S_c^{l} \cup S_c^{u}} f_\theta(x_i) \qquad (3)$$
in the training process of the first image classification model, a first sample is sequentially input into the first image classification model, output characteristic vectors of preset levels in the first image classification model and standard image types of an output layer are extracted, the characteristic vectors and the standard image types of all the sample data are stored in a loss function layer, and a first type central point of each image type is determined by the loss function layer according to the characteristic vectors and the standard image types of all the sample data in the first sample. Correspondingly, the second sample is input into the first image classification model, the output characteristic vector of a preset level and the predicted image type of an output layer in the first image classification model are extracted, the characteristic vector and the predicted image type of each sample data are stored in a loss function layer, the loss function layer determines a second center point of each image type according to the characteristic vector and the predicted image type of all sample data in the second sample, and meanwhile, the loss function layer determines a third center point of each image type according to the stored characteristic vector and standard image type of all sample data in the first sample, and the stored characteristic vector and predicted image type of all sample data in the second sample.
In this embodiment, the first class center point, the second class center point and the third class center point of each image type are respectively determined according to the first sample and its standard type labels, the second sample and its prediction type labels, and the combination of both, and the first loss function is determined according to the first, second and third class center points. The first loss function characterizes the consistency of the first, second and third class center points: the worse their consistency, the larger the first loss function. Optionally, determining the first loss function according to at least two of the first class center point, the second class center point and the third class center point of each image type includes: for any image type, determining a class center point difference according to at least two of the first class center point, the second class center point and the third class center point; and determining the first loss function from the mean of the class center point differences of the image types. The difference between any two of the first, second and third class center points of the current image type may be determined as the class center point difference of the current image type; alternatively, three differences may be obtained from the pairwise differences between the first, second and third class center points of the current image type, and the class center point difference of the current image type determined from those three differences.
The class center point differences of the various image types are averaged to determine a first loss function. Optionally, the first class center point, the second class center point and the third class center point are mapped to a Hilbert space (RKHS) in which class center point differences are determined, where the Hilbert space is a generalization of an n-dimensional euclidean space and can be regarded as an "infinite-dimensional euclidean space", and the differences between the class center points in the Hilbert space represent distances between class data distributions, where the differences between the class center points can be represented by euclidean distances. Determining a first loss function according to the following equation (4):
L_general^{s,t,s∪t} = (1/C) Σ_{c=1}^{C} ( ‖μ_c^s − μ_c^t‖² + ‖μ_c^s − μ_c^{s∪t}‖² + ‖μ_c^t − μ_c^{s∪t}‖² )   (4)

where L_general^{s,t,s∪t} is the first loss function determined from the first-class, second-class, and third-class center points, μ_c^s, μ_c^t, and μ_c^{s∪t} are respectively the first-class, second-class, and third-class center points of the c-th image type in the Hilbert space, and C is the number of image types. It should be noted that the first loss function can also be a single pairwise term, i.e. L_general^{s,t}, L_general^{s,s∪t}, or L_general^{t,s∪t}, formed from the corresponding pair of class center points.
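For illustration only, the class center points and the consistency-based first loss described above can be sketched as follows. This sketch assumes features are plain vectors, takes the per-type feature mean as the center point, and substitutes squared Euclidean distance for the kernel-space distance; none of these choices is prescribed by the embodiment, which operates on network features:

```python
import numpy as np

def class_center_points(features, labels, num_types):
    """Center point per image type: the mean feature vector of the
    samples labeled (or pseudo-labeled) with that type."""
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(num_types)])

def center_consistency_loss(mu_first, mu_second, mu_third):
    """First-loss sketch: mean over image types of the summed pairwise
    squared distances between the three classes of center points."""
    d = (np.sum((mu_first - mu_second) ** 2, axis=1)
         + np.sum((mu_first - mu_third) ** 2, axis=1)
         + np.sum((mu_second - mu_third) ** 2, axis=1))
    return d.mean()
```

When the three classes of center points coincide for every image type, the loss is zero, matching the consistency interpretation above.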
In this embodiment, the first loss function may further include a classification error of the first sample, i.e. the first loss function is the sum of L_general^{s,t,s∪t} and a classification term L_cls, where the classification term characterizes the error of the first sample in the sample data set against its standard type label. For sample data x_i in the first sample, the classification error is

L_cls(x_i) = −log P(y_i = c | x_i)

where P(y_i = c | x_i) is the probability that sample data x_i belongs to the c-th image type, and

P(y_i = c | x_i) = exp(−d(f(x_i), μ_c)) / Σ_{c'=1}^{C} exp(−d(f(x_i), μ_{c'}))

where d is a distance function, which may be the Euclidean distance in this embodiment, f(x_i) is the feature vector of x_i, and c' ranges over all C image types. The meaning of the formula is that the probability of each image type is normalized against the probabilities of all C image types.
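The distance-based probability and the resulting classification error can be sketched as follows. This is a minimal illustration assuming precomputed feature vectors and center points; `type_probabilities` and `classification_error` are hypothetical helper names, not part of the embodiment:

```python
import numpy as np

def type_probabilities(feature, centers):
    """P(type c | x): softmax over the negative Euclidean distances
    from the feature vector to each class center point."""
    dists = np.linalg.norm(centers - feature, axis=1)
    logits = -dists
    e = np.exp(logits - logits.max())  # subtract max for numerical stability
    return e / e.sum()

def classification_error(feature, centers, true_type):
    """Per-sample classification error: negative log-probability of
    the sample's standard type label."""
    return -np.log(type_probabilities(feature, centers)[true_type])
```

A sample close to a center point receives a high probability for that type, so the error is small when the nearest center matches the standard type label.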
Correspondingly, after the first loss function is determined, it is back-propagated through the first image classification model, and the network parameters of the first image classification model, which at least include the weights, are adjusted by gradient descent. Optionally, after adjusting the parameters of the first image classification model according to the loss function, the method further includes: re-predicting the second sample with the adjusted first image classification model and updating the prediction type label of the second sample; determining a new loss function according to the first sample, the standard type label of the first sample, the second sample, and the updated prediction type label of the second sample; and iteratively adjusting the parameters of the first image classification model according to the new loss function until the new loss function is smaller than a preset threshold, thereby generating the target image classification model.
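The iterative adjustment just described can be sketched as a simplified self-training loop. As an assumption for illustration, the "model" here is reduced to a set of class center points and pseudo labels are re-predicted by nearest center; the embodiment itself adjusts network weights by gradient descent rather than recomputing means:

```python
import numpy as np

def nearest_center_labels(features, centers):
    """Re-predict type labels: each sample receives the type of its
    nearest class center point (Euclidean distance)."""
    d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
    return d.argmin(axis=1)

def self_train(first_feats, first_labels, second_feats, num_types,
               threshold=1e-3, max_iters=50):
    """Alternately update pseudo labels and class center points until the
    center-consistency loss falls below the preset threshold."""
    mu_first = np.stack([first_feats[first_labels == c].mean(axis=0)
                         for c in range(num_types)])
    centers = mu_first
    for _ in range(max_iters):
        pseudo = nearest_center_labels(second_feats, centers)
        mu_second = np.stack([
            second_feats[pseudo == c].mean(axis=0) if np.any(pseudo == c)
            else mu_first[c]  # fall back if a type received no pseudo labels
            for c in range(num_types)])
        all_feats = np.concatenate([first_feats, second_feats])
        all_labels = np.concatenate([first_labels, pseudo])
        mu_third = np.stack([all_feats[all_labels == c].mean(axis=0)
                             for c in range(num_types)])
        loss = np.mean(np.sum((mu_first - mu_second) ** 2, axis=1)
                       + np.sum((mu_first - mu_third) ** 2, axis=1)
                       + np.sum((mu_second - mu_third) ** 2, axis=1))
        centers = mu_third  # use the combined center points for the next round
        if loss < threshold:
            break
    return centers, pseudo, loss
```

On well-separated toy clusters the loop converges in one or two iterations; the stopping criterion mirrors the "new loss function smaller than a preset threshold" condition above.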
According to the technical scheme provided by this embodiment, a first-class center point is determined from the first sample, a second-class center point from the second sample, and a third-class center point from the first and second samples jointly. A loss function is determined from the differences among the first-class, second-class, and third-class center points of the same image type, the parameters of the first image classification model are iteratively adjusted, and the predicted image types of the second sample are updated based on the updated model. The differences among the three classes of center points of the same image type are thereby continuously reduced until, for each image type, they are close to or even coincident, and the target image classification model is generated. By continuously updating the first image classification model and the prediction results for the second sample, which carries no standard type label, the scheme trains the first image classification model in an unsupervised manner, replacing the supervised model training methods of the related art, greatly reducing the processing workload on sample data, and shortening the total training time of the target image classification model.
EXAMPLE III
Fig. 4 is a schematic flow chart of an image classification method provided in a third embodiment of the present invention. On the basis of the above embodiments, the unsupervised training method of the target image classification model is optimized. The method specifically includes:
s310, establishing a first image classification model according to the first sample with the standard type label.
S320, predicting the second sample without the standard type label according to the first image classification model, and generating a prediction type label of the second sample according to a prediction result.
S330, determining a first type central point of each image type according to the first sample and the standard type label of the first sample.
S340, determining a second class center point of each image type according to the second sample and the prediction type label of the second sample.
S350, determining a third class center point of each image type according to the first sample, the standard type label of the first sample, the second sample and the prediction type label of the second sample.
S360, determining a first loss function according to at least two items of the first-class central point, the second-class central point and the third-class central point of each image type.
S370, determining first distribution probabilities of the test sample in various image types according to the first type central point of each image type.
S380, determining second distribution probabilities of the test sample in various image types according to the second type central point of each image type.
S390, determining third distribution probabilities of the test sample in various image types according to the third class central point of each image type.
S3100, determining a second loss function according to at least two of the first distribution probability, the second distribution probability and the third distribution probability of the test sample in various image types.
S3110, adjusting parameters of the first image classification model according to the first loss function and/or the second loss function, and generating the target image classification model.
S3120, acquiring an image to be processed, inputting the image into a pre-trained target image classification model, and determining the image type of the image to be processed according to the output result of the target image classification model.
In this embodiment, the classification of an image may be performed by determining the Euclidean distances between the image and the class center points of the various image types, so the image type of the image is determined by the nearest class center point. Determining the first-class, second-class, or third-class center point of each image type yields three metric spaces for image classification: a metric space composed of the first-class center points of the image types, one composed of the second-class center points, and one composed of the third-class center points. The same test sample is classified in each of the three metric spaces, the distribution probability of the test sample over the image types is determined in each metric space, and the first image classification model is then adjusted according to the differences among these distribution probabilities, so as to reduce the differences among the distribution probabilities of the test sample across the metric spaces.
In step S370, the first distribution probability is determined as follows: extract a feature vector of the test sample; determine a first distance between the feature vector of the test sample and the first-class center point of each image type; and determine a first probability that the test sample belongs to each image type according to the first distance, where the first probability is negatively correlated with the first distance, and the first distribution probability is determined from the first probabilities of the test sample over all image types. Extracting the feature vector of the test sample may consist of inputting the test sample to the first image classification model and taking the output feature vector of the prediction layer of the first image classification model as the feature vector of the test sample. The Euclidean distance between the feature vector of the test sample and the first-class center point of each image type, i.e. the first distance, is then calculated. For any image type, the first probability that the test sample belongs to the current image type is negatively correlated with the first distance between the test sample and the first-class center point of the current image type.
Correspondingly, in step S380, the method for determining the second distribution probability includes: extracting a feature vector of the test sample; determining a second distance between the feature vector of the test sample and a second class center point of each image type; and determining a second probability of the test sample belonging to each image type according to the second distance, wherein the second probability is inversely related to the second distance, and determining the second distribution probability according to the second probability of the test sample belonging to each image type. In step S390, the method for determining the third distribution probability includes: extracting a feature vector of the test sample; determining a third distance between the feature vector of the test sample and a third class center point of each image type; determining a third probability that the test sample belongs to each image type according to the third distance, wherein the third probability is inversely related to the third distance, and determining the third distribution probability according to the third probability that the test sample belongs to each image type.
Optionally, determining the second loss function according to at least two of the first distribution probability, the second distribution probability, and the third distribution probability of the test sample in the various image types includes: determining a probability distribution difference of the test sample according to at least two of the first, second, and third distribution probabilities; and determining the second loss function according to the probability distribution difference of the test sample. In this embodiment, the probability distribution difference between two distribution probabilities is represented by the KL divergence. The KL divergence of any two of the first, second, and third distribution probabilities may be taken directly as the probability distribution difference of the test sample, or several such divergences may be combined to determine it. The probability distribution differences of the various test samples are averaged to determine the second loss function. Illustratively, the second loss function may be determined according to the following equation (6):
L_task^{s,t,s∪t} = (1/(|S^s| + |S^t|)) Σ_{x_i ∈ S^s ∪ S^t} [ D_KL(p^s(x_i) ‖ p^{s∪t}(x_i)) + D_KL(p^t(x_i) ‖ p^{s∪t}(x_i)) + D_KL(p^s(x_i) ‖ p^t(x_i)) ]   (6)

where L_task^{s,t,s∪t} is the second loss function determined jointly from the first, second, and third distribution probabilities, p^s(x_i) is the first distribution probability, p^t(x_i) is the second distribution probability, p^{s∪t}(x_i) is the third distribution probability, D_KL(·‖·) is the KL divergence function, |S^s| is the first number of samples, and |S^t| is the second number of samples, with D_KL(p ‖ q) = Σ_{c=1}^{C} p(c) log(p(c)/q(c)). The remaining divergences, such as D_KL(p^{s∪t}(x_i) ‖ p^s(x_i)) and D_KL(p^{s∪t}(x_i) ‖ p^t(x_i)), can be obtained in the same way and will not be described in detail herein.

In some embodiments, the second loss function may also be a single pairwise term, i.e. L_task^{s,t}, L_task^{s,s∪t}, or L_task^{t,s∪t}, and is not limited herein.
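The KL-divergence-based probability distribution difference can be sketched as follows. This is an illustrative sketch only: the particular combination of divergences for a single test sample is one of the several variants the text allows, and the small epsilon clamp is an assumption for numerical safety:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) between two distribution probability vectors
    over the C image types."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def distribution_alignment_loss(p_first, p_second, p_third):
    """Second-loss sketch for one test sample: KL divergences from the
    first and second distribution probabilities to the third."""
    return kl_divergence(p_first, p_third) + kl_divergence(p_second, p_third)
```

The loss vanishes exactly when the three distribution probabilities coincide, which matches the goal of making the test sample's distributions agree across the three metric spaces.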
In this embodiment, the first loss function is determined according to the class center point differences, and the second loss function according to the distribution probability differences of the test sample. The parameters of the first image classification model may be adjusted based on the first loss function or the second loss function individually, or based on the two together, until the first-class, second-class, and third-class center points of the same image type are close to or coincident and the distribution probabilities of the test sample over the various image types are the same or similar, at which point training of the target image classification model is determined to be complete.
The new loss function determined jointly by the first loss function and the second loss function at this time may be:

L = L_cls + α · L_general^{s,t,s∪t} + β · L_task^{s,t,s∪t}

where α and β are preset constants. It should be noted that the first loss function may also be any of the pairwise forms L_general^{s,t}, L_general^{s,s∪t}, or L_general^{t,s∪t}, and the second loss function may also be any of L_task^{s,t}, L_task^{s,s∪t}, or L_task^{t,s∪t}, and neither is limited herein.
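Under the same illustrative assumptions, combining the classification error with the two adaptation losses is a weighted sum; the default values of the preset constants below are placeholders, not values taken from the embodiment:

```python
def combined_loss(cls_loss, first_loss, second_loss, alpha=0.1, beta=0.1):
    """New-loss sketch: classification error plus the two adaptation
    losses weighted by the preset constants alpha and beta."""
    return cls_loss + alpha * first_loss + beta * second_loss
```

Tuning alpha and beta trades off how strongly the center-point consistency and the distribution alignment each constrain the model relative to the supervised classification error.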
According to the technical scheme provided by this embodiment, a first loss function and a second loss function are determined, and the first image classification model is trained in an unsupervised manner based on the first loss function and/or the second loss function; adjusting the first image classification model with the two loss functions together improves both the training precision and the training efficiency of the target image classification model.
Example four
Fig. 5 is a schematic structural diagram of an image classification apparatus provided in the fourth embodiment of the present invention, the apparatus includes a to-be-processed image acquisition module 410 and an image type determination module 420, where:
a to-be-processed image obtaining module 410, configured to obtain an image to be processed, and input the image to a pre-trained target image classification model, where the target image classification model is obtained by performing unsupervised learning on a first sample with a standard type label and a second sample without the standard type label;
and an image type determining module 420, configured to determine, according to an output result of the target image classification model, an image type to which the image to be processed belongs.
Optionally, the apparatus further comprises:
the model establishing module is used for establishing a first image classification model according to the first sample with the standard type label;
the sample prediction module is used for predicting the second sample without the standard type label according to the first image classification model and generating a prediction type label of the second sample according to a prediction result;
a first loss function determination module for determining a first loss function based on the first sample, the standard type label of the first sample, the second sample, and the prediction type label of the second sample;
and the first model adjusting module is used for adjusting parameters of the first image classification model according to the first loss function to generate the target image classification model.
Optionally, the first loss function determining module includes:
the first type central point determining unit is used for determining a first type central point of each image type according to the first sample and a standard type label of the first sample;
the second-class central point determining unit is used for determining a second-class central point of each image type according to the second sample and the prediction type label of the second sample;
a third-class central point determining unit, configured to determine a third-class central point of each image type according to the first sample, the standard type label of the first sample, the second sample, and the prediction type label of the second sample;
a first loss function determining unit, configured to determine a first loss function according to at least two of the first class center point, the second class center point, and the third class center point of each image type.
Optionally, the class center point determining unit is configured to:
determining the image type of the first sample and/or the second sample according to the standard type label or the prediction type label;
for the first sample and/or the second sample of any image type, extracting a feature vector of the first sample and/or the second sample, and determining the position of the first sample and/or the position of the second sample according to the feature vector and a preset function;
determining the first class of center points, the second class of center points, or the third class of center points according to the position of the first sample and/or the position of the second sample.
Optionally, the first loss function determining unit is configured to:
for any image type, determining class center point difference according to at least two items of the first class center point, the second class center point and the third class center point;
the first loss function is determined from the mean of the class midpoint differences for each image type.
Optionally, the apparatus further comprises:
the first distribution probability determining module is used for determining first distribution probabilities of the test sample in various image types according to the first type central point of each image type;
the second distribution probability determining module is used for determining second distribution probabilities of the test sample in various image types according to the second type central point of each image type;
the third distribution probability determining module is used for determining the third distribution probability of the test sample in various image types according to the third class central point of each image type;
a second loss function determination module, configured to determine a second loss function according to at least two of the first distribution probability, the second distribution probability, and the third distribution probability of the test sample in each image type;
correspondingly, the model adjusting module is configured to adjust parameters of the first image classification model according to the first loss function and/or the second loss function, so as to generate the target image classification model.
Optionally, the distribution probability determining module is configured to:
extracting a feature vector of the test sample;
determining a first distance, a second distance or a third distance between a feature vector of the test sample and the first type center point, the second type center point or the third type center point of each image type;
determining a first probability, a second probability or a third probability of the test sample belonging to each image type according to the first distance, the second distance or the third distance, wherein the first probability is negatively correlated with the first distance, the second probability is negatively correlated with the second distance, and the third probability is negatively correlated with the third distance;
determining the first, second or third distribution probabilities according to the first, second or third probabilities that the test sample belongs to the each image type.
Optionally, the second loss function determining module is configured to:
determining a probability distribution difference of the test sample according to at least two of the first distribution probability, the second distribution probability and the third distribution probability;
determining the second loss function according to the probability distribution difference of the test sample.
Optionally, the apparatus further comprises:
the prediction type label updating module is used for carrying out re-prediction on the second sample according to the adjusted first image classification model after the parameters of the first image classification model are adjusted according to the loss function, and updating the prediction type label of the second sample;
a loss function updating module, configured to determine a new loss function according to the first sample, the standard type label of the first sample, the second sample, and the updated prediction type label of the second sample;
and the second model adjusting module is used for iteratively adjusting the parameters of the first image classification model according to the new loss function until the new loss function is smaller than a preset threshold value.
The image classification device provided by the embodiment of the invention can execute the image classification method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the image classification method.
EXAMPLE five
Fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. FIG. 6 illustrates a block diagram of an electronic device 512 that is suitable for use in implementing embodiments of the present invention. The electronic device 512 shown in fig. 6 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention. Device 512 is typically an electronic device that undertakes image classification functions.
As shown in fig. 6, the electronic device 512 is in the form of a general purpose computing device. Components of the electronic device 512 may include, but are not limited to: one or more processors 516, a storage device 528, and a bus 518 that couples the various system components including the storage device 528 and the processors 516.
Bus 518 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 512 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by electronic device 512 and includes both volatile and nonvolatile media, removable and non-removable media.
Storage 528 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) 530 and/or cache Memory 532. The electronic device 512 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 534 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, and commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk-Read Only Memory (CD-ROM), a Digital Video disk (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 518 through one or more data media interfaces. Storage 528 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
Program 536 having a set (at least one) of program modules 526 may be stored, for example, in storage 528, such program modules 526 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination may include an implementation of a network environment. Program modules 526 generally perform the functions and/or methodologies of the described embodiments of the invention.
The electronic device 512 may also communicate with one or more external devices 514 (e.g., keyboard, pointing device, camera, display 524, etc.), with one or more devices that enable a user to interact with the electronic device 512, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 512 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 522. Also, the electronic device 512 may communicate with one or more networks (e.g., a Local Area Network (LAN), Wide Area Network (WAN), and/or a public Network such as the internet) via the Network adapter 520. As shown, the network adapter 520 communicates with the other modules of the electronic device 512 via the bus 518. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 512, including but not limited to: microcode, device drivers, Redundant processing units, external disk drive Arrays, disk array (RAID) systems, tape drives, and data backup storage systems, to name a few.
The processor 516 executes various functional applications and data processing by executing programs stored in the storage device 528, for example, to implement the image classification method provided by the above-described embodiment of the present invention.
EXAMPLE six
The sixth embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the image classification method provided by the embodiments of the present invention.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the method operations described above, and may also execute the image classification method provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. An image classification method, comprising:
acquiring an image to be processed, and inputting the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training a first sample with a standard type label and a second sample without the standard type label in an unsupervised learning manner;
and determining the image type of the image to be processed according to the output result of the target image classification model.
2. The method of claim 1, wherein the training method of the target image classification model comprises:
establishing a first image classification model according to the first sample with the standard type label;
predicting the second sample without the standard type label according to the first image classification model, and generating a prediction type label of the second sample according to a prediction result;
determining a first loss function according to the first sample, the standard type label of the first sample, the second sample, and the prediction type label of the second sample;
and adjusting parameters of the first image classification model according to the first loss function to generate the target image classification model.
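The training scheme of claim 2 could be sketched in Python with a nearest-centroid classifier standing in for the first image classification model; the model form, function names and data below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

# A minimal sketch of one pass of the claimed training scheme, using a
# nearest-centroid classifier as a stand-in "first image classification
# model". All names and the model form are illustrative assumptions.

def nearest_centroid_fit(features, labels, num_types):
    # Step 1: build the first model from the labeled first samples
    # (here: one mean feature vector per image type).
    return np.stack([features[labels == t].mean(axis=0)
                     for t in range(num_types)])

def nearest_centroid_predict(centroids, features):
    # Step 2: generate prediction type labels for the unlabeled second
    # samples; each sample gets the type of its closest centroid.
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :],
                           axis=2)
    return dists.argmin(axis=1)
```

A loss computed over both the labeled and pseudo-labeled samples (claims 3 to 8) would then drive the parameter adjustment that yields the target model.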
3. The method of claim 2, wherein determining a first loss function according to the first sample, the standard type label of the first sample, the second sample, and the prediction type label of the second sample comprises:
determining a first class center point of each image type according to the first sample and the standard type label of the first sample;
determining a second class center point of each image type according to the second sample and the prediction type label of the second sample;
determining a third class center point of each image type according to the first sample, the standard type label of the first sample, the second sample and the prediction type label of the second sample;
determining a first loss function according to at least two of the first class center point, the second class center point, and the third class center point of each image type.
4. The method according to claim 3, wherein the determining of the first class center point, the second class center point or the third class center point comprises:
determining an image type of the first sample and/or the second sample according to the standard type label or the prediction type label;
for the first sample and/or the second sample of any image type, extracting a feature vector of the first sample and/or the second sample, and determining the position of the first sample and/or the position of the second sample according to the feature vector and a preset function;
determining the first class center point, the second class center point or the third class center point according to the position of the first sample and/or the position of the second sample.
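One possible reading of claim 4 is a per-type mean of sample feature vectors; the identity mapping is assumed here for the "preset function", and all names are illustrative:

```python
import numpy as np

def class_center_points(features, labels, num_types):
    """Per-image-type center point as the mean feature vector of the
    samples carrying that (standard or predicted) type label.
    The patent does not fix the 'preset function' mapping features to
    positions; the identity mapping is an assumption made here."""
    centers = np.zeros((num_types, features.shape[1]))
    for t in range(num_types):
        mask = labels == t
        if mask.any():
            centers[t] = features[mask].mean(axis=0)
    return centers
```

Fed with labeled features only, pseudo-labeled features only, or their union, the same routine would produce the first, second or third class center point respectively.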
5. The method of claim 3, wherein determining a first loss function according to at least two of the first class center point, the second class center point and the third class center point of each image type comprises:
for any image type, determining a class center point difference according to at least two of the first class center point, the second class center point and the third class center point;
determining the first loss function according to the mean of the class center point differences of the image types.
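Under the assumption of Euclidean distance (the claim does not prescribe the difference measure), the first loss of claim 5 could be sketched as:

```python
import numpy as np

def center_point_loss(centers_a, centers_b):
    """First-loss sketch: per image type, take the difference between
    two of the class center points (e.g. labeled-only vs predicted-only),
    then average over the image types. Euclidean distance is an assumed
    choice, not mandated by the claim."""
    diffs = np.linalg.norm(centers_a - centers_b, axis=1)
    return diffs.mean()
```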
6. The method of claim 3, further comprising:
determining a first distribution probability of the test sample in various image types according to the first class center point of each image type;
determining a second distribution probability of the test sample in various image types according to the second class center point of each image type;
determining a third distribution probability of the test sample in various image types according to the third class center point of each image type;
determining a second loss function according to at least two of the first distribution probability, the second distribution probability and the third distribution probability of the test sample in various image types;
correspondingly, adjusting parameters of the first image classification model according to the first loss function to generate the target image classification model includes:
and adjusting parameters of the first image classification model according to the first loss function and/or the second loss function to generate the target image classification model.
7. The method of claim 6, wherein the determining of the first distribution probability, the second distribution probability, or the third distribution probability comprises:
extracting a feature vector of the test sample;
determining a first distance, a second distance or a third distance between the feature vector of the test sample and the first class center point, the second class center point or the third class center point of each image type;
determining a first probability, a second probability or a third probability of the test sample belonging to each image type according to the first distance, the second distance or the third distance, wherein the first probability is negatively correlated with the first distance, the second probability is negatively correlated with the second distance, and the third probability is negatively correlated with the third distance;
determining the first distribution probability, the second distribution probability or the third distribution probability according to the first probability, the second probability or the third probability that the test sample belongs to each image type.
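Claim 7's negative correlation between probability and distance admits, for example, a softmax over negative distances; this specific mapping is an assumption, since the claim only requires that probability decrease as distance grows:

```python
import numpy as np

def distribution_probability(test_feature, centers):
    """Probability of the test sample belonging to each image type,
    negatively correlated with its distance to each class center point.
    A softmax over negative Euclidean distances is one common choice;
    the claim does not prescribe this exact mapping."""
    dists = np.linalg.norm(centers - test_feature, axis=1)
    logits = -dists
    exp = np.exp(logits - logits.max())  # subtract max for stability
    return exp / exp.sum()
```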
8. The method of claim 6, wherein determining a second loss function according to at least two of the first distribution probability, the second distribution probability and the third distribution probability of the test sample in various image types comprises:
determining a probability distribution difference of the test sample according to at least two of the first distribution probability, the second distribution probability and the third distribution probability;
determining the second loss function according to the probability distribution difference of the test sample.
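The "probability distribution difference" of claim 8 could be any divergence between two of the distribution probabilities; a symmetric KL divergence is assumed here purely for illustration:

```python
import numpy as np

def distribution_loss(p_a, p_b, eps=1e-12):
    """Second-loss sketch: a divergence between two of the test sample's
    distribution probabilities. The symmetric KL form is an assumption;
    the claim only requires some probability distribution difference."""
    p_a = np.clip(p_a, eps, 1.0)
    p_b = np.clip(p_b, eps, 1.0)
    kl_ab = np.sum(p_a * np.log(p_a / p_b))
    kl_ba = np.sum(p_b * np.log(p_b / p_a))
    return 0.5 * (kl_ab + kl_ba)
```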
9. The method of claim 2, further comprising, after adjusting the parameters of the first image classification model according to the first loss function:
re-predicting the second sample according to the adjusted first image classification model, and updating the prediction type label of the second sample;
determining a new loss function according to the first sample, the standard type label of the first sample, the second sample and the updated prediction type label of the second sample;
and iteratively adjusting parameters of the first image classification model according to the new loss function until the new loss function is smaller than a preset threshold value.
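The iteration of claim 9 could be sketched with a hypothetical model interface (the predict/loss/step methods are assumed names, not the patent's API):

```python
def train_target_model(model, labeled, labels, unlabeled,
                       threshold, max_iters=100):
    """Iterative scheme of claims 2 and 9, sketched with a hypothetical
    model object: re-predict pseudo-labels with the adjusted model and
    repeat until the new loss drops below the preset threshold."""
    pseudo = model.predict(unlabeled)
    for _ in range(max_iters):
        loss = model.loss(labeled, labels, unlabeled, pseudo)
        if loss < threshold:
            break
        model.step(loss)                   # adjust model parameters
        pseudo = model.predict(unlabeled)  # update prediction type labels
    return model
```

The `max_iters` cap is an added safeguard against non-convergence; the claim itself only states the threshold condition.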
10. An image classification apparatus, comprising:
a to-be-processed image acquisition module, configured to acquire an image to be processed and input the image into a pre-trained target image classification model, wherein the target image classification model is obtained by training a first sample with a standard type label and a second sample without the standard type label in an unsupervised learning manner;
and the image type determining module is used for determining the image type of the image to be processed according to the output result of the target image classification model.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the image classification method according to any one of claims 1 to 9 when executing the program.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the image classification method according to any one of claims 1 to 9.
CN201910295876.6A 2019-04-12 2019-04-12 Image classification method and device, electronic equipment and storage medium Pending CN111753863A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910295876.6A CN111753863A (en) 2019-04-12 2019-04-12 Image classification method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111753863A true CN111753863A (en) 2020-10-09

Family

ID=72672670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910295876.6A Pending CN111753863A (en) 2019-04-12 2019-04-12 Image classification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111753863A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104408469A (en) * 2014-11-28 2015-03-11 武汉大学 Firework identification method and firework identification system based on deep learning of image
CN108197666A (en) * 2018-01-30 2018-06-22 咪咕文化科技有限公司 A kind of processing method, device and the storage medium of image classification model
US10025950B1 (en) * 2017-09-17 2018-07-17 Everalbum, Inc Systems and methods for image recognition
CN108416370A (en) * 2018-02-07 2018-08-17 深圳大学 Image classification method, device based on semi-supervised deep learning and storage medium
CN109101602A (en) * 2018-08-01 2018-12-28 腾讯科技(深圳)有限公司 Image encrypting algorithm training method, image search method, equipment and storage medium
CN109241816A (en) * 2018-07-02 2019-01-18 北京交通大学 It is a kind of based on label optimization image identifying system and loss function determine method again


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Xiuxin; LING Zhigang; ZOU Wen: "Semi-supervised hyperspectral image classification based on convolutional neural network", Journal of Electronic Measurement and Instrumentation, no. 10 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112463844B (en) * 2020-12-15 2022-10-18 北京百奥智汇科技有限公司 Data processing method and device, electronic equipment and storage medium
CN112463844A (en) * 2020-12-15 2021-03-09 北京百奥智汇科技有限公司 Data processing method and device, electronic equipment and storage medium
CN112884060A (en) * 2021-03-09 2021-06-01 联仁健康医疗大数据科技股份有限公司 Image annotation method and device, electronic equipment and storage medium
CN112884060B (en) * 2021-03-09 2024-04-26 联仁健康医疗大数据科技股份有限公司 Image labeling method, device, electronic equipment and storage medium
CN112733970A (en) * 2021-03-31 2021-04-30 腾讯科技(深圳)有限公司 Image classification model processing method, image classification method and device
CN112733970B (en) * 2021-03-31 2021-06-18 腾讯科技(深圳)有限公司 Image classification model processing method, image classification method and device
CN113177602B (en) * 2021-05-11 2023-05-26 上海交通大学 Image classification method, device, electronic equipment and storage medium
CN113177602A (en) * 2021-05-11 2021-07-27 上海交通大学 Image classification method and device, electronic equipment and storage medium
CN113222050A (en) * 2021-05-26 2021-08-06 北京有竹居网络技术有限公司 Image classification method and device, readable medium and electronic equipment
CN113222050B (en) * 2021-05-26 2024-05-03 北京有竹居网络技术有限公司 Image classification method and device, readable medium and electronic equipment
WO2023137911A1 (en) * 2022-01-21 2023-07-27 平安科技(深圳)有限公司 Intention classification method and apparatus based on small-sample corpus, and computer device
CN114443896A (en) * 2022-01-25 2022-05-06 百度在线网络技术(北京)有限公司 Data processing method and method for training a predictive model
CN114443896B (en) * 2022-01-25 2023-09-15 百度在线网络技术(北京)有限公司 Data processing method and method for training predictive model
CN114743043A (en) * 2022-03-15 2022-07-12 北京迈格威科技有限公司 Image classification method, electronic device, storage medium and program product
CN114743043B (en) * 2022-03-15 2024-04-26 北京迈格威科技有限公司 Image classification method, electronic device, storage medium and program product

Similar Documents

Publication Publication Date Title
CN111753863A (en) Image classification method and device, electronic equipment and storage medium
US10936911B2 (en) Logo detection
CN108280477B (en) Method and apparatus for clustering images
CN108229419B (en) Method and apparatus for clustering images
CN110472675B (en) Image classification method, image classification device, storage medium and electronic equipment
CN108830329B (en) Picture processing method and device
CN109697451B (en) Similar image clustering method and device, storage medium and electronic equipment
CN111414946B (en) Artificial intelligence-based medical image noise data identification method and related device
CN113128478B (en) Model training method, pedestrian analysis method, device, equipment and storage medium
CN111027605A (en) Fine-grained image recognition method and device based on deep learning
CN111368878B (en) Optimization method based on SSD target detection, computer equipment and medium
CN111428557A (en) Method and device for automatically checking handwritten signature based on neural network model
CN113361593B (en) Method for generating image classification model, road side equipment and cloud control platform
US20200082213A1 (en) Sample processing method and device
CN112149754B (en) Information classification method, device, equipment and storage medium
CN113128536A (en) Unsupervised learning method, system, computer device and readable storage medium
CN112883990A (en) Data classification method and device, computer storage medium and electronic equipment
CN108229680B (en) Neural network system, remote sensing image recognition method, device, equipment and medium
CN111062440B (en) Sample selection method, device, equipment and storage medium
CN111340213B (en) Neural network training method, electronic device, and storage medium
CN110135428B (en) Image segmentation processing method and device
CN110490058B (en) Training method, device and system of pedestrian detection model and computer readable medium
CN113239883A (en) Method and device for training classification model, electronic equipment and storage medium
CN113902944A (en) Model training and scene recognition method, device, equipment and medium
CN112818946A (en) Training of age identification model, age identification method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination