CN109376786A - Image classification method, apparatus, terminal device, and readable storage medium - Google Patents
Image classification method, apparatus, terminal device, and readable storage medium Download PDF Info
- Publication number
- CN109376786A (application CN201811284267.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- activation value
- classification
- sample
- class
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The present invention relates to the technical field of image processing and provides an image classification method, apparatus, terminal device, and readable storage medium. The method comprises: training a deep convolutional neural network on known-class images to obtain a trained network model; building a probability distribution model for each class of samples in the known-class images according to the trained network model; revising the activation values of the known-class images according to the probability distribution models; obtaining the activation value of the unknown class from the activation values of the known-class images; and classifying images according to the activation values of the known-class images and the activation value of the unknown class. The invention enables reasonable and accurate classification, in practical applications, of images that fall outside the classes of the training set of known-class images.
Description
Technical field
The invention belongs to the technical field of image processing, and in particular relates to an image classification method, apparatus, terminal device, and readable storage medium.
Background art
Image classification extracts, processes, and analyzes image features by computer in order to recognize different targets and objects, and assigns images to classes according to their distinguishing characteristics.
At present, algorithms based on deep neural networks classify known image data with a trained image classification model, where the model is generated from training image data and test image data drawn from the same class space; alternatively, the class of an unknown image is judged from the activation values of correctly classified samples of some known class. Neither approach, however, can reasonably classify images of unknown classes that fall outside the training set of known image classes, and both therefore suffer from inaccurate image classification.
Summary of the invention
In view of this, embodiments of the present invention provide an image classification method, apparatus, terminal device, and readable storage medium, to solve the prior-art defect that images of unknown classes outside the training set of known image classes cannot be reasonably classified, resulting in inaccurate image classification.
A first aspect of the embodiments of the present invention provides an image classification method, comprising:
training a deep convolutional neural network on known-class images to obtain a trained network model;
building a probability distribution model for each class of samples in the known-class images according to the trained network model;
revising the activation values of the known-class images according to the probability distribution models;
obtaining the activation value of the unknown class from the activation values of the known-class images;
classifying images according to the activation values of the known-class images and the activation value of the unknown class.
In one embodiment, training the deep convolutional neural network on known-class images to obtain the trained network model comprises:
dividing the acquired known-class images into a training set and a test set;
training the deep convolutional neural network on the images of the training set, testing the classification performance of the network on the images of the test set, and outputting the network classification result;
supervising the network classification result through a loss function to obtain a supervision result;
adjusting the network parameters of the deep convolutional neural network according to the supervision result.
In one embodiment, building a probability distribution model for each class of samples in the known-class images according to the trained network model comprises:
obtaining the mean vector of each class of samples in the known-class images;
calculating the distance between each sample of a class and that class's mean vector in the known-class images;
selecting, according to the distances, a number of input samples from each class at a preset ratio;
estimating, from the selected input samples, the model parameters of the probability distribution model corresponding to the input samples' class.
In one embodiment, revising the activation values of the known-class images according to the probability distribution models comprises:
extracting, through the trained network model, the first activation values of the test samples in the known-class images;
selecting a preset number of sample classes from the test samples according to the first activation values;
calculating, from the probability distribution models corresponding to the preset number of sample classes and the first activation values of the test samples in those classes, the membership probability of each test sample in each of the preset number of sample classes;
revising the first activation values of the test samples in the preset number of sample classes according to the membership probabilities to obtain second activation values.
In one embodiment, obtaining the activation value of the unknown class from the activation values of the known-class images comprises:
calculating the activation value of the unknown class from the first activation values and the second activation values of the test samples in the selected preset number of sample classes, with the calculation formula:
â_unknown = Σ_{c=1}^{C} (a_c − â_c),
where a_c is the first activation value of the test sample for class c, â_c is the revised second activation value, C is the total number of classes of known-class images, and c is any class c among the total classes.
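By way of illustration (not part of the patent text), the formula above can be sketched in plain Python; the activation values below are invented:

```python
def unknown_activation(first, revised):
    """Unknown-class activation: sum over the known classes of the part of each
    first activation value (a_c) removed by the revision (a_hat_c)."""
    return sum(a - a_hat for a, a_hat in zip(first, revised))

# Hypothetical first and revised activation values for C = 3 known classes.
first = [2.0, 1.0, 3.0]
revised = [1.8, 0.9, 2.1]   # first values scaled down by membership probabilities
print(unknown_activation(first, revised))   # approximately 1.2
```

The unknown class thus accumulates exactly the confidence that the revision subtracted from the known classes.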
In one embodiment, classifying images according to the activation values of the known-class images and the activation value of the unknown class comprises:
normalizing the activation values of the known-class images and the activation value of the unknown class to obtain new activation values for the image;
selecting, from the new activation values, the candidate class label whose activation value is largest for the current test image;
judging whether the candidate class label corresponds to the unknown class label;
if so, refusing to recognize the current test image and determining that it is of undefined class;
if not, judging whether the activation value corresponding to the current test image is below a preset threshold;
if so, refusing to recognize the current test image and determining that it is of undefined class;
if not, determining that the current test image belongs to a known class, and classifying it into that known class.
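The decision rule above can be sketched as follows (an illustrative Python sketch; the softmax normalization, threshold value, and activations are assumptions, not specified by the patent):

```python
import math

def classify(new_activations, unknown_index, threshold):
    """Argmax over normalized activations, with two rejection checks:
    candidate label is the unknown class, or confidence is below threshold.
    Returns a class index, or None when recognition is refused (undefined class)."""
    exps = [math.exp(a) for a in new_activations]
    total = sum(exps)
    probs = [e / total for e in exps]                 # softmax normalization
    best = max(range(len(probs)), key=probs.__getitem__)
    if best == unknown_index:                         # unknown class label wins
        return None
    if probs[best] < threshold:                       # below the preset threshold
        return None
    return best

# Hypothetical activations: 3 known classes followed by the unknown-class slot.
print(classify([2.0, 0.5, 0.3, 0.1], unknown_index=3, threshold=0.5))  # 0
print(classify([0.2, 0.3, 0.1, 2.5], unknown_index=3, threshold=0.5))  # None
```

Both rejection branches return the same outcome (undefined class), matching the two "if so" steps above.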
A second aspect of the embodiments of the present invention provides an image classification apparatus, comprising:
a first model acquisition unit for training a deep convolutional neural network on known-class images to obtain a trained network model;
a second model acquisition unit for building a probability distribution model for each class of samples in the known-class images according to the trained network model;
a revision unit for revising the activation values of the known-class images according to the probability distribution models;
an activation value acquisition unit for obtaining the activation value of the unknown class from the activation values of the known-class images;
an image classification judgment unit for classifying images according to the activation values of the known-class images and the activation value of the unknown class.
In one embodiment, the first model acquisition unit comprises:
a data division module for dividing the acquired known-class images into a training set and a test set;
a first result generation module for training the deep convolutional neural network on the image data of the training set, testing the classification performance of the trained network on the image data of the test set, and outputting the network classification result;
a second result generation module for supervising the network classification result through a loss function to obtain a supervision result;
a parameter adjustment module for adjusting the network parameters of the deep convolutional neural network according to the supervision result.
A third aspect of the embodiments of the present invention provides a terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the image classification method described above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image classification method described above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects. A deep convolutional neural network is trained on known-class images to obtain a trained network model; a probability distribution model is built for each class of samples in the known-class images according to the trained network model; the activation values of the known-class images are revised according to the probability distribution models; the activation value of the unknown class is obtained from the activation values of the known-class images; and images are classified according to the activation values of the known-class images and the activation value of the unknown class. Training the deep convolutional neural network optimizes its recognition capability and improves the model's classification performance; building a probability distribution model for each class of samples and revising the activation values characterizes the image classes more faithfully, improves the accuracy and reasonableness of classification, and avoids misclassifying images of unknown classes. Reasonable classification of images outside the training-set classes of the known-class images is thus achieved in practical applications, with strong usability and practicability.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments or the prior-art description are briefly introduced below. Evidently, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the image classification method provided by Embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of loss-function supervision of the network provided by Embodiment 1 of the present invention;
Fig. 3 is a schematic flowchart of obtaining the trained network model provided by Embodiment 1 of the present invention;
Fig. 4 is a schematic flowchart of building the probability distribution models provided by Embodiment 1 of the present invention;
Fig. 5 is a schematic flowchart of revising the activation values provided by Embodiment 1 of the present invention;
Fig. 6 is a schematic flowchart of classifying images provided by Embodiment 1 of the present invention;
Fig. 7 is a schematic diagram of the image classification apparatus provided by Embodiment 2 of the present invention;
Fig. 8 is a schematic diagram of the terminal device provided by an embodiment of the present invention.
Detailed description of the embodiments
In the following description, specific details such as particular system structures and techniques are set forth for illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present invention. It will be clear to those skilled in the art, however, that the present invention may also be practiced in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, apparatuses, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the invention.
It should be understood that the term "comprising", when used in this specification and the appended claims, indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not preclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or sets thereof.
It should also be understood that the terminology used in this description of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in this description and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The term "comprising" and any variants thereof in the description, claims, and drawings above are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device comprising a series of steps or units is not limited to the listed steps or units, but optionally further comprises steps or units not listed, or optionally further comprises other steps or units inherent to such process, method, product, or device. In addition, the terms "first", "second", "third", and the like are used to distinguish different objects, not to describe a particular order.
It will be further understood that the term "and/or" as used in the description of the invention and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
The following specific embodiments illustrate the technical solutions of the present invention.
Referring to Fig. 1, a schematic flowchart of the image classification method provided by an embodiment of the present invention, the method is intended to solve the problem that current image classification methods cannot be directly extended to recognize image classes that did not appear during network-model training, and thus lack transferability and flexibility. By this method, images of classes that have already been discovered and recorded can be reasonably and correctly recognized and classified, while images of unknown classes can be screened out, meeting the practical demands of open-set image recognition scenarios. As shown in the figure, the method comprises the following steps:
Step S101: train the deep convolutional neural network on known-class images to obtain the trained network model.
In the embodiments of the present invention, the known-class images are pictures of fixed classes selected under the closed-set condition for training the deep convolutional neural network; relative to images under the open-set condition, their classes are the known classes, whereas images under the open-set condition also include images of unknown classes. The known-class images comprise training images and test images for the deep convolutional neural network, and the class space of the training images is identical to that of the test images. For example, the known-class images may comprise 500 images in 5 classes of 100 images each, of which 80 per class are training images and 20 are test images.
In addition, the classification base network is selected according to the number of classes in the dataset of known-class images and the complexity of the dataset; for example, the selected base network may be the residual network ResNet50, but is not limited to that network.
It should be noted that, during training of the deep convolutional neural network on known-class images, a supervision part is also provided for the output of the network model. The supervision part calculates the loss value of the deep convolutional neural network through a loss function and back-propagates it, realizing gradient back-propagation through the network model, so that the deep convolutional neural network's parameters are optimized and updated and its recognition capability in the training stage is improved. The loss function may be a superposition of several loss functions; for example, a cross-entropy loss function and a center loss function may jointly supervise the network classification result. Fig. 2 is a schematic diagram of such loss-function supervision: after an image is input into the deep convolutional neural network, the network model outputs a result, which is supervised by the cross-entropy loss function and the center loss function; gradients are propagated back from the supervision result, driving the update of the network model's parameters and improving the model's image recognition capability.
As one implementation of step S101, Fig. 3 is a schematic flowchart of obtaining the trained network model; training the deep convolutional neural network on known-class images to obtain the trained network model comprises:
Step S1011: divide the acquired known-class images into a training set and a test set.
Step S1012: train the deep convolutional neural network on the images of the training set, test the classification performance of the deep convolutional neural network on the images of the test set, and output the network classification result.
In this embodiment, the dataset of known-class images is compiled and, as a closed set, divided into a training set and a test set. For example, a dataset of 500 known-class images in 5 classes of 100 images each may be split so that 80 images per class belong to the training set and 20 to the test set; the test set and the training set share the same class space, each containing images of the 5 classes. The deep convolutional neural network is trained on the images of the training set, its classification performance is checked with the images of the test set, and the network classification result is output.
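A minimal sketch of the closed-set split described above, in plain Python; the image identifiers and the 80/20 ratio are illustrative:

```python
def split_dataset(images_by_class, train_ratio=0.8):
    """Split each class's image list at the same ratio, so training set and
    test set share the same class space."""
    train, test = {}, {}
    for label, images in images_by_class.items():
        cut = int(len(images) * train_ratio)
        train[label] = images[:cut]
        test[label] = images[cut:]
    return train, test

# Hypothetical dataset: 5 classes, 100 image identifiers each.
data = {c: [f"class{c}_img{i}" for i in range(100)] for c in range(5)}
train, test = split_dataset(data)
print(len(train[0]), len(test[0]))  # 80 20
```

Splitting per class (rather than over the pooled dataset) guarantees every class appears in both sets, as the embodiment requires.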
Step S1013: supervise the network classification result through the loss function to obtain the supervision result.
In this embodiment, the configured loss function may be a superposition of several loss functions, including the cross-entropy loss (Cross-Entropy Loss), expressed as:
L_cross_entropy = −Σ_{i=1}^{C} y_i · log P(y_i | x) (1),
where x is the image input to the network model, C is the total number of image classes in the dataset, y_i indicates whether the input image belongs to the i-th class (1 for belonging, 0 for not belonging), and P(y_i | x) is the probability that the input image belongs to the i-th class.
The configured loss function further includes the center loss (Center Loss), which supervises the network classification result and is expressed as:
L_center = 0.5 · ‖x − x_c‖ (2),
where x denotes the feature extracted from the input image by the deep convolutional neural network, and x_c denotes the central feature vector of the class to which the input image belongs.
During training of the deep convolutional neural network, the two loss functions above are combined so that both supervise the deep convolutional neural network jointly, yielding the total loss:
L_total = λ_cross_entropy · L_cross_entropy + λ_center · L_center (3),
where λ_cross_entropy and λ_center are the weights of the two loss functions; typically λ_cross_entropy is set to 1 and λ_center to 8×10⁻⁷.
Through the configured loss function, the loss value of the deep convolutional neural network is calculated and back-propagated, realizing gradient back-propagation through the deep convolutional neural network and producing the supervision result.
Step S1014: adjust the network parameters of the deep convolutional neural network according to the supervision result.
In this embodiment, the network parameters are hyperparameters set before the deep convolutional neural network learns, including but not limited to the learning rate and batch size of the deep convolutional neural network. Suitable hyperparameters are chosen, parameter data are obtained through training, and the supervision by the loss function continually updates and optimizes the configured hyperparameters of the deep convolutional neural network, so that better hyperparameters are selected and the network's image recognition and classification performance is improved.
Through this embodiment, several loss functions jointly supervise the network classification result: besides the cross-entropy loss function, a center loss function is also provided, which helps reduce the intra-class distance within each class of samples during training and increase the inter-class distance between samples of different classes, making the classification results more discriminative; the center loss function also facilitates the subsequent calculation of each class's sample-center vector. When combining the two loss functions, they are weighted in proportion, which improves the image classification accuracy of the trained network model.
Step S102: build a probability distribution model for each class of samples in the known-class images according to the trained network model.
In this embodiment, the distribution of each class of samples in the known-class images follows a certain probability. Through the trained network model, a corresponding probability distribution model is built for each class of samples; before a probability distribution model can be determined, the model parameters corresponding to that class must be calculated from the existing samples, which then determine the probability distribution to which the samples of each class belong, providing a probabilistic judgment of whether an image belongs to a known class.
By analyzing the sample distribution of each image class through the probability distribution models, the image classes can be characterized more faithfully, and whether an image belongs to some class can be decided probabilistically, improving the image classification accuracy of the trained network model.
As one implementation of step S102, Fig. 4 is a schematic flowchart of building the probability distribution models; building a probability distribution model for each class of samples in the known-class images according to the trained network model comprises:
Step S1021: obtain the mean vector of each class of samples in the known-class images.
In this embodiment, the correctly classified samples in the training set of known-class images are selected, and the activation value (activation) of each input image, i.e., each correctly classified sample, is calculated through the trained network model. The average of the activation values of each class of samples is computed to obtain that class's mean vector (mean vector), with the calculation formula:
u_c = (1/N_c) · Σ_{n=1}^{N_c} a_n^c (4),
where u_c is the mean vector of class-c samples, N_c is the number of correctly classified class-c samples, and a_n^c is the activation value of the n-th sample of class c.
Through the trained network model, the activation value of every correctly classified sample of each class is extracted, and each class's corresponding mean vector is calculated.
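Eq. (4) can be sketched as an element-wise average in plain Python; the activation vectors below are invented:

```python
def mean_vector(activations):
    """Eq. (4): element-wise average of the activation vectors of one class's
    correctly classified samples (N_c = len(activations))."""
    n = len(activations)
    return [sum(col) / n for col in zip(*activations)]

# Hypothetical activation vectors for N_c = 3 correctly classified class-c samples.
acts = [[1.0, 2.0, 3.0],
        [3.0, 2.0, 1.0],
        [2.0, 2.0, 2.0]]
print(mean_vector(acts))  # [2.0, 2.0, 2.0]
```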
Step S1022: calculate the distance between each sample of a class and that class's mean vector in the known-class images.
In this embodiment, in the training set of known-class images, for each correctly classified sample, the distance between the sample and the mean vector of its class is calculated. The distance may include the Euclidean distance (Euclidean distance) and the cosine distance (Cosine distance), without limitation here. Taking the Euclidean distance and the cosine distance as an example, the calculated Euclidean distance d_euclidean and cosine distance d_cosine are weighted and combined as the final distance, with the calculation formula:
d_total = λ_euclidean · d_euclidean + λ_cosine · d_cosine (5),
where λ_euclidean and λ_cosine are the weighting coefficients of the two distances; λ_euclidean is set to 1/200 and λ_cosine to 1.
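A sketch of Eq. (5) in plain Python, with the weights stated in the text; the input vectors are invented:

```python
import math

def combined_distance(x, u, w_euclidean=1/200, w_cosine=1.0):
    """Eq. (5): weighted sum of Euclidean and cosine distances between an
    activation vector x and its class mean vector u."""
    d_euc = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, u)))
    dot = sum(a * b for a, b in zip(x, u))
    norm_x = math.sqrt(sum(a * a for a in x))
    norm_u = math.sqrt(sum(b * b for b in u))
    d_cos = 1.0 - dot / (norm_x * norm_u)   # cosine distance = 1 - cosine similarity
    return w_euclidean * d_euc + w_cosine * d_cos

# Orthogonal unit vectors: Euclidean distance sqrt(2), cosine distance 1.
print(round(combined_distance([1.0, 0.0], [0.0, 1.0]), 4))
```

The 1/200 weight roughly balances the two terms, since Euclidean distances between activation vectors are typically orders of magnitude larger than cosine distances, which lie in [0, 2].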
Step S1023: select, according to the distances, a number of input samples from each class at a preset ratio.
In this embodiment, for the final distances calculated for each class of samples, following extreme value theory, the final distances are sorted by size and a number of samples are taken at a preset ratio; the selected samples are those whose final distance values rank at the top, i.e., the samples with the larger distances.
The number of samples taken at the preset ratio is calculated as:
N_df = λ · N_c (6),
where N_c is the number of correctly classified class-c samples, λ is the ratio at which samples are selected from the class-c samples, which may take values of 20% to 40%, and N_df is the number of class-c samples chosen as input samples.
The samples so selected from a class serve as the input samples of the probability distribution model, and the model parameters of the probability distribution model corresponding to that class are estimated from the input samples.
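Steps S1022–S1023 can be sketched together as a tail-selection routine in plain Python (the distance values are invented; Eq. (6) determines the count):

```python
def select_tail_samples(distances, ratio=0.3):
    """Eq. (6): keep N_df = ratio * N_c samples, taking those whose final
    distance ranks largest (the distribution tail, per extreme value theory).
    Returns the indices of the chosen samples, largest distance first."""
    n_df = round(ratio * len(distances))
    order = sorted(range(len(distances)), key=lambda i: distances[i], reverse=True)
    return order[:n_df]

# Hypothetical final distances for N_c = 10 correctly classified samples.
d = [0.1, 0.9, 0.3, 0.8, 0.2, 0.7, 0.4, 0.6, 0.5, 0.05]
print(select_tail_samples(d, ratio=0.3))  # the three largest distances: indices 1, 3, 5
```

Choosing the largest-distance samples targets exactly the boundary region of the class, which is what the tail of a Weibull fit is meant to model.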
Step S1024: estimate, from the selected input samples, the model parameters of the probability distribution model corresponding to the input samples' class.
In this embodiment, the probability distribution model may be a Weibull distribution model, expressed as:
w_n = 1 − γ · ((α − n)/α) · exp(−((a_test − τ_n)/λ_n)^{κ_n}) (7),
where w_n is the probability estimated by the Weibull distribution model for an input sample of a given class, a_test is the activation value extracted from the input sample, α is the number of sample classes with the larger activation values that are selected, n takes values in [1, α], γ is a coefficient controlling the magnitude of the Weibull distribution model's probability, and τ_n, κ_n, λ_n are the parameters of the n-th Weibull distribution model.
The selected input samples are substituted into expression (7); by controlling the proportion of samples taken from the tail of the Weibull distribution, the parameters of the Weibull distribution model for the corresponding sample class can be calculated, and each set of parameters determines the Weibull distribution model corresponding to one class of samples.
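As a sketch of how expression (7) can be evaluated once the Weibull parameters τ_n, κ_n, λ_n have been fitted (the parameter values below are invented, not fitted, and the clamping of the shifted activation at zero is an assumption):

```python
import math

def weibull_weight(a_test, n, alpha, tau, kappa, lam, gamma=1.0):
    """Evaluate expression (7) for the n-th selected class (1 <= n <= alpha).
    tau/kappa/lam are that class's fitted Weibull parameters; gamma scales
    the probability. Assumption: activations below tau are clamped to zero."""
    z = max(a_test - tau, 0.0) / lam            # shifted, scaled activation
    return 1.0 - gamma * ((alpha - n) / alpha) * math.exp(-(z ** kappa))

# Invented parameters, for illustration only.
w = weibull_weight(a_test=5.0, n=1, alpha=3, tau=2.0, kappa=1.5, lam=4.0)
print(round(w, 4))
```

The (α − n)/α factor damps the correction for lower-ranked classes, and at n = α the weight is exactly 1, i.e., the lowest-ranked selected class is left unrevised.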
Through this embodiment, when determining the probability distribution model of each class of samples, the mean vector of each class, i.e., its center vector, is first calculated; the distance between each sample of the class and the class's mean vector is calculated, and a proportion of the samples are chosen as the input samples from which the model parameters of the probability distribution model are estimated. When calculating the distance, the Euclidean distance and the cosine distance are combined; from the perspective of the samples' probability distribution, each class's corresponding probability model is estimated from the samples' distances to the corresponding center vector, improving the accuracy of the estimated probability distribution model. From the probability distribution models one can obtain whether an input test sample belongs to a given class and the probability with which it belongs to that class, improving both the performance and the correctness of image classification, with greater practicability.
Step S103: correct the activation values of the known class images according to the probability distribution model.
In the present embodiment, given the training set and test set of known class images, the activation value corresponding to a test image can be extracted by inputting the test image into the network training model; the extracted activation value is the output before processing by the normalization function.
The probability that a test image of a certain class belongs to that class is calculated by the probability distribution model, and this probability value is multiplied by the activation value of the test image to obtain the corrected activation value.
Specifically, as shown in the schematic flowchart of correcting the activation values in Fig. 5, correcting the activation values of the known class images according to the probability distribution model comprises:
Step S1031: extract, by the network training model, the first activation values of the test samples in the known class images.
In the present embodiment, test samples of different classes in the known class images are input into the network training model, and the first activation values a_test of the test samples of the different classes can be extracted; the first activation value is the activation value of a test sample before it is processed by the normalization function.
It should be noted that, in the present embodiment, the first activation value does not denote a single activation value, but the multiple activation values, not yet processed by the normalization function, obtained for the test samples of the different classes.
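To illustrate taking the first activation value before the normalization function, the toy linear "network" below stands in for the trained model; the layer sizes, weights, and function names are hypothetical, not part of the disclosure.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 8))   # 4 known classes, 8 features: hypothetical sizes
b = np.zeros(4)

def extract_first_activation(x):
    # the "first activation value" a_test: the layer output *before*
    # the normalization (softmax) function, as step S1031 specifies
    return W @ x + b

x = rng.normal(size=8)
a_test = extract_first_activation(x)
probs = softmax(a_test)   # what the normalized network output would be
```

With a real deep convolutional network, the same idea applies: the vector feeding the final softmax layer is read out instead of the softmax output.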
Step S1032: choose a preset number of sample classes from the test samples according to the first activation values.
In the present embodiment, according to extreme value estimation theory, the acquired first activation values of the multiple sample classes are sorted by magnitude, and a preset number of sample classes ranked first, i.e., those with the largest activation values, are chosen; for example, the number of chosen sample classes is α.
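Choosing the α sample classes with the largest first activation values is a simple ranking operation; a NumPy sketch with made-up values:

```python
import numpy as np

# made-up first activation values for five sample classes
a_test = np.array([2.1, -0.3, 4.7, 1.0, 3.5])
alpha = 3   # preset number of classes to keep

# sort descending and keep the alpha classes with the largest activations
top_alpha = np.argsort(a_test)[::-1][:alpha]
```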
Step S1033: calculate the belonging probabilities of the test samples over the preset number of sample classes, according to the probability distribution models corresponding to the preset number of sample classes and the first activation values of the test samples in the sample classes.
In the present embodiment, according to expression (7) of the probability distribution model determined in step S1024 and the larger sample activation values chosen within the sample classes, the belonging probability of a selected test sample with respect to a given class is calculated.
It should be noted that the probability control coefficient γ added to the probability distribution model reduces the probability value, which prevents the activation values of test samples of a particular class from taking up the main component, and avoids the problem of some samples being wrongly classified into the unknown image class.
Step S1034: correct the first activation values of the test samples in the preset number of sample classes according to the belonging probabilities to obtain the second activation values.
In the present embodiment, the belonging probability of a test sample is obtained from the probability distribution model, and the test sample activation values of the chosen sample classes with the larger activation values are corrected according to the belonging probability. The correction calculation formula is as follows:
where the former quantity is the first activation value of the test sample in a chosen sample class with a larger activation value, w_c is the belonging probability corresponding to the test sample, and the latter quantity is the second activation value of the test sample.
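A hedged sketch of this correction step: the belonging weight is taken here as one minus a γ-scaled Weibull CDF (the exact form of expression (7) is not reproduced in the text, so this form, the (τ, κ, λ) values, and γ = 0.5 are assumptions), and the second activation value is the product of that weight and the first activation value.

```python
import numpy as np

def weibull_cdf(x, tau, kappa, lam):
    # CDF of a shifted two-parameter Weibull; used here as the
    # assumed core of expression (7)
    z = np.maximum(x - tau, 0.0)
    return 1.0 - np.exp(-(z / lam) ** kappa)

gamma = 0.5                        # probability control coefficient (value assumed)
a_top = np.array([4.7, 3.5, 2.1])  # first activations of the chosen classes
params = [(0.0, 2.0, 3.0), (0.0, 1.5, 2.5), (0.0, 1.2, 2.0)]  # (tau, kappa, lam), assumed

# belonging weight w_c: gamma scales the Weibull probability down so that
# no single class dominates, matching the note about gamma in the text
w = np.array([1.0 - gamma * weibull_cdf(a, *p) for a, p in zip(a_top, params)])
a_second = w * a_top   # second activation = belonging weight x first activation
```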
Through this embodiment, by choosing the sample classes with the larger test sample activation values in the known class images and inputting the activation values of the test samples into the probability distribution model, the belonging probabilities of the test samples are calculated, and the original activation values of the test samples are corrected according to the belonging probabilities to obtain the corrected activation values. This better controls the magnitudes of the test sample activation values, prevents the sample activation values of a particular class from taking up the main component, avoids the problem of some samples being wrongly classified into the unknown image class, and improves the accuracy of image classification.
Step S104: obtain the activation value of the unknown class image according to the activation values of the known class image data.
In the present embodiment, for the α chosen sample classes, the activation value of the unknown class image is constructed from the first activation values and the second activation values of the test samples.
Specifically, obtaining the activation value of the unknown class image according to the activation values of the known class image data comprises:
calculating the activation value of the unknown class image according to the first activation values and the second activation values of the test samples in the sample classes, with the following calculation formula:
where the former quantity is the first activation value of a test sample, the latter quantity is the corrected second activation value, C is the total number of classes of the known class images, and c is any class c among the total classes.
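One common reading of this construction (the formula itself is not reproduced in the text, so this interpretation is an assumption) is that the unknown-class activation collects the activation mass removed from the chosen known classes by the correction:

```python
import numpy as np

a_first = np.array([4.7, 3.5, 2.1])    # first activations of the chosen classes
a_second = np.array([2.55, 2.0, 1.5])  # their corrected second activations

# assumed construction: the unknown-class activation is the total
# activation mass removed from the known classes by the correction
a_unknown = float(np.sum(a_first - a_second))
```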
Step S105: classify the image according to the activation values of the known class images and the activation value of the unknown class image.
In the present embodiment, by constructing the activation value of the unknown class image and combining it with the activation values of the known class images, a (C+1)-dimensional activation vector is formed, where C is the total number of classes under the closed-set condition of known class images; the (C+1)-dimensional activation values are then normalized, extending image classification under the closed-set condition to image classification under the open-set condition.
Specifically, as shown in the schematic flowchart of classifying the image in Fig. 6, classifying the image according to the activation values of the known class images and the activation value of the unknown class image comprises:
Step S1051: normalize the activation values of the known class images and the activation value of the unknown class image to obtain the new activation values of the image.
In the present embodiment, the corrected (C+1)-dimensional activation values are normalized using the softmax normalization function; that is, the activation values of the known class images and the activation value of the unknown class image are normalized, yielding new (C+1)-dimensional activation values in the range 0 to 1, which include the activation value of the unknown class image. The calculation formula of the new activation values is formula (10):
where the former quantity is the corrected activation value of the class-c image and p_c is the probability that the current test image after normalization belongs to the class-c image.
According to formula (10), the (C+1)-dimensional probability vector P_c can be obtained.
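Normalizing the (C+1)-dimensional activation vector with softmax can be sketched as follows; the activation values are made up for illustration:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())   # subtract max for numerical stability
    return e / e.sum()

a_known = np.array([2.55, 2.0, 1.5, 0.4])  # corrected activations, C = 4 known classes
a_unknown = 4.25                           # constructed unknown-class activation

# (C+1)-dimensional probability vector P_c; the last entry is the
# probability assigned to the unknown class
p = softmax(np.append(a_known, a_unknown))
```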
Step S1052: select, from the new activation values, the pending class label corresponding to the current test image with the maximum activation value.
In the present embodiment, the (C+1)-dimensional probability vector is indexed, the maximum activation value among the new activation values is extracted, the current test image corresponding to the maximum activation value is obtained, and the pending class label c* of the current test image is further obtained.
Step S1053: judge whether the pending class label corresponds to the unknown class label.
In the present embodiment, according to the obtained pending class label c* of the current test image, it is judged whether the pending class label c* equals the unknown class label C+1.
Step S1054: if so, refuse to identify the current test image, and determine that the current test image is of an undefined class.
In the present embodiment, if the pending class label c* equals the unknown class label C+1, identification of the current test image is refused and the current test image is determined to be of an undefined class, realizing the classification of images under the open-set image condition.
Step S1055: if not, judge whether the activation value corresponding to the current test image is less than a preset threshold.
In the present embodiment, a threshold for image recognition is set, the activation value of the current test image after normalization is extracted, and it is determined whether the activation value of the current test image is less than the preset image recognition threshold. The preset threshold may be a rejection threshold, i.e., the threshold at which the system refuses to identify the image; or it may be an acceptance threshold, i.e., the threshold at which the system accepts the image recognition result.
Step S1056: if so, refuse to identify the current test image, and determine that the current test image is of an undefined class.
In the present embodiment, if the activation value corresponding to the current test image is less than the preset threshold, identification of the current test image is refused and the current test image is determined to be of an undefined class, so that images outside the training set of known class images can also be reasonably and correctly classified.
Step S1057: if not, determine that the current test image belongs to a known class, and classify the current test image into a known class.
In the present embodiment, if the current test image does not belong to an undefined class, classification is performed on the known class images according to the closed-set image condition, and the current test image is determined to belong to the known class c*.
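The decision rule of steps S1052 to S1057 can be sketched as a small function; the threshold value and the return conventions are assumptions for illustration:

```python
import numpy as np

def classify(p, unknown_label, threshold=0.2):
    # pending class label c*: index of the maximum normalized activation
    c_star = int(np.argmax(p))
    if c_star == unknown_label:   # c* equals the unknown label C+1: reject
        return "undefined"
    if p[c_star] < threshold:     # below the preset threshold: reject
        return "undefined"
    return c_star                 # otherwise classify as the known class c*

p = np.array([0.05, 0.6, 0.1, 0.1, 0.15])  # made-up (C+1)-dim probabilities
label = classify(p, unknown_label=4)       # index 4 plays the role of C+1
```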
Through this embodiment, pictures of certain classes are chosen to train a deep convolutional neural network model under the closed-set image condition, the network parameters are updated, and an image classification network training model with good performance is obtained by choosing suitable hyperparameters. For each sample class under the closed-set image condition, the corresponding probability distribution model is estimated, reasonably characterizing the sample distribution probability of each class. The image activation values under the closed-set image condition are corrected through the sample distribution probabilities, the activation value of the unknown class is constructed, and all the activation values are normalized. The normalized activation values are then used to identify images under the open-set image condition: unknown class images are rejected and known class images are correctly classified, improving the reasonableness and accuracy of image classification, meeting the needs of practical application scenarios, and being more practical.
It should be noted that those skilled in the art can readily conceive of other sequencing schemes within the technical scope disclosed by the present invention, which should also fall within the protection scope of the present invention and will not be repeated here.
It should be understood that the serial numbers of the steps in the above embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Embodiment two
Referring to Fig. 7, which is a schematic diagram of the image classification apparatus provided by an embodiment of the present invention, only the parts related to the embodiment of the present invention are shown for ease of description.
The image classification apparatus includes:
a first model acquiring unit 71, configured to train a deep convolutional neural network with known class images to obtain a network training model;
a second model acquiring unit 72, configured to establish a probability distribution model for each class of samples in the known class images according to the network training model;
a correction unit 73, configured to correct the activation values of the known class images according to the probability distribution model;
an activation value acquiring unit 74, configured to obtain the activation value of the unknown class image according to the activation values of the known class image data;
an image classification judging unit 75, configured to classify the image according to the activation values of the known class images and the activation value of the unknown class image.
Optionally, the first model acquiring unit includes:
a data division module, configured to divide the acquired known class images into a training set and a test set;
a first result generation module, configured to train the deep convolutional neural network with the images of the training set, test the classification performance of the deep convolutional neural network with the images of the test set, and output a network classification result;
a second result generation module, configured to perform a supervision operation on the network classification result by a loss function to obtain a supervision operation result;
a parameter adjustment module, configured to adjust the network parameters of the deep convolutional neural network according to the supervision operation result.
Through this embodiment, pictures of certain classes are chosen to train a deep convolutional neural network model under the closed-set image condition, the network parameters are updated, and an image classification network training model with good performance is obtained by choosing suitable hyperparameters. For each sample class under the closed-set image condition, the corresponding probability distribution model is estimated, reasonably characterizing the sample distribution probability of each class. The image activation values under the closed-set image condition are corrected through the sample distribution probabilities, the activation value of the unknown class is constructed, and all the activation values are normalized. The normalized activation values are then used to identify images under the open-set image condition: unknown class images are rejected and known class images are correctly classified, improving the reasonableness and accuracy of image classification and meeting the needs of practical application scenarios.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional units or modules as needed, i.e., the internal structure of the mobile terminal may be divided into different functional units or modules to complete all or part of the functions described above. The functional modules in the embodiment may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working process of the modules in the above mobile terminal, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
Embodiment three
Fig. 8 is a schematic diagram of the terminal device provided by an embodiment of the present invention. As shown in Fig. 8, the terminal device 8 of this embodiment includes: a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80. When the processor 80 executes the computer program 82, the steps in each of the above image classification method embodiments are implemented, such as steps 101 to 105 shown in Fig. 1. Alternatively, when the processor 80 executes the computer program 82, the functions of the modules/units in each of the above apparatus embodiments are implemented, such as the functions of the modules 71 to 75 shown in Fig. 7.
Illustratively, the computer program 82 may be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to carry out the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program 82 in the terminal device 8. For example, the computer program 82 may be divided into a first model acquiring unit, a second model acquiring unit, a correction unit, an activation value acquiring unit, and an image classification judging unit, whose specific functions are as follows:
First model acquiring unit is obtained for being trained by known class image to depth convolutional neural networks
Network training model;
Second model acquiring unit, for according to the network training model to every one kind in the known class image
Sample establishes probability Distribution Model respectively;
Amending unit, for correcting the activation value of the known class image according to the probability Distribution Model;
Activation value acquiring unit, for obtaining unknown classification image according to the activation value of the known class image data
Activation value;
Image classification judging unit, for the activation value and the unknown classification image according to the known class image
Activation value, classify to image.
The terminal device 8 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art will understand that Fig. 8 is only an example of the terminal device 8 and does not constitute a limitation on the terminal device 8, which may include more or fewer components than illustrated, combine certain components, or have different components; for example, the terminal device may also include input and output devices, network access devices, buses, etc.
The so-called processor 80 may be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 81 may be an internal storage unit of the terminal device 8, such as a hard disk or memory of the terminal device 8. The memory 81 may also be an external storage device of the terminal device 8, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card) equipped on the terminal device 8. Further, the memory 81 may also include both the internal storage unit of the terminal device 8 and an external storage device. The memory 81 is used to store the computer program and other programs and data required by the terminal device. The memory 81 may also be used to temporarily store data that has been output or will be output.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the division into the above functional units and modules is merely illustrative; in practical applications, the above functions may be allocated to different functional units or modules as needed, i.e., the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiment may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of the present invention. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
In the above embodiments, the description of each embodiment has its own emphasis; for parts that are not detailed or described in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art may realize that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the apparatus/terminal device embodiments described above are merely schematic; for example, the division of the modules or units is only a logical function division, and there may be other division manners in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, certain intermediate forms, etc. The computer-readable medium may include: any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in certain jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of the technical features thereof may be equivalently replaced; such modifications or replacements do not depart the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.
Claims (10)
1. An image classification method, characterized by comprising:
training a deep convolutional neural network with known class images to obtain a network training model;
establishing a probability distribution model for each class of samples in the known class images according to the network training model;
correcting the activation values of the known class images according to the probability distribution model;
obtaining the activation value of an unknown class image according to the activation values of the known class image data;
classifying an image according to the activation values of the known class images and the activation value of the unknown class image.
2. The image classification method according to claim 1, characterized in that training the deep convolutional neural network with the known class images to obtain the network training model comprises:
dividing the acquired known class images into a training set and a test set;
training the deep convolutional neural network with the images of the training set, testing the classification performance of the deep convolutional neural network with the images of the test set, and outputting a network classification result;
performing a supervision operation on the network classification result by a loss function to obtain a supervision operation result;
adjusting the network parameters of the deep convolutional neural network according to the supervision operation result.
3. The image classification method according to claim 1, characterized in that establishing the probability distribution model for each class of samples in the known class images according to the network training model comprises:
obtaining the mean vector of each class of samples in the known class images;
calculating the distances between each class of samples in the known class images and the mean vector;
choosing, according to the distances, several input samples from each class of samples by a preset proportion;
estimating, from the several input samples, the model parameters of the probability distribution model corresponding to the input sample class.
4. The image classification method according to claim 1, characterized in that correcting the activation values of the known class images according to the probability distribution model comprises:
extracting, by the network training model, the first activation values of test samples in the known class images;
choosing a preset number of sample classes from the test samples according to the first activation values;
calculating the belonging probabilities of the test samples over the preset number of sample classes, according to the probability distribution models corresponding to the preset number of sample classes and the first activation values of the test samples in the sample classes;
correcting the first activation values of the test samples in the preset number of sample classes according to the belonging probabilities to obtain second activation values.
5. The image classification method according to claim 4, characterized in that obtaining the activation value of the unknown class image according to the activation values of the known class image data comprises:
calculating the activation value of the unknown class image according to the first activation values and the second activation values of the test samples in the chosen preset number of sample classes, with the following calculation formula:
where the former quantity is the first activation value of a test sample, the latter quantity is the corrected second activation value, C is the total number of classes of the known class images, and c is any class c among the total classes.
6. The image classification method according to claim 1, characterized in that classifying the image according to the activation values of the known class images and the activation value of the unknown class image comprises:
normalizing the activation values of the known class images and the activation value of the unknown class image to obtain the new activation values of the image;
selecting, from the new activation values, the pending class label corresponding to the current test image with the maximum activation value;
judging whether the pending class label corresponds to the unknown class label;
if so, refusing to identify the current test image, and determining that the current test image is of an undefined class;
if not, judging whether the activation value corresponding to the current test image is less than a preset threshold;
if so, refusing to identify the current test image, and determining that the current test image is of an undefined class;
if not, determining that the current test image belongs to a known class, and classifying the current test image into a known class.
7. An image classification apparatus, characterized by comprising:
a first model acquiring unit, configured to train a deep convolutional neural network with known class images to obtain a network training model;
a second model acquiring unit, configured to establish a probability distribution model for each class of samples in the known class images according to the network training model;
a correction unit, configured to correct the activation values of the known class images according to the probability distribution model;
an activation value acquiring unit, configured to obtain the activation value of an unknown class image according to the activation values of the known class image data;
an image classification judging unit, configured to classify an image according to the activation values of the known class images and the activation value of the unknown class image.
8. The image classification apparatus of claim 7, characterized in that the first model acquiring unit comprises:
a data division module, configured to divide the acquired known-class images into a training set and a test set;
a first result generation module, configured to train the deep convolutional neural network with the images of the training set, test the classification performance of the deep convolutional neural network with the images of the test set, and output a network classification result;
a second result generation module, configured to perform a supervision operation on the network classification result through a loss function to obtain a supervision operation result;
a parameter adjustment module, configured to adjust the network parameters of the deep convolutional neural network according to the supervision operation result.
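The four modules of claim 8 form a standard supervised training loop: split the data, run a forward pass, supervise the output with a loss function, and adjust parameters from the resulting gradient. A runnable sketch follows; the patent trains a deep convolutional neural network, but a linear softmax classifier on toy features is substituted here only to keep the example self-contained, and all data and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature vectors stand in for known-class images (illustrative data).
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Data division module: split the known-class samples into training/test sets.
X_train, y_train = X[:160], y[:160]
X_test, y_test = X[160:], y[160:]

W = np.zeros((8, 2))
for _ in range(300):
    # First result-generation module: forward pass yields the network
    # classification result on the training set.
    logits = X_train @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    # Second result-generation module: cross-entropy loss supervises that
    # result; its gradient is the supervision operation result.
    grad = probs
    grad[np.arange(len(y_train)), y_train] -= 1.0
    # Parameter adjustment module: update parameters from the gradient.
    W -= 0.1 * (X_train.T @ grad) / len(X_train)

# Classification-performance test on the held-out test set.
accuracy = float((np.argmax(X_test @ W, axis=1) == y_test).mean())
```

The same four-stage structure carries over unchanged when `W` is replaced by the convolutional network's parameters and the update is delegated to a framework optimizer.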
9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 6.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811284267.2A CN109376786A (en) | 2018-10-31 | 2018-10-31 | Image classification method and apparatus, terminal device, and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109376786A true CN109376786A (en) | 2019-02-22 |
Family
ID=65390741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811284267.2A Pending CN109376786A (en) | 2018-10-31 | 2018-10-31 | Image classification method and apparatus, terminal device, and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109376786A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107229942A (en) * | 2017-04-16 | 2017-10-03 | 北京工业大学 | A kind of convolutional neural networks rapid classification method based on multiple graders |
CN107506799A (en) * | 2017-09-01 | 2017-12-22 | 北京大学 | A kind of opener classification based on deep neural network is excavated and extended method and device |
CN107622272A (en) * | 2016-07-13 | 2018-01-23 | 华为技术有限公司 | A kind of image classification method and device |
CN107895170A (en) * | 2017-10-31 | 2018-04-10 | 天津大学 | A kind of Dropout regularization methods based on activation value sensitiveness |
CN107967484A (en) * | 2017-11-14 | 2018-04-27 | 中国计量大学 | A kind of image classification method based on multiresolution |
CN108596258A (en) * | 2018-04-27 | 2018-09-28 | 南京邮电大学 | A kind of image classification method based on convolutional neural networks random pool |
CN108710831A (en) * | 2018-04-24 | 2018-10-26 | 华南理工大学 | A kind of small data set face recognition algorithms based on machine vision |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109977781A (en) * | 2019-02-26 | 2019-07-05 | 上海上湖信息技术有限公司 | Method for detecting human face and device, readable storage medium storing program for executing |
CN109919241A (en) * | 2019-03-15 | 2019-06-21 | 中国人民解放军国防科技大学 | Hyperspectral unknown class target detection method based on probability model and deep learning |
CN109919241B (en) * | 2019-03-15 | 2020-09-29 | 中国人民解放军国防科技大学 | Hyperspectral unknown class target detection method based on probability model and deep learning |
WO2020191988A1 (en) * | 2019-03-23 | 2020-10-01 | 南京智慧光信息科技研究院有限公司 | New category identification method and robot system based on fuzzy theory and deep learning |
CN110147456A (en) * | 2019-04-12 | 2019-08-20 | 中国科学院深圳先进技术研究院 | A kind of image classification method, device, readable storage medium storing program for executing and terminal device |
CN110147456B (en) * | 2019-04-12 | 2023-01-24 | 中国科学院深圳先进技术研究院 | Image classification method and device, readable storage medium and terminal equipment |
CN110059754A (en) * | 2019-04-22 | 2019-07-26 | 厦门大学 | A kind of batch data steganography method, terminal device and storage medium |
CN110135505B (en) * | 2019-05-20 | 2021-09-17 | 北京达佳互联信息技术有限公司 | Image classification method and device, computer equipment and computer readable storage medium |
CN110135505A (en) * | 2019-05-20 | 2019-08-16 | 北京达佳互联信息技术有限公司 | Image classification method, device, computer equipment and computer readable storage medium |
CN110222704B (en) * | 2019-06-12 | 2022-04-01 | 北京邮电大学 | Weak supervision target detection method and device |
CN110222704A (en) * | 2019-06-12 | 2019-09-10 | 北京邮电大学 | A kind of Weakly supervised object detection method and device |
CN110472675A (en) * | 2019-07-31 | 2019-11-19 | Oppo广东移动通信有限公司 | Image classification method, image classification device, storage medium and electronic equipment |
CN110472681A (en) * | 2019-08-09 | 2019-11-19 | 北京市商汤科技开发有限公司 | The neural metwork training scheme and image procossing scheme of knowledge based distillation |
CN110567967B (en) * | 2019-08-20 | 2022-06-17 | 武汉精立电子技术有限公司 | Display panel detection method, system, terminal device and computer readable medium |
CN110567967A (en) * | 2019-08-20 | 2019-12-13 | 武汉精立电子技术有限公司 | Display panel detection method, system, terminal device and computer readable medium |
CN110751675A (en) * | 2019-09-03 | 2020-02-04 | 平安科技(深圳)有限公司 | Urban pet activity track monitoring method based on image recognition and related equipment |
CN110751675B (en) * | 2019-09-03 | 2023-08-11 | 平安科技(深圳)有限公司 | Urban pet activity track monitoring method based on image recognition and related equipment |
CN110909760A (en) * | 2019-10-12 | 2020-03-24 | 中国人民解放军国防科技大学 | Image open set identification method based on convolutional neural network |
CN110826713A (en) * | 2019-10-25 | 2020-02-21 | 广州思德医疗科技有限公司 | Method and device for acquiring special convolution kernel |
CN110826713B (en) * | 2019-10-25 | 2022-06-10 | 广州思德医疗科技有限公司 | Method and device for acquiring special convolution kernel |
CN111612010A (en) * | 2020-05-21 | 2020-09-01 | 京东方科技集团股份有限公司 | Image processing method, device, equipment and computer readable storage medium |
CN111930935B (en) * | 2020-06-19 | 2024-06-07 | 普联国际有限公司 | Image classification method, device, equipment and storage medium |
CN111930935A (en) * | 2020-06-19 | 2020-11-13 | 普联国际有限公司 | Image classification method, device, equipment and storage medium |
CN112508062A (en) * | 2020-11-20 | 2021-03-16 | 普联国际有限公司 | Open set data classification method, device, equipment and storage medium |
CN112541905B (en) * | 2020-12-16 | 2022-08-05 | 华中科技大学 | Product surface defect identification method based on lifelong learning convolutional neural network |
CN112541905A (en) * | 2020-12-16 | 2021-03-23 | 华中科技大学 | Product surface defect identification method based on lifelong learning convolutional neural network |
CN113743443A (en) * | 2021-05-31 | 2021-12-03 | 高新兴科技集团股份有限公司 | Image evidence classification and identification method and device |
CN113743443B (en) * | 2021-05-31 | 2024-05-17 | 高新兴科技集团股份有限公司 | Image evidence classification and recognition method and device |
CN113705446B (en) * | 2021-08-27 | 2023-04-07 | 电子科技大学 | Open set identification method for individual radiation source |
CN113705446A (en) * | 2021-08-27 | 2021-11-26 | 电子科技大学 | Open set identification method for individual radiation source |
CN115083442A (en) * | 2022-04-29 | 2022-09-20 | 马上消费金融股份有限公司 | Data processing method, data processing device, electronic equipment and computer readable storage medium |
CN115083442B (en) * | 2022-04-29 | 2023-08-08 | 马上消费金融股份有限公司 | Data processing method, device, electronic equipment and computer readable storage medium |
CN116071600A (en) * | 2023-02-17 | 2023-05-05 | 中国科学院地理科学与资源研究所 | Crop remote sensing identification method and device based on multi-classification probability |
CN116071600B (en) * | 2023-02-17 | 2023-08-04 | 中国科学院地理科学与资源研究所 | Crop remote sensing identification method and device based on multi-classification probability |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109376786A (en) | Image classification method and apparatus, terminal device, and readable storage medium | |
CN107766929B (en) | Model analysis method and device | |
CN109934269B (en) | Open set identification method and device for electromagnetic signals | |
CN109241418A (en) | Abnormal user recognition methods and device, equipment, medium based on random forest | |
CN109522304A (en) | Exception object recognition methods and device, storage medium | |
CN108509413A (en) | Digest extraction method, device, computer equipment and storage medium | |
CN109460793A (en) | A kind of method of node-classification, the method and device of model training | |
CN110135505B (en) | Image classification method and device, computer equipment and computer readable storage medium | |
CN108681742B (en) | Analysis method for analyzing sensitivity of driver driving behavior to vehicle energy consumption | |
CN111062036A (en) | Malicious software identification model construction method, malicious software identification medium and malicious software identification equipment | |
CN108228684A (en) | Training method, device, electronic equipment and the computer storage media of Clustering Model | |
CN103310235B (en) | A kind of steganalysis method based on parameter identification and estimation | |
CN113541834B (en) | Abnormal signal semi-supervised classification method and system and data processing terminal | |
CN109214444B (en) | Game anti-addiction determination system and method based on twin neural network and GMM | |
CN110019939A (en) | Video temperature prediction technique, device, terminal device and medium | |
CN116596095B (en) | Training method and device of carbon emission prediction model based on machine learning | |
CN104268572A (en) | Feature extraction and feature selection method oriented to background multi-source data | |
CN112966072A (en) | Case prediction method and device, electronic device and storage medium | |
CN111523964A (en) | Clustering-based recall method and apparatus, electronic device and readable storage medium | |
CN114783021A (en) | Intelligent detection method, device, equipment and medium for wearing of mask | |
CN111047406A (en) | Telecommunication package recommendation method, device, storage medium and equipment | |
CN111539612A (en) | Training method and system of risk classification model | |
CN109583492A (en) | A kind of method and terminal identifying antagonism image | |
CN108764296A (en) | More sorting techniques of study combination are associated with multitask based on K-means | |
CN117197559A (en) | Pork classification model based on deep learning, construction method, electronic equipment and computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190222 ||