CN109583501A - Picture classification method, classification recognition model generation method, apparatus, device and medium - Google Patents
Picture classification method, classification recognition model generation method, apparatus, device and medium
- Publication number
- CN109583501A (application CN201811457125.1A)
- Authority
- CN
- China
- Prior art keywords
- picture
- classification
- training
- stage
- neural net
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Abstract
The invention discloses a picture classification method, a method for generating classification recognition models, and a corresponding apparatus, device and medium. The method comprises: obtaining a picture set to be classified, the picture set including at least two pictures; inputting the picture set into a pre-trained current-stage classification recognition model to obtain a classification score for each picture; if the classification score of a picture satisfies a preset condition, determining the classification recognition result of the picture according to the classification score; if the classification score of a picture does not satisfy the preset condition, continuing to input the picture into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained; wherein each stage of the classification recognition model is generated based on neural network training. The embodiments of the present invention improve the accuracy and efficiency of picture classification.
Description
Technical field
The embodiments of the present invention relate to data processing technology, and in particular to a picture classification method, a method for generating classification recognition models, and a corresponding apparatus, device and medium.
Background
With the rapid development of deep learning technology, deep neural networks are widely used in the field of picture classification.
In the prior art, in order for the classification recognition model generated by training a deep neural network to achieve high classification accuracy, the depth of the deep neural network is usually increased.
In implementing the present invention, the inventors found that the prior art has at least the following problems. First, since a deep neural network is trained mainly by backward gradient propagation, the training difficulty gradually increases as the network depth increases. Second, since the computation required by the forward inference of a deep neural network is already huge, the computation keeps growing as the network depth increases, which reduces classification efficiency.
Summary of the invention
The embodiments of the present invention provide a picture classification method, a method for generating classification recognition models, and a corresponding apparatus, device and medium, so as to improve the accuracy and efficiency of picture classification.
In a first aspect, an embodiment of the present invention provides a picture classification method, the method comprising:
obtaining a picture set to be classified, the picture set including at least two pictures;
inputting the picture set into a pre-trained current-stage classification recognition model to obtain a classification score for each picture;
if the classification score of a picture satisfies a preset condition, determining the classification recognition result of the picture according to the classification score; if the classification score of a picture does not satisfy the preset condition, continuing to input the picture into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained; wherein each stage of the classification recognition model is generated based on neural network training.
Further, after inputting the picture set into the pre-trained current-stage classification recognition model and obtaining the classification score of each picture, the method further includes:
obtaining the class probability of each picture according to the classification score of each picture;
where the classification score of a picture satisfies the preset condition when the class probability of the picture is greater than or equal to a probability threshold, and the classification score of a picture does not satisfy the preset condition when the class probability of the picture is less than the probability threshold.
In a second aspect, an embodiment of the present invention further provides a method for generating classification recognition models, the method comprising:
obtaining training samples, each training sample including a training picture and the original classification label of the training picture;
inputting the training pictures and their original classification labels into a neural network model to obtain the classification score of each stage of neural network layer for each training picture, and the classification score and classification label of each fully connected layer for each training picture, wherein the neural network model includes N stages of neural network layers and N-1 fully connected layers, the i-th fully connected layer follows the (i+1)-th neural network layer, N ≥ 3, i ∈ [1, N-1];
obtaining the first-stage loss function of the first-stage neural network layer according to the classification score of the first-stage neural network layer for the training pictures and the original classification labels of the training pictures;
obtaining the P-th-stage loss function of the P-th-stage neural network layer according to the classification score and classification label of the (P-1)-th fully connected layer for the training pictures, P ∈ [2, N];
determining the loss function of the neural network model according to the loss functions of all stages, and adjusting the network parameters of the neural network layers of all stages and of the fully connected layers of all stages until the loss function of the neural network model reaches a preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage.
Further, each fully connected layer generates its classification score for a training picture in the following way:
obtaining the classification score of the first fully connected layer for the training picture according to the classification score of the first-stage neural network layer and the classification score of the second-stage neural network layer for the training picture;
obtaining the classification score of the P-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer and the classification score of the (P+1)-th-stage neural network layer for the training picture, P ∈ [2, N].
Further, each fully connected layer generates its classification label for a training picture in the following way:
updating the original classification label of the training picture according to the classification score of the first-stage neural network layer for the training picture, to obtain the classification label of the first fully connected layer for the training picture;
updating the classification label of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, to obtain the classification label of the P-th fully connected layer for the training picture, P ∈ [2, N].
Further, updating the original classification label of the training picture according to the classification score of the first-stage neural network layer for the training picture, to obtain the classification label of the first fully connected layer for the training picture, comprises:
obtaining the class probability of the first-stage neural network layer for the training picture according to the classification score of the first-stage neural network layer for the training picture;
if the class probability of the first-stage neural network layer for the training picture is greater than or equal to a first probability threshold, modifying the original classification label of the training picture to a default classification label, and using the default classification label as the classification label of the first fully connected layer for the training picture;
if the class probability of the first-stage neural network layer for the training picture is less than the first probability threshold, keeping the original classification label of the training picture unchanged, and using the original classification label of the training picture as the classification label of the first fully connected layer for the training picture.
Further, updating the classification label of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, to obtain the classification label of the P-th fully connected layer for the training picture, P ∈ [2, N], comprises:
obtaining the class probability of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, P ∈ [2, N];
if the class probability of the (P-1)-th fully connected layer for the training picture is greater than or equal to a P-th probability threshold, modifying the classification label of the (P-1)-th fully connected layer for the training picture to the default classification label, and using the default classification label as the classification label of the P-th fully connected layer for the training picture;
if the class probability of the (P-1)-th fully connected layer for the training picture is less than the P-th probability threshold, keeping the classification label of the (P-1)-th fully connected layer for the training picture unchanged, and using the classification label of the (P-1)-th fully connected layer for the training picture as the classification label of the P-th fully connected layer for the training picture.
Further, determining the loss function of the neural network model according to the loss functions of all stages, and adjusting the network parameters of the neural network layers of all stages and of the fully connected layers of all stages until the loss function of the neural network model reaches the preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage, comprises:
determining the loss function of the neural network model according to the loss functions of all stages;
calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all stages and of the fully connected layers of all stages, wherein in the loss function the partial derivative contributed by a training picture carrying the default classification label is zero;
adjusting the network parameters of the neural network layers of all stages and of the fully connected layers of all stages according to the partial derivatives, and recalculating the loss function, until the loss function reaches the preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage.
In a third aspect, an embodiment of the present invention further provides a picture classification apparatus, the apparatus comprising:
a picture set obtaining module, configured to obtain a picture set to be classified, the picture set including at least two pictures;
a classification result generation module, configured to input the picture set into a pre-trained current-stage classification recognition model to obtain a classification score for each picture;
a classification recognition result generation module, configured to determine the classification recognition result of a picture according to its classification score if the classification score of the picture satisfies a preset condition, and to continue to input the picture into the pre-trained next-stage classification recognition model if the classification score of the picture does not satisfy the preset condition, until the classification recognition result of the picture is obtained; wherein each stage of the classification recognition model is generated based on neural network training.
Further, the apparatus further includes:
a class probability obtaining module, configured to obtain the class probability of each picture according to the classification score of each picture;
where the classification score of a picture satisfies the preset condition when the class probability of the picture is greater than or equal to a probability threshold, and does not satisfy the preset condition when the class probability of the picture is less than the probability threshold.
In a fourth aspect, an embodiment of the present invention further provides an apparatus for generating classification recognition models, the apparatus comprising:
a training sample obtaining module, configured to obtain training samples, each training sample including a training picture and the original classification label of the training picture;
a classification score and classification label generation module, configured to input the training pictures and their original classification labels into a neural network model to obtain the classification score of each stage of neural network layer for each training picture, and the classification score and classification label of each fully connected layer for each training picture, wherein the neural network model includes N stages of neural network layers and N-1 fully connected layers, the i-th fully connected layer follows the (i+1)-th neural network layer, N ≥ 3, i ∈ [1, N-1];
a first-stage loss function generation module, configured to obtain the first-stage loss function of the first-stage neural network layer according to the classification score of the first-stage neural network layer for the training pictures and the original classification labels of the training pictures;
a P-th-stage loss function generation module, configured to obtain the P-th-stage loss function of the P-th-stage neural network layer according to the classification score and classification label of the (P-1)-th fully connected layer for the training pictures, P ∈ [2, N];
a classification recognition model generation module, configured to determine the loss function of the neural network model according to the loss functions of all stages, and to adjust the network parameters of the neural network layers of all stages and of the fully connected layers of all stages until the loss function of the neural network model reaches a preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage.
Further, each fully connected layer generates its classification score for a training picture in the following way:
obtaining the classification score of the first fully connected layer for the training picture according to the classification score of the first-stage neural network layer and the classification score of the second-stage neural network layer for the training picture;
obtaining the classification score of the P-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer and the classification score of the (P+1)-th-stage neural network layer for the training picture, P ∈ [2, N].
Further, each fully connected layer generates its classification label for a training picture in the following way:
updating the original classification label of the training picture according to the classification score of the first-stage neural network layer for the training picture, to obtain the classification label of the first fully connected layer for the training picture;
updating the classification label of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, to obtain the classification label of the P-th fully connected layer for the training picture, P ∈ [2, N].
Further, updating the original classification label of the training picture according to the classification score of the first-stage neural network layer for the training picture, to obtain the classification label of the first fully connected layer for the training picture, comprises:
obtaining the class probability of the first-stage neural network layer for the training picture according to the classification score of the first-stage neural network layer for the training picture;
if the class probability of the first-stage neural network layer for the training picture is greater than or equal to the first probability threshold, modifying the original classification label of the training picture to the default classification label, and using the default classification label as the classification label of the first fully connected layer for the training picture;
if the class probability of the first-stage neural network layer for the training picture is less than the first probability threshold, keeping the original classification label of the training picture unchanged, and using the original classification label of the training picture as the classification label of the first fully connected layer for the training picture.
Further, updating the classification label of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, to obtain the classification label of the P-th fully connected layer for the training picture, P ∈ [2, N], comprises:
obtaining the class probability of the (P-1)-th fully connected layer for the training picture according to the classification score of the (P-1)-th fully connected layer for the training picture, P ∈ [2, N];
if the class probability of the (P-1)-th fully connected layer for the training picture is greater than or equal to the P-th probability threshold, modifying the classification label of the (P-1)-th fully connected layer for the training picture to the default classification label, and using the default classification label as the classification label of the P-th fully connected layer for the training picture;
if the class probability of the (P-1)-th fully connected layer for the training picture is less than the P-th probability threshold, keeping the classification label of the (P-1)-th fully connected layer for the training picture unchanged, and using the classification label of the (P-1)-th fully connected layer for the training picture as the classification label of the P-th fully connected layer for the training picture.
Further, determining the loss function of the neural network model according to the loss functions of all stages, and adjusting the network parameters of the neural network layers of all stages and of the fully connected layers of all stages until the loss function of the neural network model reaches the preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage, comprises:
determining the loss function of the neural network model according to the loss functions of all stages;
calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all stages and of the fully connected layers of all stages, wherein in the loss function the partial derivative contributed by a training picture carrying the default classification label is zero;
adjusting the network parameters of the neural network layers of all stages and of the fully connected layers of all stages according to the partial derivatives, and recalculating the loss function, until the loss function reaches the preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage.
In a fifth aspect, an embodiment of the present invention further provides a device, the device comprising:
one or more processors; and
a memory, configured to store one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to the first aspect or the second aspect of the embodiments of the present invention.
In a sixth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the method according to the first aspect or the second aspect of the embodiments of the present invention.
In the embodiments of the present invention, a picture set to be classified is obtained, the picture set including at least two pictures; the picture set is input into a pre-trained current-stage classification recognition model to obtain a classification score for each picture; if the classification score of a picture satisfies a preset condition, the classification recognition result of the picture is determined according to the classification score; if the classification score of a picture does not satisfy the preset condition, the picture continues to be input into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained, each stage of the classification recognition model being generated based on neural network training. By classifying pictures with multiple stages of classification recognition models, the accuracy and efficiency of picture classification are improved.
Brief description of the drawings
Fig. 1 is a flowchart of a picture classification method in an embodiment of the present invention;
Fig. 2 is a flowchart of another picture classification method in an embodiment of the present invention;
Fig. 3 is a flowchart of a method for generating classification recognition models in an embodiment of the present invention;
Fig. 4 is a flowchart of another method for generating classification recognition models in an embodiment of the present invention;
Fig. 5 is a structural schematic diagram of a neural network model in an embodiment of the present invention;
Fig. 6 is a structural schematic diagram of a picture classification apparatus in an embodiment of the present invention;
Fig. 7 is a structural schematic diagram of an apparatus for generating classification recognition models in an embodiment of the present invention;
Fig. 8 is a structural schematic diagram of a device in an embodiment of the present invention.
Detailed description of the embodiments
In each of the following embodiments, optional features and examples are provided at the same time; the features described in the embodiments can be combined to form multiple optional solutions, and each numbered embodiment should not be regarded as only a single technical solution. The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention and not to limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment
With the continuous development of network technology, networks offer more and more functions. People can upload pictures they have taken to a network platform for other users of the platform to view, for example a short-video application or a live-streaming platform. Since the quality of pictures uploaded by users varies widely, some pictures may not only affect the physical and mental health of other users but may also be against the law. Therefore, the pictures uploaded by users need to be reviewed, and the premise of review is to accurately classify and recognize the uploaded pictures. Moreover, uploaded pictures differ: some are simple pictures and some are difficult pictures, where simple and difficult refer to the difficulty of classification recognition. If the category of a picture is easy to determine, the picture can be called a simple picture; if the category of a picture is not easy to determine, the picture can be called a difficult picture. It should of course be understood that the above is only one application scenario in which picture classification is needed.
In conventional technology, pictures can be classified using a classification recognition model generated by training a deep neural network. To obtain a classification recognition model with high classification accuracy, that is, one that can accurately determine the category of both simple and difficult pictures, the depth of the deep neural network can be increased, but this causes the following problems: since a deep neural network is trained mainly by backward gradient propagation, the training difficulty gradually increases as the network depth increases; furthermore, since the computation required by the forward inference of a deep neural network is huge, the computation keeps growing as the network depth increases, which reduces classification efficiency.
To solve the above problems, that is, to obtain higher classification accuracy and improve classification efficiency without increasing the network depth, multiple stages of classification recognition models can be used. Multiple stages here means classification recognition models of different levels, each stage of which is used to classify and recognize pictures of the corresponding difficulty. The above is further described below with reference to specific embodiments.
Fig. 1 is a flowchart of a picture classification method provided by an embodiment of the present invention. This embodiment is applicable to the case of improving the accuracy and efficiency of picture classification. The method can be executed by a picture classification apparatus, which can be implemented in software and/or hardware and can be configured in a device, typically a computer or a mobile terminal. As shown in Fig. 1, the method specifically includes the following steps:
Step 110: obtain a picture set to be classified, the picture set including at least two pictures.
In an embodiment of the present invention, the picture set to be classified may be a set of pictures uploaded by users to a network platform, or a set of pre-stored pictures; the specific source of the picture set can be chosen according to the actual situation and is not limited here. The picture set includes at least two pictures, and the classification recognition difficulty of the pictures may be the same or different; that is, the picture set contains both easy and difficult pictures, in other words, different pictures in the picture set may need different stages of classification recognition models to be determined.
Step 120: input the picture set into a pre-trained current-stage classification recognition model to obtain the classification score of each picture.
Step 130: determine whether the classification score of a picture satisfies a preset condition; if so, execute Step 140; if not, execute Step 150.
Step 140: determine the classification recognition result of the picture according to its classification score.
Step 150: continue to input the picture into the pre-trained next-stage classification recognition model to obtain its classification score, and return to Step 130.
In an embodiment of the present invention, there are classification recognition models of different levels, and each stage of classification recognition model can classify and recognize pictures of the corresponding difficulty. It should be understood that, since the difficulty of the pictures handled by each stage of classification recognition model is different, the complexity of the network structure of each stage of classification recognition model is generally also different: the more complex the network structure of a classification recognition model, the more difficult the pictures it is used to classify. In this way, staged recognition of pictures can be realized. It can also be understood that, during the staged recognition described above, the number of pictures passing through each successive stage of classification recognition model keeps decreasing, which correspondingly reduces the computation and thus improves classification efficiency. It should be noted that the complexity of the network structure referred to here is relative. It should also be noted that each stage of classification recognition model is generated based on neural network training, and the stages are trained jointly rather than separately; that is, during training, the classification scores of the classification recognition models of the various stages influence one another.
The current-stage classification recognition model may refer to the classification recognition model that recognizes the simplest pictures, and can be understood as the first-stage classification recognition model; the next-stage classification recognition model may refer to a classification recognition model that recognizes pictures of other difficulty levels besides the simplest ones, and can be understood as the second-stage classification recognition model, the third-stage classification recognition model, and so on.
After the picture set to be classified is obtained, the picture set can be input into the pre-trained current-stage classification recognition model to obtain the classification score of each picture in the picture set, and whether each picture needs to be input into the next-stage classification recognition model is determined according to its classification score, until the classification score of every picture in the picture set is obtained. Specifically: the picture set is input into the pre-trained current-stage classification recognition model to obtain the classification score of each picture in the picture set, and it is determined whether the classification score of a picture satisfies the preset condition. If the classification score of a picture satisfies the preset condition, the classification recognition result of the picture can be determined according to its classification score, and the picture is no longer input into the next-stage classification recognition model. If the classification score of a picture does not satisfy the preset condition, the picture continues to be input into the next-stage classification recognition model to obtain its classification score, and it is again determined whether the classification score satisfies the preset condition; if it does, the classification recognition result of the picture is determined according to the classification score and the picture is not passed on; if it does not, the picture is input into the next-stage classification recognition model, until the classification recognition result of every picture in the picture set is obtained. The preset condition may be that the class probability of the picture is greater than or equal to a probability threshold, where the class probability of a picture is calculated from its classification score.
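The staged inference described above can be summarized with a short sketch. The sketch below is only an illustration of one possible reading of this embodiment, not the patented implementation: the stage models are assumed to be callables returning a per-picture classification score, `score_to_probability` is a hypothetical helper standing in for the Softmax/Logistic classifier mentioned later, and the 0.9 threshold is an arbitrary example value.

```python
def classify_pictures(pictures, stage_models, score_to_probability, threshold=0.9):
    """Cascade inference sketch: easy pictures stop at early stages,
    harder pictures fall through to later, more complex stages."""
    results = [None] * len(pictures)
    pending = list(range(len(pictures)))          # indices of not-yet-decided pictures
    for stage, model in enumerate(stage_models):
        last_stage = stage == len(stage_models) - 1
        still_pending = []
        for i in pending:
            score = model(pictures[i])            # classification score from this stage
            if score_to_probability(score) >= threshold or last_stage:
                results[i] = score                # preset condition met (or last-stage fallback)
            else:
                still_pending.append(i)           # defer to the next-stage model
        pending = still_pending
        if not pending:
            break
    return results
```

Each picture leaves the cascade at the first stage whose confidence reaches the threshold, which is what keeps the number of pictures reaching the later, more complex stages small.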
It should be noted that the technical solution provided by the embodiments of the present invention is directed to the binary classification of pictures; binary classification here means that the classification score indicates yes or no, where yes and no can be characterized by preset labels. For example, to determine whether a picture contains illegal content, "yes" can be characterized by "1" and "no" by "0". If the classification score is "1" (yes), the picture contains illegal content; if the classification score is "0" (no), the picture does not contain illegal content.
Based on the above, determining the classification recognition result of a picture according to its classification score can be understood as follows: the classification scores are set in advance, for example "1" and "0", where "1" means yes and "0" means no, and the concrete meaning of yes and no is determined by the content to be recognized. As in the previous example of determining whether a picture contains illegal content, if the classification score is "1" (yes), the picture contains illegal content, and if the classification score is "0" (no), the picture does not contain illegal content.
It can be understood that, if the pictures in the picture set differ in difficulty, the number of pictures input into the next-stage classification recognition model decreases stage by stage. In addition, since simple pictures are usually the majority, in other words most pictures in the picture set can be accurately classified by the current-stage classification recognition model, the number of pictures still to be classified by the next-stage classification recognition model is small, which improves the classification efficiency of the classification recognition models. Meanwhile, the above process also reflects staged recognition according to picture difficulty; compared with classifying pictures of all difficulties with the same classification recognition model, the classification accuracy is improved. The improvement in classification accuracy can be understood as follows: when a single classification recognition model is trained, the classification scores that drive backward gradient propagation include those of pictures of all difficulties, not only the classification scores of difficult pictures. When classification recognition models of different levels are trained, the backward gradient propagation does not involve pictures whose classification has already been recognized; that is, when a later-stage classification recognition model is trained, the backward gradient propagation is driven by the classification scores of the more difficult pictures. This training mechanism makes the specialization of each trained classification recognition model more pronounced.
It should be noted that, if there are still pictures whose classification scores do not satisfy the preset condition after passing through all the pre-trained classification recognition models, the classification recognition results of those pictures can be determined according to the classification scores obtained by the last-stage classification recognition model.
For example, suppose there are N stages of classification recognition models, specifically a first-stage classification recognition model, a second-stage classification recognition model, ..., an (N-1)-th-stage classification recognition model and an N-th-stage classification recognition model, and the picture set to be classified includes M pictures. The M pictures are input into the first-stage classification recognition model to obtain the classification score of each picture; the classification scores of U pictures are determined to satisfy the preset condition, so their classification recognition results are determined according to the classification scores of these U pictures. The remaining (M-U) pictures continue to be input into the second-stage classification recognition model; the classification scores of K pictures are determined to satisfy the preset condition, so their classification recognition results are determined according to the classification scores of these K pictures. The remaining (M-U-K) pictures then continue to be input into the third-stage classification recognition model; the classification scores of these (M-U-K) pictures are determined to satisfy the preset condition, so their classification recognition results are determined according to their classification scores. At this point the classification recognition result of every picture in the picture set has been obtained, and the classification recognition of the picture set to be classified ends.
In the technical solution of this embodiment, a picture set to be classified is obtained, the picture set including at least two pictures; the picture set is input into a pre-trained current-stage classification recognition model to obtain a classification score for each picture; if the classification score of a picture satisfies a preset condition, the classification recognition result of the picture is determined according to the classification score; if the classification score of a picture does not satisfy the preset condition, the picture continues to be input into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained; each stage of the classification recognition model is generated based on neural network training. By classifying pictures with multiple stages of classification recognition models, the accuracy and efficiency of picture classification are improved.
Optionally, on the basis of the above technical solution, after the picture set is input into the pre-trained current-stage classification recognition model and the classification score of each picture is obtained, the method may further include: obtaining the class probability of each picture according to its classification score. The classification score of a picture satisfies the preset condition when the class probability of the picture is greater than or equal to a probability threshold, and does not satisfy the preset condition when the class probability of the picture is less than the probability threshold.
In an embodiment of the present invention, the picture set is input into the pre-trained current-stage classification recognition model to obtain the classification score of each picture; a classification score can be understood as a vector, and the classification scores of the picture set are formed by the classification scores of the individual pictures.
A classifier is used to calculate the class probability of a picture from its classification score. Accordingly, determining the classification recognition result of a picture according to its classification score if its classification score satisfies the preset condition may specifically include: if the class probability of the picture is greater than or equal to the probability threshold, determining the classification recognition result of the picture according to its classification score. Inputting the picture into the pre-trained next-stage classification recognition model if its classification score does not satisfy the preset condition, until the classification recognition result of the picture is obtained, may specifically include: if the class probability of the picture is less than the probability threshold, inputting the picture into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained. It should be noted that the classifier may be Softmax, Logistic, or the like.
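As an illustration of the confidence check only, the class probability can be obtained from a two-class classification score with Softmax as follows; the score values and the 0.9 probability threshold below are made-up examples, not values taken from the patent.

```python
import numpy as np

def softmax(scores):
    # scores: per-class classification scores for one picture, e.g. [score_no, score_yes]
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

scores = np.array([0.3, 2.1])                       # illustrative two-class score vector
probabilities = softmax(scores)
threshold = 0.9
meets_preset_condition = probabilities.max() >= threshold   # keep here or defer to next stage
predicted_class = int(np.argmax(probabilities))              # 1 = "yes", 0 = "no" in this example
```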
Fig. 2 is a flowchart of another picture classification method provided by an embodiment of the present invention. As shown in Fig. 2, the method specifically includes the following steps:
Step 210: obtain a picture set to be classified, the picture set including at least two pictures.
Step 220: input the picture set into a pre-trained current-stage classification recognition model to obtain the classification score of each picture.
Step 230: obtain the class probability of each picture according to its classification score.
Step 240: determine whether the class probability of a picture is greater than or equal to a probability threshold; if so, execute Step 250; if not, execute Step 260.
Step 250: determine the classification recognition result of the picture according to its classification score.
Step 260: continue to input the picture into the pre-trained next-stage classification recognition model to obtain its classification score, and return to Step 230.
In an embodiment of the present invention, it should be noted that each stage of the classification recognition model is generated based on neural network training.
In the technical solution of this embodiment, a picture set to be classified is obtained, the picture set including at least two pictures; the picture set is input into a pre-trained current-stage classification recognition model to obtain the classification score of each picture; the class probability of each picture is obtained according to its classification score; if the class probability of a picture is greater than or equal to the probability threshold, the classification recognition result of the picture is determined according to its classification score; if the class probability of a picture is less than the probability threshold, the picture continues to be input into the pre-trained next-stage classification recognition model until the classification recognition result of the picture is obtained; each stage of the classification recognition model is generated based on neural network training. By classifying pictures with multiple stages of classification recognition models, the accuracy and efficiency of picture classification are improved.
Fig. 3 is a flowchart of a method for generating classification recognition models provided by an embodiment of the present invention. This embodiment is applicable to the case of improving the accuracy and efficiency of picture classification. The method can be executed by an apparatus for generating classification recognition models, which can be implemented in software and/or hardware and can be configured in a device, typically a computer or a mobile terminal. As shown in Fig. 3, the method specifically includes the following steps:
Step 310: obtain training samples, each training sample including a training picture and the original classification label of the training picture.
In an embodiment of the present invention, training samples are obtained; a training sample may include a training picture and the original classification label of the training picture, and the number of training pictures is at least two. A classification label is used to characterize the category to which a training picture belongs.
Step 320: input the training pictures and their original classification labels into a neural network model to obtain the classification score of each stage of neural network layer for each training picture, and the classification score and classification label of each fully connected layer for each training picture; the neural network model includes N stages of neural network layers and N-1 fully connected layers, the i-th fully connected layer follows the (i+1)-th neural network layer, N ≥ 3, i ∈ [1, N-1].
In an embodiment of the present invention, the neural network model may include N stages of neural network layers and N-1 fully connected layers, where the i-th fully connected layer lies between the (i+1)-th neural network layer and the (i+2)-th neural network layer, N ≥ 3, i ∈ [1, N-1]. A neural network is a mathematical model that, based on the basic principles of biological neural networks and after understanding and abstracting the structure of the human brain and its response mechanism to external stimuli, simulates the way the human nervous system processes complex information, with network topology as its theoretical basis. The model relies on the complexity of the system and processes information by adjusting the interconnection weights among a large number of internal nodes (neurons).
Neural networks include convolutional neural networks, recurrent neural networks and deep neural networks. Taking a convolutional neural network as an example, the key problem it solves is how to automatically extract and abstract features and then map the features to the task objective to solve practical problems. A convolutional neural network generally consists of three parts: the first part is the input layer; the second part is composed of convolutional layers, activation layers and pooling layers (or down-sampling layers); the third part consists of a fully connected multi-layer perceptron classifier (i.e., fully connected layers). A convolutional neural network has the characteristic of weight sharing; weight sharing refers to the convolution kernel, whereby the same feature at different locations of the image data can be extracted by the operation of a single convolution kernel, in other words, the same object at different locations of one image has essentially the same features. It can be understood that one convolution kernel can only obtain part of the features; by setting multiple convolution kernels, each kernel can learn a different feature, so that the features of the picture can be extracted. In picture classification, the role of the convolutional layers is to analyze and extract low-level features into high-level features: low-level features are basic features such as texture and edges, while high-level features such as faces and object shapes can better express the attributes of the sample. This process is exactly the hierarchical nature of convolutional neural networks.
It should be noted that, in the whole convolutional neural network, the fully connected layer plays the role of a "classifier". If the operations of the convolutional layers, activation layers and pooling layers map the original data into a hidden feature space, the fully connected layer maps the learned "distributed feature representation" into the sample label space. In practice, a fully connected layer can be implemented by a convolution operation: a fully connected layer whose preceding layer is also fully connected can be converted into a convolution with a 1x1 kernel, and a fully connected layer whose preceding layer is a convolutional layer can be converted into a global convolution with an H × W kernel, where H and W are the height and width of the preceding convolution output. At present, because fully connected layers have redundant parameters (they alone can account for about 80% of the parameters of the whole network), some well-performing network models, such as residual network models, use global average pooling instead of fully connected layers to fuse the learned deep features; that is, a convolutional neural network may not include fully connected layers.
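The equivalence between a fully connected layer and a global convolution mentioned above can be checked with a short PyTorch sketch; PyTorch is used here purely for illustration (the patent does not prescribe a framework), and the 64-channel 7x7 feature map is an arbitrary example.

```python
import torch
import torch.nn as nn

# A fully connected layer applied to a flattened 64 x 7 x 7 feature map ...
fc = nn.Linear(64 * 7 * 7, 10)

# ... behaves like a global convolution whose kernel covers the whole 7 x 7 map.
conv = nn.Conv2d(64, 10, kernel_size=7)
conv.weight.data = fc.weight.data.view(10, 64, 7, 7)
conv.bias.data = fc.bias.data

x = torch.randn(1, 64, 7, 7)
out_fc = fc(x.flatten(1))
out_conv = conv(x).flatten(1)
print(torch.allclose(out_fc, out_conv, atol=1e-4))  # True: same mapping, two formulations
```

Reshaping the fully connected weights into an H × W kernel reproduces the same mapping, which is why the two formulations are interchangeable.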
It should also be noted that the fully connected layers of the neural network model provided by the embodiments of the present invention are fully connected layers other than those inside the N stages of neural network layers; that is, each stage of neural network layer may itself include fully connected layers, but those are different from the fully connected layers of the neural network model.
The training samples are input into the neural network model, that is, the training pictures and their original classification labels are input into the neural network model, to obtain the classification score of each stage of neural network layer for each training picture, and the classification score and classification label of each fully connected layer for each training picture. The classification score and classification label of a fully connected layer for a training picture are used to calculate the loss function of the corresponding neural network layer, and the classification score of a neural network layer for a training picture is used to calculate the classification scores and classification labels of the fully connected layers.
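One possible reading of this arrangement is sketched below in PyTorch, purely as an illustration: each stage is modeled as a small convolutional backbone emitting a two-class score, and each fully connected (fusion) layer combines the score passed down from the previous fusion layer with the next stage's score, so that fusion layer i follows stage i+1. The layer sizes, the number of stages and the choice of backbone are assumptions; the patent does not fix them.

```python
import torch
import torch.nn as nn

class Stage(nn.Module):
    """One stage: a small backbone that turns a picture into a 2-class score."""
    def __init__(self, channels=16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(channels, 2)

    def forward(self, x):
        return self.head(self.backbone(x))           # per-stage classification score

class StagedClassifier(nn.Module):
    """N stages plus N-1 fusion ('fully connected') layers."""
    def __init__(self, num_stages=3):
        super().__init__()
        self.stages = nn.ModuleList(Stage() for _ in range(num_stages))
        self.fusions = nn.ModuleList(nn.Linear(4, 2) for _ in range(num_stages - 1))

    def forward(self, x):
        stage_scores = [stage(x) for stage in self.stages]
        fused_scores = []
        previous = stage_scores[0]
        for i, fusion in enumerate(self.fusions):
            # fusion i combines the previous fused score with stage (i+1)'s score
            previous = fusion(torch.cat([previous, stage_scores[i + 1]], dim=1))
            fused_scores.append(previous)
        return stage_scores, fused_scores

scores, fused = StagedClassifier()(torch.randn(4, 3, 32, 32))
```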
Step 330: obtain the first-stage loss function of the first-stage neural network layer according to the classification score of the first-stage neural network layer for the training pictures and the original classification labels of the training pictures.
Step 340: obtain the P-th-stage loss function of the P-th-stage neural network layer according to the classification score and classification label of the (P-1)-th fully connected layer for the training pictures, P ∈ [2, N].
Step 350: determine the loss function of the neural network model according to the loss functions of all stages, and adjust the network parameters of the neural network layers of all stages and of the fully connected layers of all stages, until the loss function of the neural network model reaches a preset function value, whereupon each stage of neural network layer serves as the classification recognition model of the corresponding stage.
In an embodiment of the present invention, loss function is that be mapped as the event or value of one or more variables can be with
Intuitively indicate that the function of the real number of certain associated " cost ", i.e. loss function map the event of one or more variables
Onto real number relevant to some cost.Loss function can be used between measurement model performance and actual value and predicted value
Inconsistency, model performance increase with the reduction of the value of loss function.For the embodiment of the present invention, prediction here
Value refers to classification of the first order neural net layer to the classification score and full articulamentum at different levels of training picture to training picture
Score, the original classification label and full articulamentum at different levels that actual value refers to training picture are to the tag along sort for training picture.
It should be noted that loss function can be cross entropy loss function, 0-1 loss function, quadratic loss function, absolutely loss letter
Several and logarithm loss function etc., can specifically be set according to the actual situation, be not specifically limited herein.
The training process of the neural network model is as follows: the loss function of the neural network model is calculated by forward propagation, the partial derivatives of the loss function with respect to the network parameters are calculated, and the network parameters of the neural network layers of all levels and the fully connected layers of all levels are adjusted using backward gradient propagation until the loss function of the neural network model reaches the preset function value. When the value of the loss function of the neural network model reaches the preset function value, the neural network model has finished training; at this point, the network parameters of the neural network layers of all levels and the fully connected layers of all levels are also determined. On this basis, each level of neural network layer serves as the classification identification model of the respective level, i.e. the first-level neural network layer serves as the first-level classification identification model and the P-th level neural network layer serves as the P-th level classification identification model, P ∈ [2, N].
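As a concrete illustration of the training loop just described, the following sketch, assuming PyTorch and using small Linear layers as stand-ins for the N levels of neural network layers and the N-1 levels of fully connected layers, shows forward propagation of the per-level losses, backward gradient propagation and parameter adjustment until a preset function value is reached. All dimensions, the fusion of scores by concatenation, and the omission of the label-update step are illustrative assumptions, not the specific structure fixed by the embodiment.

import torch
import torch.nn as nn

N, feat_dim, num_classes = 3, 16, 5
nn_layers = nn.ModuleList([nn.Linear(feat_dim, num_classes) for _ in range(N)])
fc_layers = nn.ModuleList([nn.Linear(2 * num_classes, num_classes) for _ in range(N - 1)])
optimizer = torch.optim.SGD(list(nn_layers.parameters()) + list(fc_layers.parameters()), lr=0.01)
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, feat_dim)                   # stand-in features of 8 training pictures
labels = torch.randint(0, num_classes, (8,))   # original classification labels

for step in range(200):
    optimizer.zero_grad()
    nn_scores = [layer(x) for layer in nn_layers]        # each level's classification score
    losses = [criterion(nn_scores[0], labels)]           # first-level loss: original labels
    prev = nn_scores[0]
    for p in range(1, N):                                # build FC layer p and the (p+1)-th level loss
        fc_score = fc_layers[p - 1](torch.cat([prev, nn_scores[p]], dim=1))
        losses.append(criterion(fc_score, labels))       # label updating omitted in this sketch
        prev = fc_score
    total = sum(losses)                                  # weighted sum with all coefficients set to 1
    total.backward()                                     # backward gradient propagation
    optimizer.step()                                     # adjust all levels' network parameters
    if total.item() < 0.1:                               # stand-in for the preset function value
        break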
It should be noted that the loss function of the neural network model described in the embodiment of the present invention is obtained by a weighted sum of the loss functions of the N levels of neural network layers. The first-level loss function of the first-level neural network layer is calculated from the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; the P-th level loss function of the P-th level neural network layer is calculated from the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N].
Furthermore, it should be understood that every level of fully connected layer has its own classification label for the training picture; in other words, each level of fully connected layer updates the classification label of the training picture once. "Update" is described further here: for each training picture, the P-th level fully connected layer's classification label for that picture may be the same as, or different from, the classification labels of the higher-level fully connected layers for that picture, where "higher-level" refers to all levels before the P-th level. Therefore, "update" here refers to performing the update operation; the result of the update operation may be that the classification label of the training picture is changed (i.e. the P-th level fully connected layer's classification label for the picture differs from the higher-level fully connected layers' classification label), or that the classification label of the training picture is left unchanged (i.e. the P-th level fully connected layer's classification label for the picture is the same as the higher-level fully connected layers' classification label).
It should also be understood that because the classification scores and classification labels on which the loss functions of the different levels of neural network layers are based are not identical, the resulting loss functions of the different levels are not identical either. Therefore, when the network parameters of the neural network layers and fully connected layers of all levels are adjusted based on the loss function of the neural network model, the finally determined neural network layer structures of the different levels differ in complexity, and correspondingly, so do the classification identification model structures of the different levels. On this basis, the classification identification models of different levels can be used to classify pictures of correspondingly different difficulty: a simple picture can obtain a satisfactory classification result through a structurally simple classification identification model, while a difficult picture needs a structurally more complex classification model to obtain a satisfactory classification result. That is, each level of classification identification model handles training pictures of the corresponding complexity, rather than every level handling all training pictures, which greatly improves classification efficiency.
Meanwhile N grades of neural net layers and N-1 grades of full articulamentums are that coorinated training generates, rather than generation is respectively trained
, the result of neural net layers at different levels and full articulamentum at different levels influences each other.The nerves at different levels that training obtains through the above way
The performance of network layer will be better than the neural net layer being only trained to a neural net layer.And due to every grade of nerve net
Classification and Identification model of the network layers as respective stages, therefore, the property for the Classification and Identification models at different levels that training obtains through the above way
It can will be better than the Classification and Identification model being only trained to a neural net layer.
In addition, when training the N levels of neural network layers, they can be initialized by loading a pre-training model. A pre-training model here refers to a model that has already finished training and that is used to classify training samples similar to those of the N levels of neural network layers to be trained.
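By way of illustration, a pre-training model could be loaded for initialization along the following lines, assuming PyTorch; the file name "pretrained_level1.pth" and the Linear stand-in for the first-level neural network layer are hypothetical.

import torch
import torch.nn as nn

level1 = nn.Linear(16, 5)   # stand-in for the first-level neural network layer to be trained
pretrained_state = torch.load("pretrained_level1.pth", map_location="cpu")  # hypothetical model trained on similar samples
own_state = level1.state_dict()
# copy only the parameters whose names and shapes match the layer being initialized
compatible = {k: v for k, v in pretrained_state.items()
              if k in own_state and v.shape == own_state[k].shape}
own_state.update(compatible)
level1.load_state_dict(own_state)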
In the technical solution of this embodiment, a training sample is obtained, the training sample including a training picture and the original classification label of the training picture; the training picture and its original classification label are input into the neural network model to obtain each level of neural network layer's classification score for the training picture and each level of fully connected layer's classification score and classification label for the training picture, where the neural network model includes N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1]. The first-level loss function of the first-level neural network layer is obtained according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; the P-th level loss function of the P-th level neural network layer is obtained according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N]. The loss function of the neural network model is determined according to the loss functions of all levels, and the network parameters of the neural network layers and fully connected layers of all levels are adjusted until the loss function of the neural network model reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level. Multi-level classification identification models are thus obtained by co-training, improving the accuracy and efficiency with which the classification identification models classify pictures.
Optionally, on the basis of the above technical solution, each level of fully connected layer's classification score for the training picture can be generated as follows: the first-level fully connected layer's classification score for the training picture is obtained according to the first-level neural network layer's classification score for the training picture and the second-level neural network layer's classification score for the training picture; the P-th level fully connected layer's classification score for the training picture is obtained according to the (P-1)-th level fully connected layer's classification score for the training picture and the (P+1)-th level neural network layer's classification score for the training picture, P ∈ [2, N].
In an embodiment of the present invention, the fully connected layers of all levels other than the first-level fully connected layer can generate their classification scores for the training picture as follows: the P-th level fully connected layer's classification score for the training picture is obtained from the (P-1)-th level fully connected layer's classification score for the training picture and the (P+1)-th level neural network layer's classification score for the training picture, where P ∈ [2, N]. The first-level fully connected layer can generate its classification score for the training picture as follows: the first-level fully connected layer's classification score for the training picture is obtained according to the first-level neural network layer's classification score for the training picture and the second-level neural network layer's classification score for the training picture.
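One plausible realization of these two score-generation rules is sketched below in plain NumPy; the fusion of the two input scores by concatenation followed by a linear map, and the weight and bias arguments, are illustrative assumptions, since the embodiment does not fix the exact fusion operation.

import numpy as np

def first_fc_score(nn1_score, nn2_score, w1, b1):
    # first-level FC layer: fuse the level-1 and level-2 neural-network scores
    fused = np.concatenate([nn1_score, nn2_score], axis=-1)
    return fused @ w1 + b1

def pth_fc_score(prev_fc_score, next_nn_score, wp, bp):
    # P-th level FC layer (P >= 2): fuse the (P-1)-th FC score with the
    # (P+1)-th neural-network layer's score
    fused = np.concatenate([prev_fc_score, next_nn_score], axis=-1)
    return fused @ wp + bp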
Optionally, on the basis of the above technical solution, each level of fully connected layer's classification label for the training picture can be generated as follows: the original classification label of the training picture is updated according to the first-level neural network layer's classification score for the training picture, giving the first-level fully connected layer's classification label for the training picture; the (P-1)-th level fully connected layer's classification label for the training picture is updated according to the (P-1)-th level fully connected layer's classification score for the training picture, giving the P-th level fully connected layer's classification label for the training picture, P ∈ [2, N].
In an embodiment of the present invention, every level of fully connected layer has its own classification label for the training picture; in other words, each level of fully connected layer updates the classification label of the training picture once. Specifically, each level of fully connected layer's classification label for the training picture can be generated as follows: the original classification label of the training picture is updated according to the first-level neural network layer's classification score for the training picture to obtain the first-level fully connected layer's classification label for the training picture, and the (P-1)-th level fully connected layer's classification label for the training picture is updated according to the (P-1)-th level fully connected layer's classification score for the training picture. It should be noted that "update" here refers to performing the update operation; whether the classification label is actually changed can be determined by whether the network layer's classification score for the training picture meets a preset condition. The preset condition here can be: the network layer's class probability for the training picture, obtained from the network layer's classification score for the training picture, is greater than or equal to a probability threshold.
Optionally, on the basis of the above technical solution, updating the original classification label of the training picture according to the first-level neural network layer's classification score for the training picture to obtain the first-level fully connected layer's classification label for the training picture can specifically include: obtaining the first-level neural network layer's class probability for the training picture according to the first-level neural network layer's classification score for the training picture. If the first-level neural network layer's class probability for the training picture is greater than or equal to the first probability threshold, the original classification label of the training picture is modified to the default classification label, and the default classification label is used as the first-level fully connected layer's classification label for the training picture. If the first-level neural network layer's class probability for the training picture is less than the first probability threshold, the original classification label of the training picture is kept unchanged, and the original classification label of the training picture is used as the first-level fully connected layer's classification label for the training picture.
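For a single training picture, this first-level update rule can be written as the following small sketch; the value used to mark the default classification label is an illustrative assumption.

def update_first_level_label(class_prob, original_label, first_threshold, default_label=-1):
    # class_prob: the first-level neural network layer's class probability for the picture
    if class_prob >= first_threshold:   # classification result already satisfactory
        return default_label            # picture will not drive lower-level parameter updates
    return original_label               # keep the original label; picture keeps training

print(update_first_level_label(0.95, 3, 0.9))  # -> -1 (default classification label)
print(update_first_level_label(0.40, 3, 0.9))  # -> 3 (original label retained)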
In an embodiment of the present invention, a classifier can be used to convert the classification score of the training picture into the class probability of the training picture. The classifier here can be the Softmax function: the Softmax function maps classification scores into the interval (0, 1), where they can be interpreted as probabilities, so the classification score of the training picture can be converted into the class probability of the training picture by Softmax. The classifier can also be a Logistic function; which classifier to use can be determined according to the actual situation and is not specifically limited here.
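A minimal Softmax sketch, showing how a vector of classification scores is mapped to probabilities in (0, 1) and how a single class probability for a picture could then be read off; the example scores are illustrative.

import numpy as np

def softmax(scores):
    shifted = scores - np.max(scores)   # subtract the maximum for numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum()

probs = softmax(np.array([2.0, 0.5, -1.0]))   # classification scores for one picture
class_prob = probs.max()                      # probability of the predicted class
print(probs, class_prob)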
The first-level neural network layer's class probability for the training picture is obtained according to the first-level neural network layer's classification score for the training picture. If the first-level neural network layer's class probability for the training picture is greater than or equal to the first probability threshold, the original classification label of the training picture can be modified to the default classification label, and the default classification label is used as the first-level fully connected layer's classification label for the training picture; if the first-level neural network layer's class probability for the training picture is less than the first probability threshold, the original classification label of the training picture can be kept unchanged and used as the first-level fully connected layer's classification label for the training picture. The first probability threshold serves as the criterion for whether the original classification label of the training picture is modified; its specific value can be set according to the actual situation and is not specifically limited here. It should be noted that the object considered here is each individual training picture: the relationship between each training picture's class probability and the first probability threshold needs to be determined, and based on the result it is decided whether that training picture's classification label is modified or retained.
It should also be noted that if the first-level neural network layer's class probability for the training picture is greater than or equal to the first probability threshold, the reason for modifying the training picture's classification label to the default classification label is this: such a class probability indicates that the first-level neural network layer's classification result for the training picture already meets the requirement, so by modifying the picture's classification label to the default classification label, when the network parameters are subsequently adjusted according to the loss function, the training picture corresponding to the default classification label does not participate in adjusting the network parameters of the lower-level neural network layers and fully connected layers. If the first-level neural network layer's class probability for the training picture is less than the first probability threshold, the reason for keeping the training picture's original classification label unchanged is this: such a class probability indicates that the first-level neural network layer's classification result for the training picture does not meet the requirement, so by keeping the picture's original classification label unchanged, when the network parameters are subsequently adjusted according to the loss function, the training picture participates in adjusting the network parameters of the lower-level neural network layers and fully connected layers.
Optionally, on the basis of the above technical solution, updating the (P-1)-th level fully connected layer's classification label for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, to obtain the P-th level fully connected layer's classification label for the training picture, P ∈ [2, N], can specifically include: obtaining the (P-1)-th level fully connected layer's class probability for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, P ∈ [2, N]. If the (P-1)-th level fully connected layer's class probability for the training picture is greater than or equal to the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture is modified to the default classification label, and the default classification label is used as the P-th level fully connected layer's classification label for the training picture. If the (P-1)-th level fully connected layer's class probability for the training picture is less than the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture is kept unchanged and used as the P-th level fully connected layer's classification label for the training picture.
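The same rule for the P-th level, and the way the label propagates level by level, can be sketched as follows for one training picture; the per-level probabilities, thresholds and the default-label marker are illustrative assumptions.

def update_pth_level_label(prev_fc_prob, prev_fc_label, p_threshold, default_label=-1):
    # prev_fc_prob / prev_fc_label: the (P-1)-th level FC layer's class probability and label
    if prev_fc_prob >= p_threshold:
        return default_label
    return prev_fc_label

label = 3                                   # original classification label of the picture
level_probs = [0.40, 0.95, 0.55]            # level-1 NN layer, then FC layers; same rule at each step
thresholds = [0.9, 0.9, 0.9]                # first, second and third probability thresholds
for prob, threshold in zip(level_probs, thresholds):
    label = update_pth_level_label(prob, label, threshold)
print(label)                                # -> -1: modified at the second step, then retained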
In an embodiment of the present invention, as noted above, a classifier can likewise be used to convert the classification score of the training picture into the class probability of the training picture; the classifier here can be the Softmax function or a Logistic function, and which classifier to use can be determined according to the actual situation and is not specifically limited here.
The (P-1)-th level fully connected layer's class probability for the training picture is obtained according to the (P-1)-th level fully connected layer's classification score for the training picture, P ∈ [2, N]. If the (P-1)-th level fully connected layer's class probability for the training picture is greater than or equal to the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture can be modified to the default classification label, and the default classification label is used as the P-th level fully connected layer's classification label for the training picture; if the (P-1)-th level fully connected layer's class probability for the training picture is less than the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture can be kept unchanged and used as the P-th level fully connected layer's classification label for the training picture. The P-th probability threshold serves as the criterion for whether the (P-1)-th level fully connected layer's classification label for the training picture is modified; its specific value can be set according to the actual situation and is not specifically limited here. It should be noted that, as before, the object considered here is still each individual training picture: the relationship between each training picture's class probability and the P-th probability threshold needs to be determined, and based on the result it is decided whether that training picture's classification label is modified or retained.
It should also be noted that if the (P-1)-th level fully connected layer's class probability for the training picture is greater than or equal to the P-th probability threshold, the reason the (P-1)-th level fully connected layer's classification label for the training picture is modified to the default classification label is this: such a class probability indicates that the P-th level neural network layer's classification result for the training picture already meets the requirement, so by modifying the picture's classification label to the default classification label, when the network parameters are subsequently adjusted according to the loss function, the training picture corresponding to the default classification label does not participate in adjusting the network parameters of the lower-level neural network layers and fully connected layers. If the (P-1)-th level fully connected layer's class probability for the training picture is less than the P-th probability threshold, the reason the (P-1)-th level fully connected layer's classification label for the training picture is kept unchanged is this: such a class probability indicates that the P-th level neural network layer's classification result for the training picture does not meet the requirement, so by keeping the picture's classification label unchanged, when the network parameters are subsequently adjusted according to the loss function, the training picture participates in adjusting the network parameters of the lower-level neural network layers and fully connected layers.
Through the above mechanism, each level of fully connected layer updates the classification label of the training picture once, so that training pictures that are simple to classify do not participate in adjusting the network parameters of the lower-level neural network layers and fully connected layers, and the structures of the neural network layers of the different levels obtained by training therefore differ in complexity.
Optionally, on the basis of the above technical solution, determining the loss function of the neural network model according to the loss functions of all levels, and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches the preset function value, so that each level of neural network layer serves as the classification identification model of the respective level, can specifically include: determining the loss function of the neural network model according to the loss functions of all levels; calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels, where the partial derivatives contributed by the training pictures corresponding to the default classification label in the loss function are zero; and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels according to the partial derivatives and recalculating the loss function, until the loss function reaches the preset function value, at which point each level of neural network layer serves as the classification identification model of the respective level.
In an embodiment of the present invention, determining the loss function of the neural network model according to the loss functions of all levels can be understood as follows: the loss functions of all levels are weighted and summed to obtain the loss function of the neural network model. A proportionality coefficient can be set for each level of loss function; each level of loss function is multiplied by its corresponding proportionality coefficient to obtain a weighted value, and the weighted values are then added to obtain the loss function of the neural network model. Illustratively, if the loss function of each level of neural network layer is Loss(f_i) and the proportionality coefficient corresponding to each level of loss function is T_i, where i ∈ [1, N], then the loss function of the neural network model can be expressed as Loss = Σ_{i=1}^{N} T_i · Loss(f_i). On this basis it can be understood that the proportion of each level's loss function Loss(f_i) within the loss function of the neural network model can be adjusted by adjusting the corresponding proportionality coefficient. It should be noted that the loss function can be a cross-entropy loss function, a 0-1 loss function, a quadratic loss function, an absolute loss function, a logarithmic loss function and so on; it can be set according to the actual situation and is not specifically limited here.
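The weighted sum just described amounts to the following small sketch, with illustrative per-level loss values and proportionality coefficients T_i:

level_losses = [0.8, 0.5, 0.3]            # Loss(f_i) for a three-level model, illustrative values
coefficients = [1.0, 0.5, 0.25]           # proportionality coefficients T_i, tunable per level
model_loss = sum(t * l for t, l in zip(coefficients, level_losses))
print(model_loss)                         # 1.0*0.8 + 0.5*0.5 + 0.25*0.3 = 1.125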
After the loss function of the neural network model is determined according to the loss functions of all levels, the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels are calculated; the network parameters here include weights and biases. Using backward gradient propagation, the network parameters of the neural network layers of all levels and the fully connected layers of all levels are adjusted according to the partial derivatives, and the loss function is recalculated, until the loss function reaches the preset function value. The preset function value here can be the minimum loss function value; once the loss function reaches the preset function value, the neural network model has finished training, the neural network layers of all levels are determined by their network parameters, and each level of neural network layer serves as the classification identification model of the respective level. It should be noted that during the training of the neural network model, the partial derivatives contributed by the training pictures corresponding to the default classification label in the loss function are zero, i.e. the training pictures corresponding to the default classification label do not participate in adjusting the network parameters of the lower-level neural network layers and fully connected layers.
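One way to realize the rule that default-labelled training pictures contribute zero partial derivatives is to mask their per-sample losses, as in the following sketch assuming PyTorch; the marker value for the default classification label and all tensor shapes are illustrative.

import torch
import torch.nn as nn

DEFAULT_LABEL = -1                                    # illustrative marker for the default classification label
criterion = nn.CrossEntropyLoss(reduction="none")     # keep per-sample losses

scores = torch.randn(4, 5, requires_grad=True)        # stand-in classification scores for 4 pictures
labels = torch.tensor([2, DEFAULT_LABEL, 0, DEFAULT_LABEL])

mask = (labels != DEFAULT_LABEL).float()              # 1 for pictures that still participate in training
per_sample = criterion(scores, labels.clamp(min=0))   # clamp avoids indexing with -1
loss = (per_sample * mask).sum() / mask.sum().clamp(min=1)
loss.backward()                                       # masked pictures contribute zero gradient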
Fig. 4 is a flow chart of another generation method of a classification identification model provided by an embodiment of the present invention. This embodiment is applicable to the case of improving the accuracy and efficiency of picture classification. The method can be executed by a generating means of a classification identification model, which can be realized by means of software and/or hardware and can be configured in a device, typically a computer or a mobile terminal, for example. As shown in Fig. 4, the method specifically comprises the following steps:
Step 410, obtaining a training sample, the training sample including a training picture and the original classification label of the training picture.
Step 420, inputting the training picture and the original classification label of the training picture into the neural network model, and obtaining each level of neural network layer's classification score for the training picture.
Step 430, obtaining the first-level fully connected layer's classification score for the training picture according to the first-level neural network layer's classification score for the training picture and the second-level neural network layer's classification score for the training picture; and obtaining the P-th level fully connected layer's classification score for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture and the (P+1)-th level neural network layer's classification score for the training picture, P ∈ [2, N].
Step 440, updating the original classification label of the training picture according to the first-level neural network layer's classification score for the training picture, to obtain the first-level fully connected layer's classification label for the training picture; and updating the (P-1)-th level fully connected layer's classification label for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, to obtain the P-th level fully connected layer's classification label for the training picture.
Step 450, obtaining the first-level loss function of the first-level neural network layer according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; and obtaining the P-th level loss function of the P-th level neural network layer according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture.
Step 460, determining the loss function of the neural network model according to the loss functions of all levels.
Step 470, calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels, where the partial derivatives contributed by the training pictures corresponding to the default classification label in the loss function are zero.
Step 480, adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels according to the partial derivatives, and recalculating the loss function, until the loss function reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level.
In an embodiment of the present invention, updating the original classification label of the training picture according to the first-level neural network layer's classification score for the training picture, to obtain the first-level fully connected layer's classification label for the training picture, can specifically include: obtaining the first-level neural network layer's class probability for the training picture according to the first-level neural network layer's classification score for the training picture; if the first-level neural network layer's class probability for the training picture is greater than or equal to the first probability threshold, modifying the original classification label of the training picture to the default classification label and using the default classification label as the first-level fully connected layer's classification label for the training picture; and if the first-level neural network layer's class probability for the training picture is less than the first probability threshold, keeping the original classification label of the training picture unchanged and using it as the first-level fully connected layer's classification label for the training picture.
Updating the (P-1)-th level fully connected layer's classification label for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, to obtain the P-th level fully connected layer's classification label for the training picture, P ∈ [2, N], can specifically include: obtaining the (P-1)-th level fully connected layer's class probability for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, P ∈ [2, N]; if the (P-1)-th level fully connected layer's class probability for the training picture is greater than or equal to the P-th probability threshold, modifying the (P-1)-th level fully connected layer's classification label for the training picture to the default classification label and using the default classification label as the P-th level fully connected layer's classification label for the training picture; and if the (P-1)-th level fully connected layer's class probability for the training picture is less than the P-th probability threshold, keeping the (P-1)-th level fully connected layer's classification label for the training picture unchanged and using it as the P-th level fully connected layer's classification label for the training picture.
For a better understanding of the technical solution provided by the embodiment of the present invention, an illustration is given below using a neural network model that includes three levels of neural network layers and two levels of fully connected layers, specifically:
Fig. 5 is a structural schematic diagram of a neural network model provided by an embodiment of the present invention. The neural network model includes three levels of neural network layers and two levels of fully connected layers, namely a first-level neural network layer, a second-level neural network layer and a third-level neural network layer, as well as a first-level fully connected layer and a second-level fully connected layer, where the first-level fully connected layer is located after the second-level neural network layer and the second-level fully connected layer is located after the third-level neural network layer.
A training sample is obtained, the training sample including a training picture and the original classification label of the training picture, and the training sample is input into the neural network model to obtain the classification score of each level of neural network layer. The first-level fully connected layer's classification score for the training picture is obtained according to the first-level neural network layer's classification score for the training picture and the second-level neural network layer's classification score for the training picture; the second-level fully connected layer's classification score for the training picture is obtained according to the first-level fully connected layer's classification score for the training picture and the third-level neural network layer's classification score for the training picture.
The original classification label of the training picture is updated according to the first-level neural network layer's classification score for the training picture, giving the first-level fully connected layer's classification label for the training picture; the first-level fully connected layer's classification label for the training picture is updated according to the first-level fully connected layer's classification score for the training picture, giving the second-level fully connected layer's classification label for the training picture.
The first-level loss function of the first-level neural network layer is obtained according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; the second-level loss function of the second-level neural network layer is obtained according to the first-level fully connected layer's classification score and classification label for the training picture; and the third-level loss function of the third-level neural network layer is obtained according to the second-level fully connected layer's classification score and classification label for the training picture.
The loss function of the neural network model is determined according to the first-level, second-level and third-level loss functions.
The partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels are calculated, where the partial derivatives contributed by the training pictures corresponding to the default classification label in the loss function are zero.
The network parameters of the neural network layers of all levels and the fully connected layers of all levels are adjusted according to the partial derivatives, and the loss function is recalculated, until the loss function reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level, i.e. the first-level neural network layer serves as the first-level classification identification model, the second-level neural network layer serves as the second-level classification identification model, and the third-level neural network layer serves as the third-level classification identification model.
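The Fig. 5 structure can be sketched in a forward-only form as follows, again assuming PyTorch with Linear layers as stand-ins and illustrative dimensions; only the data flow between the three neural network layers and the two fully connected layers is shown.

import torch
import torch.nn as nn

feat_dim, num_classes = 16, 5
level1, level2, level3 = (nn.Linear(feat_dim, num_classes) for _ in range(3))
fc1, fc2 = (nn.Linear(2 * num_classes, num_classes) for _ in range(2))

x = torch.randn(1, feat_dim)                        # one training picture
s1, s2, s3 = level1(x), level2(x), level3(x)        # per-level classification scores
fc1_score = fc1(torch.cat([s1, s2], dim=1))         # first-level FC layer: fuse s1 and s2
fc2_score = fc2(torch.cat([fc1_score, s3], dim=1))  # second-level FC layer: fuse fc1_score and s3
# the first-level loss uses s1 and the original label; the second-level loss uses fc1_score
# and the first-level FC layer's label; the third-level loss uses fc2_score and the
# second-level FC layer's label; the model loss is their weighted sum.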
In the technical solution of this embodiment, a training sample is obtained, the training sample including a training picture and the original classification label of the training picture; the training picture and its original classification label are input into the neural network model to obtain each level of neural network layer's classification score for the training picture and each level of fully connected layer's classification score and classification label for the training picture, where the neural network model includes N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1]. The first-level loss function of the first-level neural network layer is obtained according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; the P-th level loss function of the P-th level neural network layer is obtained according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N]. The loss function of the neural network model is determined according to the loss functions of all levels, and the network parameters of the neural network layers and fully connected layers of all levels are adjusted until the loss function of the neural network model reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level. Multi-level classification identification models are thus obtained by co-training, improving the accuracy and efficiency with which the classification identification models classify pictures.
Fig. 6 is a structural schematic diagram of a picture classification apparatus provided by an embodiment of the present invention. This embodiment is applicable to the case of improving the accuracy and efficiency of picture classification. The apparatus can be realized by means of software and/or hardware and can be configured in a device, typically a computer or a mobile terminal, for example. As shown in Fig. 6, the apparatus specifically includes:
A picture set obtaining module 510, configured to obtain a picture set to be classified, the picture set including at least two pictures.
A classification result generation module 520, configured to input the picture set into a pre-trained current-level classification identification model to obtain the classification score of each picture.
A classification recognition result generation module 530, configured to determine the classification recognition result of a picture according to its classification score if the classification score of the picture meets the preset condition, and to continue inputting the picture into the pre-trained next-level classification identification model if the classification score of the picture does not meet the preset condition, until the classification recognition result of the picture is obtained; each level of classification identification model is generated based on neural network training.
In the technical solution of this embodiment, a picture set to be classified is obtained, the picture set including at least two pictures; the picture set is input into the pre-trained current-level classification identification model to obtain the classification score of each picture; if the classification score of a picture meets the preset condition, the classification recognition result of the picture is determined according to its classification score; if the classification score of the picture does not meet the preset condition, the picture continues to be input into the pre-trained next-level classification identification model, until the classification recognition result of the picture is obtained. Each level of classification identification model is generated based on neural network training, and pictures are classified using multi-level classification identification models, improving the accuracy and efficiency of picture classification.
Optionally, on the basis of the above technical solution, the apparatus can specifically further include:
A class probability obtaining module, configured to obtain the class probability of each picture according to the classification score of each picture.
The classification score of a picture meets the preset condition when the class probability of the picture is greater than or equal to the probability threshold; the classification score of a picture does not meet the preset condition when the class probability of the picture is less than the probability threshold.
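The cascaded use of the classification identification models at inference time can be sketched as follows; the models are assumed to return class-probability arrays, and the threshold value and dummy models are illustrative.

import numpy as np

def classify(picture, models, threshold=0.9):
    # pass the picture through the classification identification models level by level
    for level, model in enumerate(models, start=1):
        probs = model(picture)                  # class probabilities from this level's model
        if probs.max() >= threshold:            # preset condition met: stop at this level
            return level, int(probs.argmax())   # classification recognition result
    return len(models), int(probs.argmax())     # otherwise take the last level's result

dummy_models = [lambda p: np.array([0.4, 0.6]), lambda p: np.array([0.05, 0.95])]
print(classify(np.zeros(8), dummy_models))      # -> (2, 1): resolved at the second level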
The picture classification apparatus configured in a device provided by the embodiment of the present invention can execute the picture classification method applied to a device provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects for executing the method.
Fig. 7 is a structural schematic diagram of a generating means of a classification identification model provided by an embodiment of the present invention. This embodiment is applicable to the case of improving the accuracy and efficiency of picture classification. The apparatus can be realized by means of software and/or hardware and can be configured in a device, typically a computer or a mobile terminal, for example. As shown in Fig. 7, the apparatus specifically includes:
A training sample obtaining module 610, configured to obtain a training sample, the training sample including a training picture and the original classification label of the training picture.
A classification score and classification label generation module 620, configured to input the training picture and the original classification label of the training picture into the neural network model, and to obtain each level of neural network layer's classification score for the training picture and each level of fully connected layer's classification score and classification label for the training picture, where the neural network model includes N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1].
A first-level loss function generation module 630, configured to obtain the first-level loss function of the first-level neural network layer according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture.
A P-th level loss function generation module 640, configured to obtain the P-th level loss function of the P-th level neural network layer according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N].
A classification identification model generation module 650, configured to determine the loss function of the neural network model according to the loss functions of all levels, and to adjust the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level.
In the technical solution of this embodiment, a training sample is obtained, the training sample including a training picture and the original classification label of the training picture; the training picture and its original classification label are input into the neural network model to obtain each level of neural network layer's classification score for the training picture and each level of fully connected layer's classification score and classification label for the training picture, where the neural network model includes N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1]. The first-level loss function of the first-level neural network layer is obtained according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture; the P-th level loss function of the P-th level neural network layer is obtained according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N]. The loss function of the neural network model is determined according to the loss functions of all levels, and the network parameters of the neural network layers and fully connected layers of all levels are adjusted until the loss function of the neural network model reaches the preset function value; each level of neural network layer then serves as the classification identification model of the respective level. Multi-level classification identification models are thus obtained by co-training, improving the accuracy and efficiency with which the classification identification models classify pictures.
Optionally, on the basis of the above technical solution, each level of fully connected layer's classification score for the training picture can be generated as follows: the first-level fully connected layer's classification score for the training picture is obtained according to the first-level neural network layer's classification score for the training picture and the second-level neural network layer's classification score for the training picture; the P-th level fully connected layer's classification score for the training picture is obtained according to the (P-1)-th level fully connected layer's classification score for the training picture and the (P+1)-th level neural network layer's classification score for the training picture, P ∈ [2, N].
Optionally, on the basis of the above technical solution, each level of fully connected layer's classification label for the training picture can be generated as follows: the original classification label of the training picture is updated according to the first-level neural network layer's classification score for the training picture, giving the first-level fully connected layer's classification label for the training picture; the (P-1)-th level fully connected layer's classification label for the training picture is updated according to the (P-1)-th level fully connected layer's classification score for the training picture, giving the P-th level fully connected layer's classification label for the training picture, P ∈ [2, N].
Optionally, on the basis of the above technical solution, updating the original classification label of the training picture according to the first-level neural network layer's classification score for the training picture, to obtain the first-level fully connected layer's classification label for the training picture, can specifically include: obtaining the first-level neural network layer's class probability for the training picture according to the first-level neural network layer's classification score for the training picture.
If the first-level neural network layer's class probability for the training picture is greater than or equal to the first probability threshold, the original classification label of the training picture is modified to the default classification label, and the default classification label is used as the first-level fully connected layer's classification label for the training picture. If the first-level neural network layer's class probability for the training picture is less than the first probability threshold, the original classification label of the training picture is kept unchanged and used as the first-level fully connected layer's classification label for the training picture.
Optionally, on the basis of the above technical solution, updating the (P-1)-th level fully connected layer's classification label for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, to obtain the P-th level fully connected layer's classification label for the training picture, P ∈ [2, N], can specifically include: obtaining the (P-1)-th level fully connected layer's class probability for the training picture according to the (P-1)-th level fully connected layer's classification score for the training picture, P ∈ [2, N].
If the (P-1)-th level fully connected layer's class probability for the training picture is greater than or equal to the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture is modified to the default classification label, and the default classification label is used as the P-th level fully connected layer's classification label for the training picture. If the (P-1)-th level fully connected layer's class probability for the training picture is less than the P-th probability threshold, the (P-1)-th level fully connected layer's classification label for the training picture is kept unchanged and used as the P-th level fully connected layer's classification label for the training picture.
Optionally, on the basis of the above technical solution, determining the loss function of the neural network model according to the loss functions of all levels, and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches the preset function value, so that each level of neural network layer serves as the classification identification model of the respective level, can specifically include: determining the loss function of the neural network model according to the loss functions of all levels; calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels, where the partial derivatives contributed by the training pictures corresponding to the default classification label in the loss function are zero; and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels according to the partial derivatives and recalculating the loss function, until the loss function reaches the preset function value, at which point each level of neural network layer serves as the classification identification model of the respective level.
The generating means of a classification identification model configured in a device provided by the embodiment of the present invention can execute the generation method of a classification identification model applied to a device provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects for executing the method.
Fig. 8 is a structural schematic diagram of a device provided by an embodiment of the present invention. Fig. 8 shows a block diagram of an example device 712 suitable for implementing embodiments of the present invention. The device 712 shown in Fig. 8 is only an example and should not impose any restriction on the function and scope of use of the embodiments of the present invention.
As shown in Fig. 8, the device 712 takes the form of a general-purpose computing device. The components of the device 712 may include, but are not limited to, one or more processors 716, a system memory 728, and a bus 718 that connects the different system components (including the system memory 728 and the processors 716).
The bus 718 represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus and the Peripheral Component Interconnect (PCI) bus.
The device 712 typically comprises a variety of computer-system-readable media. These media can be any usable media accessible by the device 712, including volatile and non-volatile media as well as removable and non-removable media.
The system memory 728 may include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 730 and/or a cache memory 732. The device 712 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 734 can be used to read and write a non-removable, non-volatile magnetic medium (not shown in Fig. 8, commonly referred to as a "hard disk drive"). Although not shown in Fig. 8, a disk drive for reading and writing a removable non-volatile magnetic disk (such as a "floppy disk") and an optical disc drive for reading and writing a removable non-volatile optical disc (such as a Compact Disc Read-Only Memory (CD-ROM), a Digital Video Disc Read-Only Memory (DVD-ROM) or other optical media) can be provided. In these cases, each drive can be connected to the bus 718 through one or more data media interfaces. The memory 728 may include at least one program product having a set (for example, at least one) of program modules configured to perform the functions of the various embodiments of the present invention.
A program/utility 740 having a set (at least one) of program modules 742 can be stored, for example, in the memory 728. Such program modules 742 include, but are not limited to, an operating system, one or more application programs, other program modules and program data, and each or some combination of these examples may include an implementation of a network environment. The program modules 742 usually perform the functions and/or methods in the embodiments described in the present invention.
The device 712 can also communicate with one or more external devices 714 (such as a keyboard, a pointing device, a display 724, etc.), with one or more devices that enable a user to interact with the device 712, and/or with any device (such as a network card, a modem, etc.) that enables the device 712 to communicate with one or more other computing devices. Such communication can be carried out through an input/output (I/O) interface 722. Moreover, the device 712 can also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network, for example the Internet) through a network adapter 720. As shown, the network adapter 720 communicates with the other modules of the device 712 through the bus 718. It should be understood that, although not shown in Fig. 8, other hardware and/or software modules can be used in conjunction with the device 712, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, data backup storage systems and the like.
The processor 716 executes various functional applications and data processing by running programs stored in the system memory 728, for example realizing a picture classification method provided by the embodiment of the present invention, the method comprising:
Obtaining a picture set to be classified, the picture set including at least two pictures.
Inputting the picture set into the pre-trained current-level classification identification model and obtaining the classification score of each picture.
If the classification score of a picture meets the preset condition, determining the classification recognition result of the picture according to its classification score; if the classification score of the picture does not meet the preset condition, continuing to input the picture into the pre-trained next-level classification identification model until the classification recognition result of the picture is obtained; each level of classification identification model is generated based on neural network training.
The embodiment of the present invention also provides another device, comprising one or more processors and a memory for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors realize a generation method of a classification identification model provided by the embodiment of the present invention, the method comprising:
Obtaining a training sample, the training sample including a training picture and the original classification label of the training picture.
Inputting the training picture and the original classification label of the training picture into the neural network model, and obtaining each level of neural network layer's classification score for the training picture and each level of fully connected layer's classification score and classification label for the training picture, where the neural network model includes N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1].
Obtaining the first-level loss function of the first-level neural network layer according to the first-level neural network layer's classification score for the training picture and the original classification label of the training picture.
Obtaining the P-th level loss function of the P-th level neural network layer according to the (P-1)-th level fully connected layer's classification score and classification label for the training picture, P ∈ [2, N].
The loss function of neural network model is determined according to loss functions at different levels, and adjusts neural net layers at different levels and each
The network parameter of the full articulamentum of grade, until the loss function of neural network model reaches preset function value, then every grade of neural network
Classification and Identification model of the layer as respective stages.
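As a rough illustration of the training structure described above, the sketch below builds N per-level classifiers plus N-1 fully connected layers, computes one loss per level, and updates the labels passed to later levels. It is an interpretation under assumptions rather than the disclosed implementation: the PyTorch framework, the layer shapes, the use of cross-entropy, the 0.9 confidence threshold, and the use of ignore_index to zero the gradients of pictures carrying the preset classification label are all choices made for the example.

```python
# Hypothetical sketch of the N-level model and its per-level loss functions.
import torch
import torch.nn as nn
import torch.nn.functional as F

N = 3                      # number of neural-network levels (N >= 3)
NUM_CLASSES = 10
PRESET_LABEL = -100        # "preset" classification label; its gradient is masked out below

class CascadedModel(nn.Module):
    def __init__(self):
        super().__init__()
        # N per-level feature extractors, each ending in a classification head
        self.levels = nn.ModuleList([
            nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64),
                          nn.ReLU(), nn.Linear(64, NUM_CLASSES))
            for _ in range(N)])
        # N-1 fully connected layers; the i-th one fuses the previous score with level i+1's score
        self.fcs = nn.ModuleList([nn.Linear(2 * NUM_CLASSES, NUM_CLASSES) for _ in range(N - 1)])

    def forward(self, x):
        level_scores = [level(x) for level in self.levels]    # per-level classification scores
        fc_scores, prev = [], level_scores[0]
        for i, fc in enumerate(self.fcs):
            prev = fc(torch.cat([prev, level_scores[i + 1]], dim=1))
            fc_scores.append(prev)
        return level_scores, fc_scores

def update_labels(scores, labels, threshold=0.9):
    """Replace the labels of already-confident pictures with the preset label."""
    confident = F.softmax(scores, dim=1).max(dim=1).values >= threshold
    new_labels = labels.clone()
    new_labels[confident] = PRESET_LABEL
    return new_labels

def model_loss(model, pictures, original_labels):
    level_scores, fc_scores = model(pictures)
    # first-level loss: first-level scores against the original classification labels
    losses = [F.cross_entropy(level_scores[0], original_labels)]
    labels = update_labels(level_scores[0], original_labels)
    for p in range(2, N + 1):                     # P-th level loss, P in [2, N]
        if (labels != PRESET_LABEL).any():        # skip if every picture is already decided
            losses.append(F.cross_entropy(fc_scores[p - 2], labels, ignore_index=PRESET_LABEL))
        labels = update_labels(fc_scores[p - 2], labels)
    return sum(losses)                            # loss of the whole neural network model
```

A training loop would then repeatedly evaluate model_loss, for example with loss = model_loss(model, torch.randn(8, 3, 32, 32), torch.randint(0, NUM_CLASSES, (8,))), call loss.backward(), and step an optimizer, until the total loss falls below the chosen preset function value; each trained level then serves as the classification recognition model of the corresponding level.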
Of course, it will be understood by those skilled in the art that the processor can also implement the technical solution of the picture classification method applied to a device, or of the method for generating a classification recognition model applied to a device, provided by any embodiment of the present invention. For the hardware structure and functions of the device, reference can be made to the explanations in the embodiments.
The embodiments of the present invention also provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the picture classification method provided by the embodiments of the present invention is implemented, the method comprising:
Obtaining a picture set to be classified, the picture set comprising at least two pictures.
Inputting the picture set into a pre-trained current-level classification recognition model to obtain a classification score for each picture.
If the classification score of a picture satisfies a preset condition, determining the classification recognition result of the picture according to the classification score; if the classification score of a picture does not satisfy the preset condition, continuing to input the picture into a pre-trained next-level classification recognition model, until the classification recognition result of the picture is obtained; wherein each level of classification recognition model is generated based on neural network training.
The computer storage medium of the embodiments of the present invention may adopt any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (Random Access Memory, RAM), a read-only memory (Read-Only Memory, ROM), an erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), a flash memory, an optical fiber, a portable compact disc read-only memory (Compact Disc Read-Only Memory, CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by, or in connection with, an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and that medium can send, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device.
The program code contained on a computer-readable medium may be transmitted by any suitable medium, including but not limited to wireless, wire, optical cable, RF, etc., or any suitable combination of the above.
The computer program code for performing the operations of the present invention may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (Local Area Network, LAN) or a wide area network (Wide Area Network, WAN), or it may be connected to an external computer (for example, through the Internet using an Internet service provider).
The embodiments of the present invention also provide another computer-readable storage medium; the computer-executable instructions stored on it, when executed by a computer processor, are used to perform a method for generating a classification recognition model, the method comprising:
Obtaining a training sample, the training sample comprising a training picture and an original classification label of the training picture.
Inputting the training picture and the original classification label of the training picture into a neural network model to obtain the classification score of each level of neural network layer for the training picture, and the classification score and classification label of each level of fully connected layer for the training picture, where the neural network model comprises N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1].
Obtaining the first-level loss function of the first-level neural network layer according to the classification score of the first-level neural network layer for the training picture and the original classification label of the training picture.
Obtaining the P-th level loss function of the P-th level neural network layer according to the classification score and classification label of the (P-1)-th level fully connected layer for the training picture, P ∈ [2, N].
Determining the loss function of the neural network model according to the loss functions of all levels, and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches a preset function value, and then taking each level of neural network layer as the classification recognition model of the corresponding level.
Of course, in the computer-readable storage medium provided by the embodiments of the present invention, the computer-executable instructions are not limited to the method operations described above; they can also perform the related operations in the picture classification method of a device and in the method for generating a classification recognition model provided by any embodiment of the present invention. For an introduction to the storage medium, reference can be made to the explanations in the embodiments.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described herein, and that various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, the present invention is not limited to the above embodiments; it may also include more other equivalent embodiments without departing from the inventive concept, and the scope of the present invention is determined by the scope of the appended claims.
Claims (12)
1. A picture classification method, characterized by comprising:
obtaining a picture set to be classified, the picture set comprising at least two pictures;
inputting the picture set into a pre-trained current-level classification recognition model to obtain a classification score for each picture;
if the classification score of a picture satisfies a preset condition, determining the classification recognition result of the picture according to the classification score; if the classification score of a picture does not satisfy the preset condition, continuing to input the picture into a pre-trained next-level classification recognition model, until the classification recognition result of the picture is obtained; wherein each level of classification recognition model is generated based on neural network training.
2. The method according to claim 1, characterized in that, after inputting the picture set into the pre-trained current-level classification recognition model and obtaining the classification score of each picture, the method further comprises:
obtaining the class probability of each picture according to the classification score of each picture;
the classification score of a picture satisfies the preset condition when the class probability of the picture is greater than or equal to a probability threshold; the classification score of a picture does not satisfy the preset condition when the class probability of the picture is less than the probability threshold.
3. A method for generating a classification recognition model, characterized by comprising:
obtaining a training sample, the training sample comprising a training picture and an original classification label of the training picture;
inputting the training picture and the original classification label of the training picture into a neural network model to obtain the classification score of each level of neural network layer for the training picture, and the classification score and classification label of each level of fully connected layer for the training picture, wherein the neural network model comprises N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1];
obtaining the first-level loss function of the first-level neural network layer according to the classification score of the first-level neural network layer for the training picture and the original classification label of the training picture;
obtaining the P-th level loss function of the P-th level neural network layer according to the classification score and classification label of the (P-1)-th level fully connected layer for the training picture, P ∈ [2, N];
determining the loss function of the neural network model according to the loss functions of all levels, and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches a preset function value, and then taking each level of neural network layer as the classification recognition model of the corresponding level.
4. The method according to claim 3, characterized in that the classification score of each level of fully connected layer for the training picture is generated in the following manner:
obtaining the classification score of the first-level fully connected layer for the training picture according to the classification score of the first-level neural network layer for the training picture and the classification score of the second-level neural network layer for the training picture;
obtaining the classification score of the P-th level fully connected layer for the training picture according to the classification score of the (P-1)-th level fully connected layer for the training picture and the classification score of the (P+1)-th level neural network layer for the training picture, P ∈ [2, N].
5. The method according to claim 3, characterized in that the classification label of each level of fully connected layer for the training picture is generated in the following manner:
updating the original classification label of the training picture according to the classification score of the first-level neural network layer for the training picture, to obtain the classification label of the first-level fully connected layer for the training picture;
updating the classification label of the (P-1)-th level fully connected layer for the training picture according to the classification score of the (P-1)-th level fully connected layer for the training picture, to obtain the classification label of the P-th level fully connected layer for the training picture, P ∈ [2, N].
6. The method according to claim 5, characterized in that updating the original classification label of the training picture according to the classification score of the first-level neural network layer for the training picture, to obtain the classification label of the first-level fully connected layer for the training picture, comprises:
obtaining the class probability of the first-level neural network layer for the training picture according to the classification score of the first-level neural network layer for the training picture;
if the class probability of the first-level neural network layer for the training picture is greater than or equal to a first probability threshold, modifying the original classification label of the training picture to a preset classification label, and using the preset classification label as the classification label of the first-level fully connected layer for the training picture;
if the class probability of the first-level neural network layer for the training picture is less than the first probability threshold, keeping the original classification label of the training picture unchanged, and using the original classification label of the training picture as the classification label of the first-level fully connected layer for the training picture.
7. The method according to claim 6, characterized in that updating the classification label of the (P-1)-th level fully connected layer for the training picture according to the classification score of the (P-1)-th level fully connected layer for the training picture, to obtain the classification label of the P-th level fully connected layer for the training picture, P ∈ [2, N], comprises:
obtaining the class probability of the (P-1)-th level fully connected layer for the training picture according to the classification score of the (P-1)-th level fully connected layer for the training picture, P ∈ [2, N];
if the class probability of the (P-1)-th level fully connected layer for the training picture is greater than or equal to a P-th probability threshold, modifying the classification label of the (P-1)-th level fully connected layer for the training picture to the preset classification label, and using the preset classification label as the classification label of the P-th level fully connected layer for the training picture;
if the class probability of the (P-1)-th level fully connected layer for the training picture is less than the P-th probability threshold, keeping the classification label of the (P-1)-th level fully connected layer for the training picture unchanged, and using the classification label of the (P-1)-th level fully connected layer for the training picture as the classification label of the P-th level fully connected layer for the training picture.
8. The method according to claim 7, characterized in that determining the loss function of the neural network model according to the loss functions of all levels, and adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches the preset function value, and then taking each level of neural network layer as the classification recognition model of the corresponding level, comprises:
determining the loss function of the neural network model according to the loss functions of all levels;
calculating the partial derivatives of the loss function with respect to the network parameters of the neural network layers of all levels and the fully connected layers of all levels, wherein in the loss function the partial derivative corresponding to a training picture carrying the preset classification label is zero;
adjusting the network parameters of the neural network layers of all levels and the fully connected layers of all levels according to the partial derivatives, and recalculating the loss function, until the loss function reaches the preset function value, and then taking each level of neural network layer as the classification recognition model of the corresponding level.
9. A picture classification apparatus, characterized by comprising:
a picture set obtaining module, configured to obtain a picture set to be classified, the picture set comprising at least two pictures;
a classification result generation module, configured to input the picture set into a pre-trained current-level classification recognition model to obtain a classification score for each picture;
a classification recognition result generation module, configured to: if the classification score of a picture satisfies a preset condition, determine the classification recognition result of the picture according to the classification score; if the classification score of a picture does not satisfy the preset condition, continue to input the picture into a pre-trained next-level classification recognition model, until the classification recognition result of the picture is obtained; wherein each level of classification recognition model is generated based on neural network training.
10. An apparatus for generating a classification recognition model, characterized by comprising:
a training sample obtaining module, configured to obtain a training sample, the training sample comprising a training picture and an original classification label of the training picture;
a classification score and classification label generation module, configured to input the training picture and the original classification label of the training picture into a neural network model to obtain the classification score of each level of neural network layer for the training picture, and the classification score and classification label of each level of fully connected layer for the training picture, wherein the neural network model comprises N levels of neural network layers and N-1 levels of fully connected layers, the i-th level fully connected layer is located after the (i+1)-th level neural network layer, N ≥ 3, i ∈ [1, N-1];
a first-level loss function generation module, configured to obtain the first-level loss function of the first-level neural network layer according to the classification score of the first-level neural network layer for the training picture and the original classification label of the training picture;
a P-th level loss function generation module, configured to obtain the P-th level loss function of the P-th level neural network layer according to the classification score and classification label of the (P-1)-th level fully connected layer for the training picture, P ∈ [2, N];
a classification recognition model generation module, configured to determine the loss function of the neural network model according to the loss functions of all levels, and adjust the network parameters of the neural network layers of all levels and the fully connected layers of all levels until the loss function of the neural network model reaches a preset function value, and then take each level of neural network layer as the classification recognition model of the corresponding level.
11. A device, characterized by comprising:
one or more processors;
a memory, configured to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-8.
12. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the method according to any one of claims 1-8 is implemented.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811457125.1A CN109583501B (en) | 2018-11-30 | 2018-11-30 | Method, device, equipment and medium for generating image classification and classification recognition model |
PCT/CN2019/120903 WO2020108474A1 (en) | 2018-11-30 | 2019-11-26 | Picture classification method, classification identification model generation method and apparatus, device, and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811457125.1A CN109583501B (en) | 2018-11-30 | 2018-11-30 | Method, device, equipment and medium for generating image classification and classification recognition model |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109583501A true CN109583501A (en) | 2019-04-05 |
CN109583501B CN109583501B (en) | 2021-05-07 |
Family
ID=65926768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811457125.1A Active CN109583501B (en) | 2018-11-30 | 2018-11-30 | Method, device, equipment and medium for generating image classification and classification recognition model |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109583501B (en) |
WO (1) | WO2020108474A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110210356A (en) * | 2019-05-24 | 2019-09-06 | 厦门美柚信息科技有限公司 | A kind of picture discrimination method, apparatus and system |
CN110222724A (en) * | 2019-05-15 | 2019-09-10 | 平安科技(深圳)有限公司 | A kind of picture example detection method, apparatus, computer equipment and storage medium |
CN110738267A (en) * | 2019-10-18 | 2020-01-31 | 北京达佳互联信息技术有限公司 | Image classification method and device, electronic equipment and storage medium |
CN110990627A (en) * | 2019-12-05 | 2020-04-10 | 北京奇艺世纪科技有限公司 | Knowledge graph construction method and device, electronic equipment and medium |
WO2020108474A1 (en) * | 2018-11-30 | 2020-06-04 | 广州市百果园信息技术有限公司 | Picture classification method, classification identification model generation method and apparatus, device, and medium |
CN111738290A (en) * | 2020-05-14 | 2020-10-02 | 北京沃东天骏信息技术有限公司 | Image detection method, model construction and training method, device, equipment and medium |
CN111782905A (en) * | 2020-06-29 | 2020-10-16 | 中国工商银行股份有限公司 | Data packaging method and device, terminal equipment and readable storage medium |
CN112286440A (en) * | 2020-11-20 | 2021-01-29 | 北京小米移动软件有限公司 | Touch operation classification method and device, model training method and device, terminal and storage medium |
CN112445410A (en) * | 2020-12-07 | 2021-03-05 | 北京小米移动软件有限公司 | Touch event identification method and device and computer readable storage medium |
CN112465042A (en) * | 2020-12-02 | 2021-03-09 | 中国联合网络通信集团有限公司 | Generation method and device of classification network model |
CN112686289A (en) * | 2020-12-24 | 2021-04-20 | 微梦创科网络科技(中国)有限公司 | Picture classification method and device |
CN112784985A (en) * | 2021-01-29 | 2021-05-11 | 北京百度网讯科技有限公司 | Training method and device of neural network model, and image recognition method and device |
CN113063843A (en) * | 2021-02-22 | 2021-07-02 | 广州杰赛科技股份有限公司 | Pipeline defect identification method and device and storage medium |
CN113705735A (en) * | 2021-10-27 | 2021-11-26 | 北京值得买科技股份有限公司 | Label classification method and system based on mass information |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3975579A4 (en) * | 2019-05-23 | 2022-12-14 | LG Electronics Inc. | Display device |
US12033385B2 (en) | 2020-02-27 | 2024-07-09 | Lg Electronics Inc. | Display device |
CN111783861A (en) * | 2020-06-22 | 2020-10-16 | 北京百度网讯科技有限公司 | Data classification method, model training device and electronic equipment |
CN112084944B (en) * | 2020-09-09 | 2024-07-12 | 清华大学 | Dynamic evolution expression recognition method and system |
CN112182269B (en) * | 2020-09-27 | 2023-11-28 | 北京达佳互联信息技术有限公司 | Training of image classification model, image classification method, device, equipment and medium |
CN113361451B (en) * | 2021-06-24 | 2024-04-30 | 福建万福信息技术有限公司 | Ecological environment target identification method based on multistage model and preset point automatic adjustment |
CN113935407A (en) * | 2021-09-29 | 2022-01-14 | 光大科技有限公司 | Abnormal behavior recognition model determining method and device |
CN114170186A (en) * | 2021-12-08 | 2022-03-11 | 江苏硕世生物科技股份有限公司 | Gram staining slice integral Nugent analysis method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106096670A (en) * | 2016-06-17 | 2016-11-09 | 北京市商汤科技开发有限公司 | Concatenated convolutional neural metwork training and image detecting method, Apparatus and system |
US20170161592A1 (en) * | 2015-12-04 | 2017-06-08 | Pilot Ai Labs, Inc. | System and method for object detection dataset application for deep-learning algorithm training |
CN108509978A (en) * | 2018-02-28 | 2018-09-07 | 中南大学 | The multi-class targets detection method and model of multi-stage characteristics fusion based on CNN |
CN108875456A (en) * | 2017-05-12 | 2018-11-23 | 北京旷视科技有限公司 | Object detection method, object detecting device and computer readable storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103679185B (en) * | 2012-08-31 | 2017-06-16 | 富士通株式会社 | Convolutional neural networks classifier system, its training method, sorting technique and purposes |
US10915817B2 (en) * | 2017-01-23 | 2021-02-09 | Fotonation Limited | Method of training a neural network |
CN107403198B (en) * | 2017-07-31 | 2020-12-22 | 广州探迹科技有限公司 | Official website identification method based on cascade classifier |
CN109583501B (en) * | 2018-11-30 | 2021-05-07 | 广州市百果园信息技术有限公司 | Method, device, equipment and medium for generating image classification and classification recognition model |
- 2018-11-30: CN application CN201811457125.1A, granted as CN109583501B (active)
- 2019-11-26: WO application PCT/CN2019/120903, published as WO2020108474A1 (application filing)
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170161592A1 (en) * | 2015-12-04 | 2017-06-08 | Pilot Ai Labs, Inc. | System and method for object detection dataset application for deep-learning algorithm training |
CN106096670A (en) * | 2016-06-17 | 2016-11-09 | 北京市商汤科技开发有限公司 | Concatenated convolutional neural metwork training and image detecting method, Apparatus and system |
CN108875456A (en) * | 2017-05-12 | 2018-11-23 | 北京旷视科技有限公司 | Object detection method, object detecting device and computer readable storage medium |
CN108509978A (en) * | 2018-02-28 | 2018-09-07 | 中南大学 | The multi-class targets detection method and model of multi-stage characteristics fusion based on CNN |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020108474A1 (en) * | 2018-11-30 | 2020-06-04 | 广州市百果园信息技术有限公司 | Picture classification method, classification identification model generation method and apparatus, device, and medium |
CN110222724A (en) * | 2019-05-15 | 2019-09-10 | 平安科技(深圳)有限公司 | A kind of picture example detection method, apparatus, computer equipment and storage medium |
CN110222724B (en) * | 2019-05-15 | 2023-12-19 | 平安科技(深圳)有限公司 | Picture instance detection method and device, computer equipment and storage medium |
CN110210356A (en) * | 2019-05-24 | 2019-09-06 | 厦门美柚信息科技有限公司 | A kind of picture discrimination method, apparatus and system |
CN110738267A (en) * | 2019-10-18 | 2020-01-31 | 北京达佳互联信息技术有限公司 | Image classification method and device, electronic equipment and storage medium |
CN110738267B (en) * | 2019-10-18 | 2023-08-22 | 北京达佳互联信息技术有限公司 | Image classification method, device, electronic equipment and storage medium |
CN110990627A (en) * | 2019-12-05 | 2020-04-10 | 北京奇艺世纪科技有限公司 | Knowledge graph construction method and device, electronic equipment and medium |
CN111738290B (en) * | 2020-05-14 | 2024-04-09 | 北京沃东天骏信息技术有限公司 | Image detection method, model construction and training method, device, equipment and medium |
CN111738290A (en) * | 2020-05-14 | 2020-10-02 | 北京沃东天骏信息技术有限公司 | Image detection method, model construction and training method, device, equipment and medium |
CN111782905A (en) * | 2020-06-29 | 2020-10-16 | 中国工商银行股份有限公司 | Data packaging method and device, terminal equipment and readable storage medium |
CN111782905B (en) * | 2020-06-29 | 2024-02-09 | 中国工商银行股份有限公司 | Data packet method and device, terminal equipment and readable storage medium |
CN112286440A (en) * | 2020-11-20 | 2021-01-29 | 北京小米移动软件有限公司 | Touch operation classification method and device, model training method and device, terminal and storage medium |
CN112465042A (en) * | 2020-12-02 | 2021-03-09 | 中国联合网络通信集团有限公司 | Generation method and device of classification network model |
CN112465042B (en) * | 2020-12-02 | 2023-10-24 | 中国联合网络通信集团有限公司 | Method and device for generating classified network model |
CN112445410A (en) * | 2020-12-07 | 2021-03-05 | 北京小米移动软件有限公司 | Touch event identification method and device and computer readable storage medium |
CN112686289A (en) * | 2020-12-24 | 2021-04-20 | 微梦创科网络科技(中国)有限公司 | Picture classification method and device |
CN112784985A (en) * | 2021-01-29 | 2021-05-11 | 北京百度网讯科技有限公司 | Training method and device of neural network model, and image recognition method and device |
CN113063843A (en) * | 2021-02-22 | 2021-07-02 | 广州杰赛科技股份有限公司 | Pipeline defect identification method and device and storage medium |
CN113705735A (en) * | 2021-10-27 | 2021-11-26 | 北京值得买科技股份有限公司 | Label classification method and system based on mass information |
Also Published As
Publication number | Publication date |
---|---|
CN109583501B (en) | 2021-05-07 |
WO2020108474A1 (en) | 2020-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109583501A (en) | Picture classification, the generation method of Classification and Identification model, device, equipment and medium | |
Zhang et al. | C2FDA: Coarse-to-fine domain adaptation for traffic object detection | |
CN111754596B (en) | Editing model generation method, device, equipment and medium for editing face image | |
US10600171B2 (en) | Image-blending via alignment or photometric adjustments computed by a neural network | |
US11640518B2 (en) | Method and apparatus for training a neural network using modality signals of different domains | |
CN109685819B (en) | Three-dimensional medical image segmentation method based on feature enhancement | |
CN109376603A (en) | A kind of video frequency identifying method, device, computer equipment and storage medium | |
CN109325547A (en) | Non-motor vehicle image multi-tag classification method, system, equipment and storage medium | |
CN105210085A (en) | Image labeling using geodesic features | |
CN110837811A (en) | Method, device and equipment for generating semantic segmentation network structure and storage medium | |
CN112580720B (en) | Model training method and device | |
CN112508120B (en) | Student model training method, device, equipment, medium and program product | |
CN106778852A (en) | A kind of picture material recognition methods for correcting erroneous judgement | |
CN108985133B (en) | Age prediction method and device for face image | |
CN113592007B (en) | Knowledge distillation-based bad picture identification system and method, computer and storage medium | |
CN112418302A (en) | Task prediction method and device | |
CN113642621A (en) | Zero sample image classification method based on generation countermeasure network | |
CN110223515A (en) | A kind of track of vehicle generation method | |
CN107909638A (en) | Rendering intent, medium, system and the electronic equipment of dummy object | |
CN111046917A (en) | Object-based enhanced target detection method based on deep neural network | |
CN117611932B (en) | Image classification method and system based on double pseudo tag refinement and sample re-weighting | |
CN107341440A (en) | Indoor RGB D scene image recognition methods based on multitask measurement Multiple Kernel Learning | |
CN111723239A (en) | Multi-mode-based video annotation method | |
CN117149944B (en) | Multi-mode situation emotion recognition method and system based on wide time range | |
CN116935170B (en) | Processing method and device of video processing model, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 20211125 Address after: 31a, 15 / F, building 30, maple mall, bangrang Road, Brazil, Singapore Patentee after: Baiguoyuan Technology (Singapore) Co.,Ltd. Address before: 511442 23-39 / F, building B-1, Wanda Plaza North, Wanbo business district, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province Patentee before: GUANGZHOU BAIGUOYUAN INFORMATION TECHNOLOGY Co.,Ltd. |