EP3726426A1 - Classification training method, server and storage medium


Info

Publication number
EP3726426A1
Authority
EP
European Patent Office
Prior art keywords
sketch
classification
model
real graph
feature
Prior art date
Legal status
Pending
Application number
EP18887500.9A
Other languages
German (de)
French (fr)
Other versions
EP3726426A4 (en)
Inventor
Fei Huang
Lin Ma
Wei Liu
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Publication of EP3726426A1 publication Critical patent/EP3726426A1/en
Publication of EP3726426A4 publication Critical patent/EP3726426A4/en

Classifications

    • G06N3/08 Learning methods
    • G06N3/045 Combinations of networks
    • G06N20/00 Machine learning
    • G06N20/10 Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/24 Classification techniques
    • G06F18/2411 Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06V10/809 Fusion of classification results, e.g. where the classifiers operate on the same input data
    • G06V10/82 Image or video recognition or understanding using neural networks
    • G06V30/19147 Obtaining sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06V30/19167 Active pattern learning
    • G06V30/19173 Classification techniques
    • G06V30/333 Preprocessing; Feature extraction
    • G06V30/36 Matching; Classification
    • G06V30/413 Classification of content, e.g. text, photographs or tables

Definitions

  • This application relates to the field of information processing technologies, and in particular, to a classification training method, a server, and a storage medium.
  • Sketch recognition may be applicable to a plurality of fields, such as early childhood education. Identifying the categories of hand sketches and retrieving these hand sketches based on their categories may be of vital importance for developing children's divergent thinking and graph understanding capability. Sketch recognition may also be used in other graph retrieval systems.
  • a user inputs a hand sketch through a terminal device and transmits the hand sketch to a backend server.
  • the backend server identifies a category of the received hand sketch according to a pre-trained classifier such as a support vector machine (SVM) or a pre-trained classification network.
  • the classifier or the classification network is mainly obtained by training according to a large amount of feature information of sketches whose categories have been marked in advance.
  • scarce training samples of sketches may often result in overfitting or underfitting of the trained classifier or classification network, affecting the accuracy of identifying the category of the hand sketch.
  • Embodiments of this application provide a classification training method, a server, and a storage medium.
  • An embodiment of this application provides a classification training method, including the steps described in detail below.
  • An embodiment of this application provides one or more storage media.
  • the storage media store computer readable instructions.
  • the computer readable instructions are loaded by a processor to perform the classification training method according to the embodiment of this application.
  • An embodiment of this application provides a server, including one or more processors and memories.
  • the memories store computer readable instructions.
  • the one or more processors are configured to execute the computer readable instructions.
  • the computer readable instructions are configured to be loaded by the one or more processors to perform the classification training method according to the embodiment of this application.
  • An embodiment of this application provides a classification training method.
  • training is performed mainly according to a sketch whose category has been marked and a real graph whose category has been marked, to obtain a sketch classification model and a real graph classification model.
  • a classification training apparatus performs training through the following method: determining the sketch classification model, the sketch classification model including a first feature extraction module and a first classification module, and determining a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to the real graph classification model; selecting a training set, the training set including sketches of a plurality of categories; determining a category of a sketch in the training set according to the sketch classification model to obtain a first category processing result, and analyzing, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a second sketch; calculating a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch; and adjusting a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  • the sketch classification model and the real graph classification model obtained by training may be applicable but are not limited to the following scenarios: a user may input a hand sketch through a terminal device and transmit the hand sketch to a backend server; the backend server determines, through a pre-trained sketch classification model, a category of the hand sketch received by the backend server; and the backend server may retrieve a real image corresponding to the hand sketch according to the pre-trained sketch classification model and the real graph classification model.
  • An embodiment of this application provides a classification training method.
  • the classification training method is executed by a classification training apparatus.
  • the flowchart is shown in FIG. 1
  • the schematic diagram is shown in FIG. 2 .
  • the method includes the following steps.
  • Step 101 Determine a sketch classification model, the sketch classification model including a first feature extraction module and a first classification module; and determine a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to the real graph classification model.
  • the sketch classification model is configured to identify a category of a sketch.
  • the sketch is an image drawn by a user through an electronic device, an image drawn by a user manually, or the like.
  • the sketch classification model usually includes a first feature extraction module extracting a feature of a sketch and a first classification module performing classification according to the extracted feature of the sketch, such as a convolutional-neural-network-based classification model or a classification model such as a support vector machine (SVM).
  • the determining, by a classification training apparatus, the sketch classification model specifically includes determining a structure of the sketch classification model and a corresponding fixed parameter value.
  • the structure of the sketch classification model specifically includes the structures of the first feature extraction module and the first classification module.
  • the corresponding fixed parameter value specifically includes parameter values of fixed parameters used by the first feature extraction module and the first classification module in a calculation process.
  • the real graph classification model is configured to identify a category of a real graph.
  • the real graph is an image of an entity obtained by a camera or a webcam.
  • the real graph classification model usually includes a second feature extraction module extracting a feature of the real graph and a second classification module performing classification according to the extracted feature of the real graph.
  • the second classification module is, for example, a convolutional-neural-network-based classification model or a classification model such as SVM.
  • the second feature analysis model is configured to analyze the feature (that is, an output result) of the real graph extracted by the second feature extraction module in a process by which the real graph classification model identifies a category of the real graph.
  • the feature of the real graph may be determined as belonging to the real graph.
  • the determining, by the classification training apparatus, the second feature analysis model specifically includes determining a structure of the second feature analysis model and a parameter value of the fixed parameter used by the second feature analysis model in a calculating process.
  • a structure of the sketch classification model may be the same as or different from that of the real graph classification model.
  • a parameter value of a fixed calculation parameter of each calculating sub-module that forms the sketch classification model is different from a parameter value (a fixed parameter value for short below) of a fixed calculation parameter of each calculation sub-module that forms the real graph classification model if the structure of the sketch classification model is the same as that of the real graph classification model.
  • the fixed calculation parameter herein refers to a parameter that is used in a calculation process and that does not need to be assigned a value each time, such as a weight or an angle.
  • both the sketch classification model and the real graph classification model are DenseNet models and include the same quantity of dense blocks and prediction modules.
  • In FIG. 2, description is provided by using an example in which there are three dense blocks.
  • the dense blocks are connected in series through transition layers.
  • the dense blocks and the transition layers belong to the feature extraction module.
  • Each dense block performs convolution calculation and outputs a certain quantity of feature maps.
  • the transition layers connected to the dense blocks may perform dimension reduction on a dimension of the feature maps outputted by the corresponding dense blocks.
  • the prediction module mainly maps the feature, extracted after running through the plurality of dense blocks and the plurality of transition layers, to a category probability distribution of a fixed dimension, and predicts the category to which an image belongs; the prediction module belongs to the classification module.
  • fixed calculation parameters of the dense blocks, such as quantities of the outputted feature maps and sizes of convolution kernels, may be different between the sketch classification model and the real graph classification model.
  • Fixed calculation parameters of the transition layers, such as dimension reduction multiples, may also be different.
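  • As an illustration only (the patent does not give exact layer configurations), a DenseNet-style classification model of the kind described above, with a feature extraction module made of dense blocks and transition layers followed by a prediction module, might be sketched in PyTorch as follows; the channel counts, growth rate, and number of categories are placeholder values.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense block: each layer's output feature maps are concatenated to its input."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1)))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # feature maps accumulate
        return x

class Transition(nn.Module):
    """Transition layer: reduces the channel dimension and spatial size of the feature maps."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.reduce = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=1),
            nn.AvgPool2d(2))

    def forward(self, x):
        return self.reduce(x)

class DenseNetClassifier(nn.Module):
    """Feature extraction module (dense blocks + transition layers) plus a prediction module."""
    def __init__(self, num_classes=10):
        super().__init__()
        stem = nn.Conv2d(3, 32, kernel_size=7, stride=2, padding=3)
        b1 = DenseBlock(32, 16, 4)
        t1 = Transition(b1.out_channels, 64)
        b2 = DenseBlock(64, 16, 4)
        t2 = Transition(b2.out_channels, 64)
        b3 = DenseBlock(64, 16, 4)
        self.features = nn.Sequential(stem, b1, t1, b2, t2, b3)  # three dense blocks, as in the FIG. 2 example
        self.predict = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(b3.out_channels, num_classes))  # maps the feature to a fixed-dimension category distribution

    def forward(self, x):
        feat = self.features(x)            # output of the feature extraction module
        return self.predict(feat), feat    # category logits and the extracted feature
```

  • The sketch classification model and the real graph classification model could share such a structure while keeping separate fixed parameter values, which matches the description above.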
  • in some systems, the classification training apparatus may obtain a well-trained sketch classification model and real graph classification model when performing this step and may then proceed to perform step 102 to step 105 of this embodiment.
  • the classification training apparatus may perform training according to a sketch whose category has been marked to obtain the sketch classification model and perform training according to a real graph whose category has been marked to obtain the real graph classification model when performing this step.
  • the classification training apparatus then proceeds to perform step 102 to step 105 of this embodiment.
  • the classification training apparatus may perform training through a loss function related to an initial model of each category to obtain the sketch classification model and the real graph classification model when performing this step. Specifically: an initial model of a sketch classification and an initial model of a real graph classification are first determined, and the sketch whose category has been marked and the real graph whose category has been marked are determined.
  • the determining an initial model of a sketch classification is specifically determining a structure of the initial model of the sketch classification and an initial value of the fixed parameter.
  • the determining an initial model of a real graph classification is specifically determining a structure of the initial model of the real graph classification and an initial value of the fixed parameter.
  • then, a category of the sketch whose category has been marked is determined according to the initial model of the sketch classification, and a category of the real graph whose category has been marked is determined according to the initial model of the real graph classification, to obtain an initial classification result. The initial classification result may include an initial category of the sketch whose category has been marked and an initial category of the real graph whose category has been marked.
  • a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification are calculated according to the initial classification result.
  • the initial model of the sketch classification is adjusted according to the function value of the third loss function and a fixed parameter value in the initial model of the real graph classification is adjusted according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
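  • A hedged sketch of this pre-training stage, assuming standard cross-entropy losses stand in for the third and fourth loss functions (their exact forms are not reproduced in this text) and assuming hypothetical data loaders that yield marked images with category labels:

```python
import torch.nn as nn
import torch.optim as optim

def pretrain_initial_model(model, loader, epochs=5, lr=1e-3):
    """Pre-train an initial classification model on images whose categories have been marked."""
    criterion = nn.CrossEntropyLoss()  # assumed stand-in for the third/fourth loss function
    optimizer = optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for images, labels in loader:  # loader yields marked sketches or marked real graphs
            logits, _ = model(images)  # model follows the DenseNetClassifier sketch above
            loss = criterion(logits, labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()           # adjust the fixed parameter values of the initial model
    return model

# Hypothetical usage:
# sketch_model = pretrain_initial_model(DenseNetClassifier(), marked_sketch_loader)
# real_model   = pretrain_initial_model(DenseNetClassifier(), marked_real_graph_loader)
```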
  • Step 102 Select a training set.
  • the training set may not only include sketches of a plurality of categories, but also include real graphs of the categories.
  • the training set may include sketches and real graphs of a plurality of categories.
  • Each category may correspond to a plurality of sketches and a plurality of real graphs.
  • Each image in the training set has the following marks: the mark of the sketch or the mark of the real graph, and the mark of a category corresponding to the image.
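  • Purely as an illustration of this labelling scheme (the field names are hypothetical), each item in the training set could carry both a domain mark and a category mark:

```python
from dataclasses import dataclass
from PIL import Image

@dataclass
class TrainingItem:
    path: str
    is_sketch: bool  # mark of the sketch vs. mark of the real graph
    category: int    # mark of the category corresponding to the image

    def load(self):
        return Image.open(self.path).convert("RGB")

# Hypothetical examples: a sketch and a real graph of the same category "plane".
# training_set = [TrainingItem("plane_sketch_001.png", True, 0),
#                 TrainingItem("plane_photo_001.jpg", False, 0)]
```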
  • the classification training apparatus may first perform preprocessing on each image in the training set after selecting the training set, and then perform step 103.
  • the preprocessing process may include: performing zooming or clipping on each image, so that the sizes of processed images are the same. In this way, calculation performed when the categories of the images are determined in step 103 can be simplified.
  • the preprocessing process may further include: enhancing a main image of each image, so that the main image in each image is clearer and not fuzzy. In this way, when step 103 is performed, the effect of the unclear main image on the process of determining a category of the image is eliminated.
  • the main image herein refers to the principal subject in an image, such as a figure or an entity included in the image, rather than the background image.
  • There may further be other preprocessing, and any preprocessing method that can eliminate the effect on the process of determining, by the classification training apparatus, the category of the image in step 103 falls within the scope of this embodiment of this application, and this is not exemplified herein one by one.
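  • A minimal preprocessing sketch along the lines described above, assuming PIL and torchvision; the target size and enhancement factors are arbitrary placeholders:

```python
from PIL import ImageEnhance
from torchvision import transforms

def preprocess(image, size=224):
    """Zoom/clip every image to the same size and enhance the main subject."""
    # Enhance the main image so that it is clearer and less fuzzy.
    image = ImageEnhance.Sharpness(image).enhance(2.0)
    image = ImageEnhance.Contrast(image).enhance(1.5)
    # Resize and crop so that all processed images have the same size.
    pipeline = transforms.Compose([
        transforms.Resize(size),
        transforms.CenterCrop(size),
        transforms.ToTensor()])
    return pipeline(image)

# preprocessed = [preprocess(item.load()) for item in training_set]  # hypothetical training_set
```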
  • Step 103 Determine the category of the sketch in the training set according to the sketch classification model determined in step 101 to obtain a first category processing result.
  • a feature of a sketch extracted by the first feature extraction module in the sketch classification model is analyzed in a process in which the sketch classification model determines the category of the sketch, to obtain a feature analysis result of a second sketch. Analyzing the feature of the sketch may include determining whether the feature of the sketch belongs to the sketch. In another embodiment, there may further be other analysis processing, and this is not exemplified herein one by one.
  • the first category processing result obtained herein may specifically include the categories that are of sketches in the training set and that are determined by the sketch classification model.
  • the training set selected in step 102 includes n sketches.
  • the sketches may be sketches of categories of "plane", "tree”, and the like.
  • the categories of the n sketches are respectively determined according to the sketch classification model to obtain categories C1, C2, ..., Cn of the n sketches.
  • the feature analysis result of the second sketch may specifically include a result of determining, by the second feature analysis model, whether the feature of the sketch extracted by the first feature extraction module belongs to the sketch.
  • Step 104 Calculate a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch that are obtained in step 103.
  • the first loss function includes a loss function related to the first classification module and a loss function by which the second feature analysis model analyzes the feature of the sketch.
  • the loss function related to the first classification module may be obtained according to the first category processing result, and may be specifically a cross-entropy loss function, configured to indicate a difference between the category determined by the first classification module and an actual category of the sketch, that is, a deviation.
  • the loss function related to the first classification module may also be a sorting loss function, configured to indicate a loss function in a process of sorting similar features. For example, in a process of determining a category of a sketch 1, the first classification module determines that a feature similarity between the sketch 1 and a sketch 2 is greater than a feature similarity between the sketch 1 and another sketch, and then determines that categories of the sketch 1 and the sketch 2 are the same.
  • the sorting loss function may indicate a difference between a sort determined in the process of sorting the feature similarity and a sort of an actual feature similarity.
  • the loss function by which the second feature analysis model analyzes the feature of the sketch is obtained according to the feature analysis result of the second sketch and is configured to indicate a difference between an analysis result obtained by analyzing, by the second feature analysis model, the feature of the sketch extracted by the first feature extraction module and an actual feature of the sketch.
  • Step 105 Adjust a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  • the first fixed parameter value is a parameter value of a fixed parameter that is used in the calculation process separately by the first feature extraction module and the first classification module that are included in the sketch classification model and that does not need to be assigned a value each time, such as a weight or an angle. If the calculated function value of the first loss function is relatively large, for example, greater than a preset value, it is necessary to change the first fixed parameter value, for example, increase a weight value of a certain weight or reduce an angle value of a certain angle, to reduce a function value of the first loss function calculated according to the first fixed parameter value after the adjustment.
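  • A hedged sketch of steps 103 to 105 under common adversarial-training assumptions (the exact first loss function is not reproduced here): the sketch classification model produces category logits and an extracted feature, the second feature analysis model scores whether that feature belongs to a real graph, and the combined loss drives a gradient update of the first fixed parameter value. Module and variable names are hypothetical.

```python
import torch
import torch.nn as nn

ce_loss = nn.CrossEntropyLoss()
bce_loss = nn.BCEWithLogitsLoss()

def train_sketch_model_step(sketch_model, second_feature_analyzer, optimizer, sketches, labels):
    """One update of the sketch classification model; optimizer holds only sketch_model's parameters."""
    logits, feat = sketch_model(sketches)      # step 103: first category processing result
    analysis = second_feature_analyzer(feat)   # feature analysis result of the second sketch
    # Step 104: classification loss plus the loss by which the second feature analysis model
    # analyzes the sketch feature; pushing the analysis toward "belongs to a real graph"
    # (label 1) encourages sketch features to resemble real-graph features.
    loss = ce_loss(logits, labels) + bce_loss(analysis, torch.ones_like(analysis))
    # Step 105: adjust the first fixed parameter values to reduce the loss.
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```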
  • There is no absolute sequence relationship between step 101 and step 102. Step 101 and step 102 may be performed at the same time or in sequence. FIG. 1 shows only one specific implementation.
  • the classification training apparatus may further adjust a fixed parameter value in the second feature analysis model.
  • the classification training apparatus determines the real graph classification model, specifically including determining a structure of the real graph classification model and a corresponding fixed parameter value.
  • the real graph classification model includes the second feature extraction module and the second classification module.
  • the classification training apparatus analyzes, according to the second feature analysis model, the feature of the real graph extracted by the second feature extraction module in a process of determining, by the real graph classification model, the category of the real graph in the training set, to obtain a feature analysis result of a first real graph.
  • the classification training apparatus calculates a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch, and adjusts a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function, to reduce the function value of the second adversarial loss function calculated according to a fixed parameter value of the second feature analysis model after the adjustment.
  • the feature analysis result of the first real graph may specifically include the result of determining, by the second feature analysis model, whether the feature of the real graph extracted by the second feature extraction module belongs to the real graph.
  • the second adversarial loss function may include the loss function by which the second feature analysis model analyzes the feature of the real graph and a loss function by which the second feature analysis model analyzes the feature of the sketch.
  • the loss function by which the second feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the first real graph.
  • the loss function by which the second feature analysis model analyzes the feature of the sketch may be obtained according to the feature analysis result of the second sketch.
  • the loss function by which the second feature analysis model analyzes the feature of the real graph is configured to indicate a difference between an analysis result obtained by analyzing, by the second feature analysis model, the feature of the real graph extracted by the second feature extraction module and an actual feature of the real graph.
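  • Assuming the second feature analysis model is a binary discriminator trained with binary cross-entropy (the second adversarial loss function itself is not reproduced here), its update could be sketched as follows; the features are detached so that only the analysis model's own fixed parameter values change.

```python
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()

def update_second_feature_analyzer(analyzer, analyzer_optimizer, real_feat, sketch_feat):
    """Adjust the second feature analysis model to separate real-graph features from sketch features."""
    real_score   = analyzer(real_feat.detach())    # feature analysis result of the first real graph
    sketch_score = analyzer(sketch_feat.detach())  # feature analysis result of the second sketch
    loss = bce(real_score, torch.ones_like(real_score)) + \
           bce(sketch_score, torch.zeros_like(sketch_score))
    analyzer_optimizer.zero_grad()
    loss.backward()
    analyzer_optimizer.step()
    return loss.item()
```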
  • step 103 to step 105 are processes of adjusting, by the classification training apparatus, the first fixed parameter value after the sketch classification model determined in step 101 separately processes each sketch in the training set.
  • It is necessary to perform step 103 to step 105 continuously and circularly, until the adjustment on the first fixed parameter value meets a certain stopping condition.
  • the classification training apparatus further needs to determine whether the current adjustment on the first fixed parameter value meets a preset stopping condition after performing step 101 to 105 of the foregoing embodiment. If yes, the procedure ends; and otherwise, for the sketch classification model whose first fixed parameter value is adjusted, return to operations of step 103 to step 105. To be specific, the operations of obtaining the first category processing result and the feature analysis result of the second sketch, calculating the function value of the first loss function of the sketch classification model, and adjusting the first fixed parameter value are performed.
  • the preset stopping condition includes but is not limited to any one of the following conditions: a first difference between a first fixed parameter value currently adjusted and a first fixed parameter value adjusted last time is less than a first threshold, that is, the adjusted first fixed parameter value achieves convergence; the number of adjustment times of the first fixed parameter value reaches the preset number of times, and the like. It may be understood that the stopping condition herein is a condition of stopping adjusting the first fixed parameter value.
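  • The stopping check could be sketched as follows, assuming the fixed parameter values are held in tensors and that either convergence of the adjustment or a preset number of adjustment times ends training; the threshold and limit are placeholders.

```python
def should_stop(previous_params, current_params, adjustment_count,
                first_threshold=1e-4, max_adjustments=10000):
    """Return True when the parameter adjustment has converged or the preset number of times is reached."""
    if adjustment_count >= max_adjustments:
        return True
    max_change = max((cur - prev).abs().max().item()
                     for cur, prev in zip(current_params, previous_params))
    return max_change < first_threshold
```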
  • The classification training apparatus may further adjust a second fixed parameter value of the real graph classification model through the following steps.
  • the flowchart is shown in FIG. 4 and the schematic diagram is shown in FIG. 2 . The following steps are included.
  • Step 201 Determine the first feature analysis model analyzing an output result of the first feature extraction module.
  • the first feature analysis model is configured to analyze the feature (that is, an output result) of the sketch extracted by the first feature extraction module in the process of identifying, by the sketch classification model, the category of the sketch. Specifically, the feature of the sketch may be determined as belonging to the sketch.
  • Step 202 Determine the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result; and analyze, according to the first feature analysis model, a feature of the real graph extracted by the second feature extraction module to obtain a feature analysis result of a second real graph.
  • the second category processing result obtained herein may specifically include categories that are of real graphs in the training set and that are determined by the real graph classification model.
  • the feature analysis result of the second real graph may specifically include a result of determining, by the first feature analysis model, whether the feature of the real graph extracted by the second feature extraction module belongs to the real graph.
  • Step 203 Calculate a function value of a second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph.
  • the second loss function includes a loss function related to the second classification module and a loss function by which the first feature analysis model analyzes the feature of the real graph.
  • the loss function related to the second classification module may be obtained according to the second category processing result.
  • the loss function related to the second classification module may be specifically a cross-entropy loss function, configured to indicate a difference between the category determined by the second classification module and an actual category of the real graph. If the second classification module uses a method of sorting similar features in the process of determining the category of the real graph, the loss function related to the second classification module may also be a sorting loss function, configured to indicate the loss function in the process of sorting the similar features.
  • the loss function by which the first feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the second real graph and is configured to indicate a difference between an analysis result obtained by analyzing, by the first feature analysis model, the feature of the real graph extracted by the second feature extraction module and an actual feature of the real graph.
  • Step 204 Adjust a second fixed parameter value in the real graph classification model according to a function value of a second loss function.
  • the second fixed parameter value is a parameter value of a fixed parameter that is used in the calculation process separately by the second feature extraction module and the second classification module that are included in the real graph classification model and that does not need to be assigned a value each time, such as a weight or an angle. If the calculated function value of the second loss function is relatively large, for example, greater than a preset value, it is necessary to change the second fixed parameter value, for example, increase a weight value of a certain weight or reduce an angle value of a certain angle, to reduce the function value of the second loss function calculated according to the second fixed parameter value after the adjustment.
  • the classification training apparatus may further adjust a fixed parameter value in the first feature analysis model. Specifically, the classification training apparatus analyzes, according to the first feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of the first sketch, then calculates a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph, and adjusts the fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  • the feature analysis result of the first sketch may specifically include a result of determining, by the first feature analysis model, whether the feature of the sketch extracted by the first feature extraction module belongs to the sketch.
  • the first adversarial loss function may include a loss function by which the first feature analysis model analyzes the feature of the real graph and a loss function by which the first feature analysis model analyzes the feature of the sketch.
  • the loss function by which the first feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the second real graph.
  • the loss function by which the first feature analysis model analyzes the feature of the sketch may be obtained according to the feature analysis result of the first sketch.
  • the loss function by which the first feature analysis model analyzes the feature of the sketch is configured to indicate a difference between an analysis result obtained by analyzing, by the first feature analysis model, the feature of the sketch extracted by the first feature extraction module and an actual feature of the sketch.
  • step 202 to step 204 are processes of adjusting, by the classification training apparatus, the second fixed parameter value after the real graphs in the training set are processed.
  • the classification training apparatus needs to determine whether the current adjustment on the second fixed parameter value meets a preset stopping condition after performing step 201 to step 204. If yes, the procedure ends, and otherwise, for a real graph classification model whose second fixed parameter value is adjusted, return to step 202 to step 204. To be specific, operations of obtaining the second category processing result and the feature analysis result of the second real graph, calculating the function value of the second loss function of the real graph classification model, and adjusting the second fixed parameter value are performed.
  • the preset stopping condition includes but is not limited to any one of the following conditions: a first difference between a second fixed parameter value currently adjusted and a second fixed parameter value adjusted last time is less than a second threshold, that is, the adjusted second fixed parameter value achieves convergence; the number of adjustment times of the second fixed parameter value reaches the preset number of times; and the like.
  • Step 202 to step 204 may be performed alternately with step 103 to step 105.
  • the first fixed parameter value of the sketch classification model and the fixed parameter value of the second feature analysis model may be adjusted. That is, step 103 to step 105 are performed.
  • the second fixed parameter value of the real graph classification model and the fixed parameter value of the first feature analysis model are adjusted. That is, step 202 to step 204 are performed.
  • the first fixed parameter value of the sketch classification model and the fixed parameter value of the second feature analysis model are adjusted again. That is, step 103 to step 105 are performed, and so on.
  • the classification training apparatus obtains the sketch classification model and the real graph classification model after the adjustment through the methods in the foregoing embodiments.
  • the classification training apparatus may first obtain a to-be-classified sketch (such as a to-be-classified sketch input by a user through a terminal device), and then classify the to-be-classified sketch according to the sketch classification model after the adjustment, to obtain a category of the to-be-classified sketch, thereby implementing classification of sketches.
  • the classification training apparatus may first obtain the to-be-classified sketch (such as the to-be-classified sketch input by the user through the terminal device) and obtain each real graph stored in the classification training apparatus; then classify the to-be-classified sketch according to the sketch classification model after the adjustment to obtain a category of the to-be-classified sketch, and separately classify each stored real graph according to the real graph classification model after the adjustment to obtain categories of the real graphs; and finally select a real graph having a same category as that of the to-be-classified sketch, to provide the real graph for the terminal device of the user. In this way, retrieval of the sketch is implemented.
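  • Retrieval along these lines could be sketched as follows; the model and data names are hypothetical and the stored real graphs are assumed to be preprocessed tensors.

```python
import torch

@torch.no_grad()
def retrieve_real_graphs(sketch, sketch_model, real_graph_model, stored_real_graphs):
    """Return stored real graphs whose predicted category matches the to-be-classified sketch."""
    sketch_logits, _ = sketch_model(sketch.unsqueeze(0))
    sketch_category = sketch_logits.argmax(dim=1).item()
    matches = []
    for real_graph in stored_real_graphs:
        logits, _ = real_graph_model(real_graph.unsqueeze(0))
        if logits.argmax(dim=1).item() == sketch_category:
            matches.append(real_graph)
    return matches
```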
  • the classification training apparatus first selects the training set, determines the category of the sketch in the training set according to the sketch classification model to obtain the first category processing result, and analyzes, according to the second feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain the analysis result of the second sketch; then obtains the function value of the first loss function according to the first category processing result and the analysis result of the second sketch; and finally adjusts the first fixed parameter value of the sketch classification model according to the function value of the first loss function.
  • the classification training apparatus further determines the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result, and analyzes, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module to obtain an analysis result of the second real graph; then obtains the function value of the second loss function according to the second category processing result and the analysis result of the second real graph; and finally adjusts the second fixed parameter value of the real graph classification model according to the function value of the second loss function.
  • the sketch classification model and the real graph classification model may use a convolutional-neural-network-based classification model of a same structure and are respectively denoted as CNN_1 and CNN_2.
  • the first feature extraction module and the first classification module that are included in the sketch classification model are respectively denoted as CNN11 and CNN12.
  • the first feature analysis model is specifically a sketch identifier D_1.
  • the second feature extraction module and the second classification module that are included in the real graph classification model are respectively denoted as CNN21 and CNN22.
  • the second feature analysis model is specifically a real graph identifier D_2.
  • the classification training method of this embodiment may be implemented through the following steps.
  • the flowchart is shown in FIG. 6 , including:
  • the sketch identifier D_1 identifies the feature of the sketch extracted by the sketch feature extraction module CNN11 in the sketch classification model CNN_1 to obtain a sketch feature identification result 11.
  • the sketch identifier D_1 identifies the feature of the real graph extracted by the real graph feature extraction module CNN21 in the real graph classification model CNN_2 to obtain a real graph feature identification result 12.
  • the real graph identifier D_2 identifies the feature of the real graph extracted by the real graph feature extraction module CNN21 in the real graph classification model CNN_2 to obtain a real graph feature identification result 21.
  • the real graph identifier D_2 identifies the feature of the sketch extracted by the sketch feature extraction module CNN11 in the sketch classification model CNN_1 to obtain a sketch feature identification result 22.
  • Step 304 First fix the sketch classification model CNN_1 and the real graph identifier D_2, adjust a fixed parameter value of the real graph classification model CNN_2 and a fixed parameter value of the sketch identifier D_1, to ensure that the adjustment on the real graph classification model CNN_2 refers to useful information of the sketch classification model CNN_1 in the classification process.
  • the classification training apparatus may calculate a function value of an adversarial function GL_{S_i} of the sketch identifier D_1 according to the following formula 1 and the sketch feature identification result 11 and the real graph feature identification result 12 that are obtained in step 303, and adjust the fixed parameter value of the sketch identifier D_1 according to the function value.
  • a function value of a loss function of the real graph classification model CNN_2 is calculated according to the following formula 2 and the category of the real graph and the real graph feature identification result 12 that are determined by the real graph classification model CNN_2 in step 303, and the fixed parameter value of the real graph classification model CNN_2 is adjusted according to the function value.
  • the loss function may specifically include a loss function related to the real graph classification module CNN22 in the real graph classification model CNN_2, such as a cross-entropy loss function CL_{I_i}, and a loss function by which the sketch identifier D_1 identifies the feature of the real graph.
  • the loss function by which the sketch identifier D_1 identifies the feature of the real graph may be obtained according to the real graph feature identification result 12.
  • the related loss function CL_{I_i} may be obtained according to the following formula 3 and the category of the real graph determined by the real graph classification model CNN_2 in step 303.
  • log(D_1(CNN_2(I_i))) may indicate a loss function by which the sketch identifier D_1 identifies the feature of the real graph.
  • Step 305 Fix the real graph classification model CNN_2 and the sketch identifier D_1 again, adjust a fixed parameter value of the sketch classification model CNN_1 and a fixed parameter value of the real graph identifier D_2, to ensure that the adjustment on the sketch classification model CNN_1 refers to useful information of the real graph classification model CNN_2 in the classification process.
  • the classification training apparatus may calculate a function value of an adversarial function GL_{I_i} of the real graph identifier D_2 according to the following formula 4 and the real graph feature identification result 21 and the sketch feature identification result 22 that are obtained in step 303, and adjust the fixed parameter value of the real graph identifier D_2 according to the function value.
  • a function value of a loss function of the sketch classification model CNN_1 is calculated according to the following formula 5 and the category of the sketch and the sketch feature identification result 22 that are determined by the sketch classification model CNN_1 in step 303, and the fixed parameter value of the sketch classification model CNN_1 is adjusted according to the function value.
  • the loss function may specifically include a loss function related to the sketch classification module CNN12 in the sketch classification model CNN_1, such as a cross-entropy loss function CL_{S_i}, and a loss function by which the real graph identifier D_2 identifies the feature of the sketch.
  • the loss function by which the real graph identifier D_2 identifies the feature of the sketch may be obtained according to the sketch feature identification result 22.
  • Step 306. Determine whether the adjustment on the fixed parameter value of the sketch classification model CNN_1 and the real graph classification model CNN_2 meets a preset condition after performing step 301 to step 305. If yes, the procedure ends, and otherwise, for the sketch classification model CNN_1 and the real graph classification model CNN_2 after the adjustment, and the sketch identifier D_1 and the real graph identifier D_2 after the adjustment, return to perform step 303 to step 305.
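  • Since formulas 1 to 5 are not reproduced in this text, the following is only a rough sketch of step 304 under standard GAN-style assumptions (binary cross-entropy adversarial terms plus a cross-entropy classification term); step 305 would be the mirror image with CNN_1/D_2 and CNN_2/D_1 exchanging roles. Optimizer and variable names are hypothetical.

```python
import torch
import torch.nn as nn

ce  = nn.CrossEntropyLoss()
bce = nn.BCEWithLogitsLoss()

def step_304(cnn_1, cnn_2, d_1, opt_cnn_2, opt_d_1, sketches, real_graphs, real_labels):
    """Fix CNN_1 (and D_2); adjust the real graph classifier CNN_2 and the sketch identifier D_1."""
    with torch.no_grad():
        _, sketch_feat = cnn_1(sketches)           # CNN_1 stays fixed in this half-step
    real_logits, real_feat = cnn_2(real_graphs)

    # Adversarial update of D_1: sketch features -> "sketch" (1), real graph features -> "not sketch" (0).
    d1_sketch = d_1(sketch_feat)                   # sketch feature identification result 11
    d1_real   = d_1(real_feat.detach())            # real graph feature identification result 12
    loss_d1 = bce(d1_sketch, torch.ones_like(d1_sketch)) + \
              bce(d1_real, torch.zeros_like(d1_real))
    opt_d_1.zero_grad()
    loss_d1.backward()
    opt_d_1.step()

    # Update of CNN_2: classification loss plus a term that asks D_1 to mistake real graph
    # features for sketch features, so CNN_2 refers to useful information learned by CNN_1.
    d1_real_fresh = d_1(real_feat)
    loss_cnn_2 = ce(real_logits, real_labels) + \
                 bce(d1_real_fresh, torch.ones_like(d1_real_fresh))
    opt_cnn_2.zero_grad()
    loss_cnn_2.backward()
    opt_cnn_2.step()                               # only CNN_2's parameters are stepped here
    return loss_d1.item(), loss_cnn_2.item()
```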
  • An embodiment of this application further provides a classification training apparatus.
  • a schematic structural diagram of the classification training apparatus is shown in FIG. 7 and the classification training apparatus may specifically include: a model determining unit 10, configured to determine a sketch classification model, the sketch classification model including a first feature extraction module and a first classification module; and determine a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to the real graph classification model.
  • the model determining unit 10 is specifically configured to: determine an initial model of a sketch classification and an initial model of a real graph classification that have a same structure and determine a sketch whose category has been marked and a real graph whose category has been marked; determine a category of the sketch whose category has been marked according to the initial model of the sketch classification and determine a category of the real graph whose category has been marked according to the initial model of the real graph classification, to obtain an initial classification result; calculate a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification according to the initial classification result; and adjust a fixed parameter value in the initial model of the sketch classification according to the function value of the third loss function and adjust a fixed parameter value in the initial model of the real graph classification according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
  • a training set unit 11 is configured to select a training set.
  • the training set includes sketches of a plurality of categories.
  • a processing unit 12 is configured to: determine, according to the sketch classification model determined by the model determining unit 10, the category of the sketch in the training set selected by the training set unit 11, to obtain a first category processing result; and analyze, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module, to obtain a feature analysis result of a second sketch.
  • a function value calculation unit 13 is configured to calculate a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch that are obtained by the processing unit 12.
  • the first loss function includes a loss function related to the first classification module and a loss function by which the second feature analysis model analyzes the feature of the sketch.
  • An adjustment unit 14 is configured to adjust a first fixed parameter value in the sketch classification model according to the function value of the first loss function calculated by the function value calculation unit 13.
  • the training set selected by the training set unit 11 may further include a real graph of a corresponding category.
  • the model determining unit 10 may further determine the real graph classification model.
  • the real graph classification model includes a second feature extraction module and a second classification module.
  • the processing unit 12 is further configured to analyze, according to the second feature analysis model, the feature of the real graph extracted by the second feature extraction module when the real graph classification model determines the category of the real graph, to obtain a feature analysis result of a first real graph.
  • the function value calculation unit 13 is further configured to calculate a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch.
  • the adjustment unit 14 is further configured to adjust a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function.
  • the model determining unit 10 is further configured to determine the first feature analysis model analyzing an output result of the first feature extraction module.
  • the processing unit 12 is further configured to: determine the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result, and analyze, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module, to obtain the feature analysis result of the second real graph.
  • the function value calculation unit 13 is further configured to calculate the function value of the second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph.
  • the second loss function includes a loss function related to the second classification module and a loss function by which the first feature analysis model analyzes the feature of the real graph.
  • the adjustment unit 14 is further configured to adjust the second fixed parameter value in the real graph classification model according to the function value of the second loss function.
  • the processing unit 12 is further configured to analyze, according to the first feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a first sketch.
  • the function value calculation unit 13 is further configured to calculate a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph.
  • the adjustment unit 14 is further configured to adjust a fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  • the training set unit 11 selects the training set.
  • the processing unit 12 determines the category of the sketch in the training set according to the sketch classification model to obtain the first category processing result, and analyzes, according to the second feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain an analysis result of the second sketch.
  • the function value calculation unit 13 obtains the function value of the first loss function according to the first category processing result and the analysis result of the second sketch.
  • the adjustment unit 14 adjusts the first fixed parameter value of the sketch classification model according to the function value of the first loss function.
  • the processing unit 12 may further determine the category of the real graph in the training set according to the real graph classification model to obtain the second category processing result, and analyze, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module to obtain the analysis result of the second real graph. Then the function value calculation unit 13 obtains the function value of the second loss function according to the second category processing result and the analysis result of the second real graph. Finally the adjustment unit 14 adjusts the second fixed parameter value of the real graph classification model according to the function value of the second loss function.
  • the classification training apparatus may further include a determining unit 15 and a classification unit 16 in addition to the structure shown in FIG. 7 .
  • the determining unit 15 is configured to determine whether the adjustment of the adjustment unit 14 on the first fixed parameter value meets a preset stopping condition. If no, the processing unit 12 is notified to obtain the first category processing result and the feature analysis result of the second sketch for the sketch classification model whose first fixed parameter value is adjusted.
  • the preset stopping condition may include but is not limited to any one of the following conditions:
  • a first difference between a first fixed parameter value currently adjusted and a first fixed parameter value adjusted last time is less than a first threshold, the number of adjustment times of the first fixed parameter value reaches the preset number of times, and the like.
  • the determining unit 15 is configured to determine whether the adjustment of the adjustment unit 14 on the second fixed parameter value meets the preset stopping condition. If no, the processing unit 12 is notified to obtain the second category processing result and the feature analysis result of the second real graph for the real graph classification model whose second fixed parameter value is adjusted.
  • the preset stopping condition herein may include but is not limited to any one of the following conditions: the first difference between a second fixed parameter value currently adjusted and a second fixed parameter value adjusted last time is less than the first threshold, the number of adjustment times of the second fixed parameter value reaches the preset number of times, and the like.
  • the classification unit 16 is configured to: obtain a to-be-classified sketch and classify the to-be-classified sketch according to the sketch classification model adjusted by the adjustment unit 14 to obtain a category of the to-be-classified sketch.
  • the classification unit 16 may further obtain the to-be-classified sketch and obtain each real graph stored in the classification training apparatus; then classify the to-be-classified sketch according to the sketch classification model adjusted by the adjustment unit 14 to obtain the category of the to-be-classified sketch, and classify each stored real graph separately according to the real graph classification model after the adjustment to obtain the categories of the real graphs; and finally select a real graph whose category is the same as that of the to-be-classified sketch.
  • An embodiment of this application further provides a server.
  • the schematic structural diagram is shown in FIG. 9 .
  • the server may vary greatly due to differences in configuration or performance.
  • the server may include one or more central processing units (CPU) 20 (such as one or more processors), one or more memories 21, one or more storage media 22 (such as one or more mass storage devices) storing application programs 221 or data 222.
  • the memories 21 and the storage media 22 may temporarily or persistently store the application program 221 or the data 222.
  • the program stored in the storage media 22 may include one or more modules (not shown in the figure), and each module may include a series of computer readable instructions for the server.
  • the central processing units 20 may be configured to communicate with the storage media 22 and to execute, on the server, the series of computer readable instructions in the storage media 22.
  • the one or more processors included in the central processing units 20 may be enabled to perform a classification training method.
  • the storage media 22 may be non-volatile storage media.
  • the non-volatile storage media may be non-volatile readable storage media.
  • the application program 221 stored in the storage media 22 includes a classification training application program.
  • the program may include the model determining unit 10, the training set unit 11, the processing unit 12, the function value calculation unit 13, the adjustment unit 14, the determining unit 15, and the classification unit 16 in the classification training apparatus, which are not described herein again.
  • the central processing units 20 may be configured to communicate with the storage media 22 and to perform, on the server, a series of operations corresponding to the classification training application program stored in the storage media 22.
  • the server may further include one or more power supplies 23, one or more wired or wireless network interfaces 24, one or more input/output interfaces 25, and/or one or more operating systems 223, such as Windows ServerTM, Mac OS XTM, UnixTM, LinuxTM, and FreeBSDTM.
  • the steps performed by the classification training apparatus may be based on the server structure shown in FIG. 9.
  • Each unit or module included in the classification training apparatus may be wholly or partially implemented through software, hardware, or a combination thereof.
  • An embodiment of this application further provides a storage medium.
  • the storage medium stores computer readable instructions.
  • the computer readable instructions are loaded by a processor to perform the classification training method according to the embodiments of this application.
  • An embodiment of this application further provides a server, including one or more processors and memories.
  • the memories store computer readable instructions.
  • the one or more processors are configured to implement each computer readable instruction.
  • the computer readable instructions are configured to be loaded by the one or more processors to perform the classification training method according to the embodiments of this application.
  • the computer readable instruction may be stored in a computer readable storage medium.
  • the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.


Abstract

A classification training method is provided, including: selecting a training set, determining a category of a sketch in the training set according to a sketch classification model, to obtain a first category processing result, and analyzing, according to a second feature analysis model, a feature of the sketch extracted by a first feature extraction module, to obtain an analysis result of a second sketch; then obtaining a function value of a first loss function according to the first category processing result and the analysis result of the second sketch; and finally adjusting a first fixed parameter value of the sketch classification model according to the function value of the first loss function.

Description

    RELATED APPLICATION
  • This application claims priority to Chinese Patent Application No. 2017113226122 , entitled "CLASSIFICATION TRAINING METHOD, SERVER, AND STORAGE MEDIUM" filed with China National Intellectual Property Administration on December 12, 2017, which is incorporated by reference in its entirety.
  • FIELD OF THE TECHNOLOGY
  • This application relates to the field of information processing technologies, and in particular, to a classification training method, a server, and a storage medium.
  • BACKGROUND OF THE DISCLOSURE
  • Sketch recognition may be applicable to a plurality of fields, such as early childhood education. Identifying the categories of hand sketches and retrieving the sketches based on that categorization may be of vital importance for developing children's divergent thinking and graph understanding capability. Sketch recognition may also be used in other graph retrieval systems.
  • Specifically, a user inputs a hand sketch through a terminal device and transmits the hand sketch to a backend server. The backend server identifies a category of the received hand sketch according to a pre-trained classifier such as a support vector machine (SVM) or a pre-trained classification network. To ensure the accuracy for identifying the hand sketch, it is necessary to ensure the accuracy of the pre-trained classifier or classification network. Therefore, a process of training the classifier or the classification network is important.
  • In traditional methods, the classifier or the classification network is mainly obtained by training on a large amount of feature information of sketches whose categories have been marked in advance. However, scarce sketch training samples often result in overfitting or underfitting of the trained classifier or classification network, affecting the accuracy of identifying the category of the hand sketch.
  • SUMMARY
  • Embodiments of this application provide a classification training method, a server, and a storage medium.
  • An embodiment of this application provides a classification training method, including:
    • determining, by a server, a sketch classification model, the sketch classification model including a first feature extraction module and a first classification module; and determining a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to a real graph classification model;
    • selecting, by the server, a training set, the training set including sketches of a plurality of categories;
    • determining, by the server, a category of a sketch in the training set according to the sketch classification model to obtain a first category processing result; and analyzing, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a second sketch;
    • calculating, by the server, a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch; and
    • adjusting, by the server, a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  • An embodiment of this application provides one or more storage media. The storage media store computer readable instructions. The computer readable instructions are loaded by a processor to perform the classification training method according to the embodiment of this application.
  • An embodiment of this application provides a server, including one or more processors and memories. The memories store computer readable instructions. The one or more processors are configured to implement each computer readable instruction. The computer readable instructions are configured to be loaded by the one or more processors to perform the classification training method according to the embodiment of this application.
  • The details of one or more embodiments of this application are disclosed in the following accompanying drawings and descriptions. Other features, objectives, and advantages of this application become clearer based on the specification, the accompanying drawings, and the claims of this application.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • To describe the technical solutions of the embodiments of this application or the existing technology more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the existing technology. Apparently, the accompanying drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
    • FIG. 1 is a flowchart of a classification training method according to an embodiment of this application.
    • FIG. 2 is a schematic diagram of calculating a function value of each loss function according to an embodiment of this application.
    • FIG. 3 is a schematic diagram of a sketch classification model and a real graph classification model according to an embodiment of this application.
    • FIG. 4 is a flowchart of another classification training method according to an embodiment of this application.
    • FIG. 5 is a schematic diagram of performing classification by a sketch classification model and a real graph classification model according to an application embodiment of this application.
    • FIG. 6 is a flowchart of a classification training method according to an application embodiment of this application.
    • FIG. 7 is a block diagram of a classification training apparatus according to an embodiment of this application.
    • FIG. 8 is a block diagram of another classification training apparatus according to an embodiment of this application.
    • FIG. 9 is a block diagram of a server according to an embodiment of this application.
    DESCRIPTION OF EMBODIMENTS
  • The following clearly and completely describes the technical solutions in the embodiments of this application with reference to the accompanying drawings in the embodiments of this application. Apparently, the described embodiments are some of the embodiments of this application rather than all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative efforts shall fall within the protection scope of this application.
  • The terms "first", "second", "third", "fourth" and the like (if exist) used in the specification, claims, and accompanying drawings of this application are configured to distinguish similar objects but are not necessarily configured to describe a specific sequence or a precedence. It is to be understood that the data used in such a manner may be exchanged in a proper situation. Therefore, the embodiments of this application described herein may be, for example, implemented in other sequences than the sequences illustrated or described herein. In addition, the terms "include", "have" and any variants thereof are intended to cover a non-exclusive inclusion. For example, in a context of a process, method, system, product, or device that includes a series of steps or units are not necessarily limited to the steps or units that are expressly specified, but may include other steps or units not specified expressly or may include the inherent steps or units of the process, method, product, or device.
  • An embodiment of this application provides a classification training method. In the method, training is performed mainly according to a sketch whose category has been marked and a real graph whose category has been marked, to obtain a sketch classification model and a real graph classification model. Specifically, in this embodiment, a classification training apparatus performs training through the following method:
    determining the sketch classification model, the sketch classification model including a first feature extraction module and a first classification module, and determining a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to the real graph classification model; selecting a training set, the training set including sketches of a plurality of categories; determining a category of a sketch in the training set according to the sketch classification model to obtain a first category processing result, and analyzing, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a second sketch; calculating a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch; and adjusting a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  • Further, the sketch classification model and the real graph classification model obtained by training may be applicable to, but are not limited to, the following scenarios: a user may input a hand sketch through a terminal device and transmit the hand sketch to a backend server; the backend server determines, through the pre-trained sketch classification model, a category of the received hand sketch; and the backend server may retrieve a real image corresponding to the hand sketch according to the pre-trained sketch classification model and the real graph classification model.
  • In this way, when a fixed parameter value of one classification model (such as the sketch classification model) is adjusted, reference is made not only to the deviation of that classification model in classifying its corresponding images, but also to useful information from the other classification model (such as the real graph classification model) in the classification process, namely, to a feature analysis model (such as the second feature analysis model) that analyzes the feature extracted by the feature extraction module of the other classification model. This ensures more accurate classification by the adjusted sketch classification model and real graph classification model.
  • An embodiment of this application provides a classification training method. The classification training method is executed by a classification training apparatus. The flowchart is shown in FIG. 1, and the schematic diagram is shown in FIG. 2. The method includes the following steps.
  • Step 101. Determine a sketch classification model, the sketch classification model including a first feature extraction module and a first classification module; and determine a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to a real graph classification model.
    The sketch classification model is configured to identify a category of a sketch. The sketch is an image drawn by a user through an electronic device, an image drawn by a user manually, or the like. Specifically, the sketch classification model usually includes a first feature extraction module extracting a feature of a sketch and a first classification module performing classification according to the extracted feature of the sketch, such as a convolutional-neural-network-based classification model or a classification model such as a support vector machine (SVM). In this embodiment, the determining, by a classification training apparatus, the sketch classification model specifically includes determining a structure of the sketch classification model and corresponding fixed parameter values. The structure of the sketch classification model specifically includes the structures of the first feature extraction module and the first classification module. The corresponding fixed parameter values specifically include the parameter values of the fixed parameters used by the first feature extraction module and the first classification module in a calculation process.
  • The real graph classification model is configured to identify a category of a real graph. The real graph is an image of an entity obtained by a camera or a webcam. Specifically, the real graph classification model usually includes a second feature extraction module extracting a feature of the real graph and a second classification module performing classification according to the extracted feature of the real graph. The second classification module is, for example, a convolutional-neural-network-based classification model or a classification model such as an SVM. The second feature analysis model is configured to analyze the feature (that is, an output result) of the real graph extracted by the second feature extraction module in the process by which the real graph classification model identifies the category of the real graph. Specifically, the second feature analysis model may determine whether the extracted feature belongs to a real graph. The determining, by the classification training apparatus, the second feature analysis model specifically includes determining a structure of the second feature analysis model and the parameter values of the fixed parameters used by the second feature analysis model in a calculation process.
  • In this embodiment, a structure of the sketch classification model may be the same as or different from that of the real graph classification model. Even if the two structures are the same, the parameter values of the fixed calculation parameters (fixed parameter values for short below) of the calculation sub-modules that form the sketch classification model are different from those of the calculation sub-modules that form the real graph classification model. The fixed calculation parameter herein refers to a parameter that is used in a calculation process and that does not need to be assigned a value each time, such as a weight or an angle.
  • For example, as shown in FIG. 3, both the sketch classification model and the real graph classification model are dense net models and include a same quantity of dense blocks and prediction modules. Description is provided by using an example in which there are three dense blocks. The dense blocks are connected in series through transition layers. The dense blocks and the transition layers belong to the feature extraction module. Each dense block performs convolution calculation and outputs a certain quantity of feature maps. The transition layer connected to a dense block may perform dimension reduction on the feature maps outputted by that dense block. The prediction module, which belongs to the classification module, mainly maps the feature extracted after running through the plurality of dense blocks and transition layers to a category probability distribution of a fixed dimension and predicts the category to which a certain image belongs. However, fixed calculation parameters of the dense blocks in the sketch classification model and the real graph classification model, such as the quantities of outputted feature maps and the sizes of convolution kernels, may be different. Fixed calculation parameters of the transition layers, such as dimension reduction multiples, may also be different.
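  • For illustration, the layered structure described above may be sketched as follows in PyTorch-style code; the channel counts, growth rate, and module names here are illustrative assumptions rather than the parameters of the disclosed models:

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    # Simplified dense block: each layer outputs `growth_rate` feature maps that
    # are concatenated with the maps it received (convolution calculation).
    def __init__(self, in_channels, growth_rate=12, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1)))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)
        return x

class Transition(nn.Module):
    # Transition layer: performs dimension reduction on the feature maps
    # outputted by the preceding dense block.
    def __init__(self, in_channels, reduction=2):
        super().__init__()
        self.reduce = nn.Sequential(
            nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1),
            nn.AvgPool2d(2))
        self.out_channels = in_channels // reduction

    def forward(self, x):
        return self.reduce(x)

class ClassificationModel(nn.Module):
    # Three dense blocks connected in series through transition layers (the
    # feature extraction module), followed by a prediction module that maps the
    # extracted feature to a category probability distribution of fixed dimension.
    def __init__(self, num_categories, in_channels=3):
        super().__init__()
        channels = 16
        self.stem = nn.Conv2d(in_channels, channels, kernel_size=3, padding=1)
        stages = []
        for _ in range(3):
            block = DenseBlock(channels)
            trans = Transition(block.out_channels)
            stages += [block, trans]
            channels = trans.out_channels
        self.features = nn.Sequential(*stages)      # feature extraction module
        self.predict = nn.Sequential(               # classification (prediction) module
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, num_categories))

    def forward(self, x):
        feature = self.features(self.stem(x))
        return self.predict(feature), feature       # category scores and extracted feature
```

    In this sketch, the extracted feature is returned alongside the category scores so that a feature analysis model, sketched further below, can analyze it.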
  • It may be understood that, in one case, the classification training apparatus may obtain a well-trained sketch classification model and real graph classification model from some systems when performing this step, and may then proceed to perform step 102 to step 105 of this embodiment.
  • In another case, when performing this step, the classification training apparatus may perform training according to a sketch whose category has been marked to obtain the sketch classification model, and perform training according to a real graph whose category has been marked to obtain the real graph classification model. With the sketch classification model and the real graph classification model thus trained, the classification training apparatus then proceeds to perform step 102 to step 105 of this embodiment.
  • In this case, the classification training apparatus may perform training through a loss function related to an initial model of each category to obtain the sketch classification model and the real graph classification model when performing this step. Specifically: an initial model of a sketch classification and an initial model of a real graph classification are first determined, and the sketch whose category has been marked and the real graph whose category has been marked are determined. The determining an initial model of a sketch classification is specifically determining a structure of the initial model of the sketch classification and an initial value of the fixed parameter. The determining an initial model of a real graph classification is specifically determining a structure of the initial model of the real graph classification and an initial value of the fixed parameter.
  • Then a category of the sketch whose category has been marked is determined according to the initial model of the sketch classification, and a category of the real graph whose category has been marked is determined according to the initial model of the real graph classification, to obtain an initial classification result. The initial classification result may include an initial category of the sketch whose category has been marked and an initial category of the real graph whose category has been marked.
  • A function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification are calculated according to the initial classification result. Finally, the initial model of the sketch classification is adjusted according to the function value of the third loss function and a fixed parameter value in the initial model of the real graph classification is adjusted according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
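  • By way of illustration, this pre-training stage may be sketched as follows, assuming a classification model that returns category scores together with the extracted feature, as in the sketch above; the optimizer choice, learning rate, epoch count, and loader names are illustrative assumptions:

```python
import torch
import torch.nn as nn

def pretrain(model, loader, epochs=10, lr=1e-3):
    # Train an initial classification model on images whose category has been
    # marked in advance, using a cross-entropy loss on the predicted categories.
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    cross_entropy = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, marked_categories in loader:
            category_scores, _ = model(images)
            loss = cross_entropy(category_scores, marked_categories)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model

# Illustrative usage: pre-train an initial sketch model and an initial real graph model.
# sketch_model = pretrain(ClassificationModel(num_categories=30), marked_sketch_loader)
# real_model = pretrain(ClassificationModel(num_categories=30), marked_real_graph_loader)
```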
  • Step 102. Select a training set. The training set may not only include sketches of a plurality of categories, but also include real graphs of the categories. To be specific, the training set may include sketches and real graphs of a plurality of categories. Each category may correspond to a plurality of sketches and a plurality of real graphs. Each image in the training set carries the following marks: a mark indicating whether the image is a sketch or a real graph, and a mark of the category corresponding to the image.
  • The classification training apparatus may first perform preprocessing on each image in the training set after selecting the training set, and then perform step 103. The preprocessing process may include: performing zooming or clipping on each image, so that the sizes of processed images are the same. In this way, calculation performed when the categories of the images are determined in step 103 can be simplified.
  • The preprocessing process may further include: enhancing a main image of each image, so that the main image in each image is clearer and not fuzzy. In this way, when step 103 is performed, the effect of an unclear main image on the process of determining a category of the image is eliminated. The main image herein is the major image in an image, such as a figure or an entity included in the image, rather than the background image. There may further be other preprocessing, and any preprocessing method that can eliminate the effect on the process of determining, by the classification training apparatus, the category of the image in step 103 falls within the scope of this embodiment of this application; this is not exemplified herein one by one.
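  • For illustration, a simple preprocessing pipeline that zooms every image to a common size and enhances the main image may be sketched as follows; the target size and sharpness factor are illustrative assumptions:

```python
from PIL import Image, ImageEnhance
from torchvision import transforms

def preprocess(image_path, size=(224, 224), sharpness=2.0):
    # Enhance the image so the main image is clearer, then zoom the image so that
    # all processed images share the same size before category determination.
    image = Image.open(image_path).convert("RGB")
    image = ImageEnhance.Sharpness(image).enhance(sharpness)
    return transforms.Compose([
        transforms.Resize(size),
        transforms.ToTensor(),
    ])(image)
```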
  • Step 103. Determine the category of the sketch in the training set according to the sketch classification model determined in step 101 to obtain a first category processing result. According to the second feature analysis model, the feature of the sketch extracted by the first feature extraction module in the sketch classification model is analyzed in the process in which the sketch classification model determines the category of the sketch, to obtain a feature analysis result of a second sketch. Analyzing the feature of the sketch may include determining whether the feature of the sketch belongs to the sketch. In another embodiment, there may further be other analysis processing, and this is not exemplified herein one by one.
  • The first category processing result obtained herein may specifically include the categories that are of sketches in the training set and that are determined by the sketch classification model. For example, the training set selected in step 102 includes n sketches. The sketches may be sketches of categories of "plane", "tree", and the like. The categories of the n sketches are respectively determined according to the sketch classification model to obtain categories C1, C2, ..., Cn of the n sketches.
  • The feature analysis result of the second sketch may specifically include a result of determining, by the second feature analysis model, whether the feature of the sketch extracted by the first feature extraction module belongs to the sketch.
  • Step 104. Calculate a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch that are obtained in step 103.
  • The first loss function includes a loss function related to the first classification module and a loss function by which the second feature analysis model analyzes the feature of the sketch.
  • The loss function related to the first classification module may be obtained according to the first category processing result, and may specifically be a cross-entropy loss function, configured to indicate a difference, that is, a deviation, between the category determined by the first classification module and an actual category of the sketch.
  • In another case, if the first classification module uses a method of sorting similar features in the process of determining the category of the sketch, the loss function related to the first classification module may also be a sorting loss function, configured to indicate a loss in the process of sorting similar features. For example, in the process of determining a category of a sketch 1, the first classification module determines that a feature similarity between the sketch 1 and a sketch 2 is greater than a feature similarity between the sketch 1 and a sketch 3, and then determines that the categories of the sketch 1 and the sketch 2 are the same. The sorting loss function may indicate a difference between the sort determined in the process of sorting the feature similarities and the sort of the actual feature similarities; one common form of such a sorting loss is sketched below.
  • The loss function by which the second feature analysis model analyzes the feature of the sketch is obtained according to the feature analysis result of the second sketch and is configured to indicate a difference between an analysis result obtained by analyzing, by the second feature analysis model, the feature of the sketch extracted by the first feature extraction module and an actual feature of the sketch.
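  • For illustration, one common form of the sorting loss function mentioned above is a margin-based ranking loss over feature similarities; the cosine similarity measure and the margin value are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def sorting_loss(anchor_feature, same_category_feature, other_category_feature, margin=0.2):
    # Penalize cases in which the similarity to a same-category sketch does not
    # exceed the similarity to a different-category sketch by at least `margin`,
    # that is, cases in which the determined sort differs from the actual sort.
    sim_same = F.cosine_similarity(anchor_feature, same_category_feature, dim=-1)
    sim_other = F.cosine_similarity(anchor_feature, other_category_feature, dim=-1)
    return torch.clamp(margin + sim_other - sim_same, min=0).mean()
```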
  • Step 105. Adjust a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  • The first fixed parameter value is a fixed parameter that is used in the calculation process separately by the first feature extraction module and the first classification module that are included in the sketch classification model and that does not need to be assigned with a value at any time, such as a parameter value of a parameter such as weight and angle. If the calculated function value of the first loss function is relatively large, for example, greater than a preset value, it is necessary to change the first fixed parameter value, for example, increase a weight value of a certain weight or reduce an angle value of a certain angle, to reduce a function value of a first loss function calculated according to a first fixed parameter value after the adjustment.
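  • For illustration, such an adjustment may be sketched as a single gradient step that moves the fixed parameter values in the direction that reduces the function value of the loss; the learning rate is an illustrative assumption:

```python
import torch

def adjust_fixed_parameters(model, loss, learning_rate=1e-3):
    # Adjust the model's fixed parameter values (for example, weights) so that the
    # function value of the loss calculated after the adjustment becomes smaller.
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for parameter in model.parameters():
            if parameter.grad is not None:
                parameter -= learning_rate * parameter.grad
    return model
```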
  • There is no absolute sequence relationship between step 101 and step 102. Step 101 and step 102 may be performed at the same time or in sequence. FIG. 1 shows only one specific implementation.
  • Further, referring to FIG. 2, the classification training apparatus may further adjust a fixed parameter value in the second feature analysis model. Specifically, the classification training apparatus determines the real graph classification model, which specifically includes determining a structure of the real graph classification model and corresponding fixed parameter values. The real graph classification model includes the second feature extraction module and the second classification module. The classification training apparatus analyzes, according to the second feature analysis model, the feature of the real graph extracted by the second feature extraction module in a process of determining, by the real graph classification model, the category of the real graph in the training set, to obtain a feature analysis result of a first real graph. Then the classification training apparatus calculates a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch, and adjusts a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function, to reduce the function value of the second adversarial loss function calculated according to the adjusted fixed parameter value of the second feature analysis model.
  • The feature analysis result of the first real graph may specifically include the result of determining, by the second feature analysis model, whether the feature of the real graph extracted by the second feature extraction module belongs to the real graph. The second adversarial loss function may include the loss function by which the second feature analysis model analyzes the feature of the real graph and a loss function by which the second feature analysis model analyzes the feature of the sketch. The loss function by which the second feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the first real graph. The loss function by which the second feature analysis model analyzes the feature of the sketch may be obtained according to the feature analysis result of the second sketch. The loss function by which the second feature analysis model analyzes the feature of the real graph is configured to indicate a difference between an analysis result obtained by analyzing, by the second feature analysis model, the feature of the real graph extracted by the second feature extraction module and an actual feature of the real graph.
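  • For illustration, the second feature analysis model may be treated as a binary discriminator over extracted features, in which case the second adversarial loss function may be sketched as follows; the discriminator architecture, the flattened feature input, and the 0/1 targets are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureAnalysisModel(nn.Module):
    # Outputs the probability that an extracted feature belongs to a real graph.
    def __init__(self, feature_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feature_dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, feature):
        return torch.sigmoid(self.net(feature))

def second_adversarial_loss(second_feature_analysis_model, real_graph_feature, sketch_feature):
    # Real graph features should be judged as belonging to the real graph (target 1),
    # while sketch features should be judged as not belonging to it (target 0).
    p_real = second_feature_analysis_model(real_graph_feature)
    p_sketch = second_feature_analysis_model(sketch_feature)
    return (F.binary_cross_entropy(p_real, torch.ones_like(p_real)) +
            F.binary_cross_entropy(p_sketch, torch.zeros_like(p_sketch)))
```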
  • In addition, step 103 to step 105 are processes of adjusting, by the classification training apparatus, the first fixed parameter value after the sketch classification model determined in step 101 separately processes each sketch in the training set. However, in actual applications, it is necessary to perform step 103 to step 105 continuously and circularly, until the adjustment on the first fixed parameter value meets a certain stopping condition.
  • Therefore, the classification training apparatus further needs to determine whether the current adjustment on the first fixed parameter value meets a preset stopping condition after performing step 101 to 105 of the foregoing embodiment. If yes, the procedure ends; and otherwise, for the sketch classification model whose first fixed parameter value is adjusted, return to operations of step 103 to step 105. To be specific, the operations of obtaining the first category processing result and the feature analysis result of the second sketch, calculating the function value of the first loss function of the sketch classification model, and adjusting the first fixed parameter value are performed.
  • The preset stopping condition includes but is not limited to any one of the following conditions: a first difference between a first fixed parameter value currently adjusted and a first fixed parameter value adjusted last time is less than a first threshold, that is, the adjusted first fixed parameter value achieves convergence; the number of adjustment times of the first fixed parameter value reaches the preset number of times, and the like. It may be understood that the stopping condition herein is a condition of stopping adjusting the first fixed parameter value.
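  • For illustration, such a stopping check may be sketched as follows, assuming the fixed parameter values are gathered into a single vector; the first threshold and the preset number of times are illustrative values:

```python
import torch

def meets_stopping_condition(current_values, previous_values, adjustment_count,
                             first_threshold=1e-4, preset_times=10000):
    # Stop when the adjusted first fixed parameter value achieves convergence, or
    # when the number of adjustment times reaches the preset number of times.
    first_difference = torch.norm(current_values - previous_values)
    return bool(first_difference < first_threshold) or adjustment_count >= preset_times
```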
  • Further, the classification training apparatus may further adjust a second fixed parameter value of the real graph classification model through the following steps. The flowchart is shown in FIG. 4 and the schematic diagram is shown in FIG. 2. The following steps are included.
  • Step 201. Determine the first feature analysis model analyzing an output result of the first feature extraction module.
  • The first feature analysis model is configured to analyze the feature (that is, an output result) of the sketch extracted by the first feature extraction module in the process of identifying, by the sketch classification model, the category of the sketch. Specifically, the feature of the sketch may be determined as belonging to the sketch.
  • Step 202. Determine the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result; and analyze, according to the first feature analysis model, a feature of the real graph extracted by the second feature extraction module to obtain a feature analysis result of a second real graph.
  • The second category processing result obtained herein may specifically include categories that are of real graphs in the training set and that are determined by the real graph classification model. The feature analysis result of the second real graph may specifically include a result of determining, by the first feature analysis model, whether the feature of the real graph extracted by the second feature extraction module belongs to the real graph.
  • Step 203. Calculate a function value of a second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph. The second loss function includes a loss function related to the second classification module and a loss function by which the first feature analysis model analyzes the feature of the real graph.
  • The loss function related to the second classification module may be obtained according to the second category processing result. The loss function related to the second classification module may be specifically a cross-entropy loss function, configured to indicate a difference between the category determined by the second classification module and an actual category of the real graph. If the second classification module uses a method of sorting similar features in the process of determining the category of the real graph, the loss function related to the second classification module may also be a sorting loss function, configured to indicate the loss function in the process of sorting the similar features.
  • The loss function by which the first feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the second real graph and is configured to indicate a difference between an analysis result obtained by analyzing, by the first feature analysis model, the feature of the real graph extracted by the second feature extraction module and an actual feature of the real graph.
  • Step 204. Adjust a second fixed parameter value in the real graph classification model according to a function value of a second loss function.
  • The second fixed parameter value is a fixed parameter that is used in the calculation process separately by the second feature extraction module and the second classification module that are included in the real graph classification model and that does not need to be assigned with a value at any time, such as a parameter value of a parameter such as weight and angle. If the calculated function value of the second loss function is relatively large, for example, greater than a preset value, it is necessary to change the second fixed parameter value, for example, increase a weight value of a certain weight or reduce an angle value of a certain angle, to reduce the function value of the second loss function calculated according to a second fixed parameter value after the adjustment.
  • Further, referring to FIG. 2, the classification training apparatus may further adjust a fixed parameter value in the first feature analysis model. Specifically, the classification training apparatus analyzes, according to the first feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of the first sketch, then calculates a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph, and adjusts a fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  • The feature analysis result of the first sketch may specifically include a result of determining, by the first feature analysis model, whether the feature of the sketch extracted by the first feature extraction module belongs to the sketch. The first adversarial loss function may include a loss function by which the first feature analysis model analyzes the feature of the real graph and a loss function by which the first feature analysis model analyzes the feature of the sketch. The loss function by which the first feature analysis model analyzes the feature of the real graph may be obtained according to the feature analysis result of the second real graph. The loss function by which the first feature analysis model analyzes the feature of the sketch may be obtained according to the feature analysis result of the first sketch. The loss function by which the first feature analysis model analyzes the feature of the sketch is configured to indicate a difference between an analysis result obtained by analyzing, by the first feature analysis model, the feature of the sketch extracted by the first feature extraction module and an actual feature of the sketch.
  • In addition, step 202 to step 204 are processes of adjusting, by the classification training apparatus, the second fixed parameter value after the real graphs in the training set are processed. However, in actual applications, it is necessary to perform step 202 to step 204 continuously and circularly, until the adjustment on the second fixed parameter value meets a certain stopping condition.
  • Therefore, the classification training apparatus needs to determine whether the current adjustment on the second fixed parameter value meets a preset stopping condition after performing step 201 to step 204. If yes, the procedure ends, and otherwise, for a real graph classification model whose second fixed parameter value is adjusted, return to step 202 to step 204. To be specific, operations of obtaining the second category processing result and the feature analysis result of the second real graph, calculating the function value of the second loss function of the real graph classification model, and adjusting the second fixed parameter value are performed.
  • The preset stopping condition includes but is not limited to any one of the following conditions: a difference between a second fixed parameter value currently adjusted and a second fixed parameter value adjusted last time is less than a second threshold, that is, the adjusted second fixed parameter value achieves convergence; the number of times the second fixed parameter value has been adjusted reaches the preset number of times; and the like.
  • Step 202 to step 204 may be performed alternately with step 103 to step 105. For example, in an adjusting process, the first fixed parameter value of the sketch classification model and the fixed parameter value of the second feature analysis model may be adjusted. That is, step 103 to step 105 are performed. In another adjusting process, the second fixed parameter value of the real graph classification model and the fixed parameter value of the first feature analysis model are adjusted. That is, step 202 to step 204 are performed. In a readjusting process, the first fixed parameter value of the sketch classification model and the fixed parameter value of the second feature analysis model are adjusted again. That is, step 103 to step 105 are performed, and so on.
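  • For illustration, the alternation described above may be sketched as follows, with the four loss functions supplied as callables; all names, the batching scheme, and the optimizer choice are illustrative assumptions:

```python
import torch

def alternating_training(sketch_model, real_model, d_1, d_2, loss_fns, loader, lr=1e-3):
    # loss_fns maps a name to a callable that takes a batch and returns a scalar loss:
    #   "cnn_1": first loss function of the sketch classification model (uses D_2)
    #   "d_2":   second adversarial loss function of the real graph identifier D_2
    #   "cnn_2": second loss function of the real graph classification model (uses D_1)
    #   "d_1":   first adversarial loss function of the sketch identifier D_1
    optimizers = {name: torch.optim.Adam(m.parameters(), lr=lr)
                  for name, m in (("cnn_1", sketch_model), ("cnn_2", real_model),
                                  ("d_1", d_1), ("d_2", d_2))}

    def adjust(name, batch):
        # One adjustment of the named model's fixed parameter values.
        optimizers[name].zero_grad()
        loss_fns[name](batch).backward()
        optimizers[name].step()

    for step, batch in enumerate(loader):
        if step % 2 == 0:
            # Fix CNN_2 and D_1; adjust the sketch classification model and D_2.
            adjust("cnn_1", batch)
            adjust("d_2", batch)
        else:
            # Fix CNN_1 and D_2; adjust the real graph classification model and D_1.
            adjust("cnn_2", batch)
            adjust("d_1", batch)
```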
  • Further, the classification training apparatus obtains the adjusted sketch classification model and real graph classification model through the methods in the foregoing embodiments. In actual application of the adjusted sketch classification model and real graph classification model, in one case, the classification training apparatus may first obtain a to-be-classified sketch (such as a to-be-classified sketch input by a user through a terminal device), and then classify the to-be-classified sketch according to the adjusted sketch classification model to obtain a category of the to-be-classified sketch, thereby implementing classification of sketches.
  • In another case, the classification training apparatus may first obtain the to-be-classified sketch (such as the to-be-classified sketch input by the user through the terminal device) and obtain each real graph stored in the classification training apparatus; then classify the to-be-classified sketch according to the sketch classification model after the adjustment to obtain a category of the to-be-classified sketch, and separately classify each stored real graph according to the real graph classification model after the adjustment to obtain categories of the real graphs; and finally select a real graph having a same category as that of the to-be-classified sketch, to provide the real graph for the terminal device of the user. In this way, retrieval of the sketch is implemented.
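  • For illustration, this retrieval flow may be sketched as follows, assuming adjusted models that return category scores together with the extracted feature, as in the earlier sketches; the gallery of stored real graphs is an illustrative assumption:

```python
import torch

def retrieve_real_graphs(to_be_classified_sketch, sketch_model, real_model, stored_real_graphs):
    # Classify the to-be-classified sketch, classify each stored real graph, and
    # return the real graphs whose category is the same as that of the sketch.
    with torch.no_grad():
        sketch_scores, _ = sketch_model(to_be_classified_sketch.unsqueeze(0))
        sketch_category = sketch_scores.argmax(dim=1).item()
        matches = []
        for real_graph in stored_real_graphs:
            real_scores, _ = real_model(real_graph.unsqueeze(0))
            if real_scores.argmax(dim=1).item() == sketch_category:
                matches.append(real_graph)
    return matches
```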
  • It can be learnt that in the method of this embodiment, the classification training apparatus first selects the training set, determines the category of the sketch in the training set according to the sketch classification model to obtain the first category processing result, and analyzes, according to the second feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain the analysis result of the second sketch; then obtains the function value of the first loss function according to the first category processing result and the analysis result of the second sketch; and finally adjusts the first fixed parameter value of the sketch classification model according to the function value of the first loss function. The classification training apparatus further determines the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result, and analyzes, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module to obtain the analysis result of the second real graph; then obtains the function value of the second loss function according to the second category processing result and the analysis result of the second real graph; and finally adjusts the second fixed parameter value of the real graph classification model according to the function value of the second loss function. In this way, when a fixed parameter value of one classification model (such as the sketch classification model) is adjusted, reference is made not only to the deviation of that classification model in classifying its corresponding images, but also to useful information from the other classification model (such as the real graph classification model) in the classification process, namely, to a feature analysis model (such as the second feature analysis model) that analyzes the feature extracted by the feature extraction module of the other classification model. This ensures more accurate classification by the adjusted sketch classification model and real graph classification model.
  • The following specific application example describes the method in this embodiment. Referring to FIG. 5, in this embodiment, the sketch classification model and the real graph classification model may use convolutional-neural-network-based classification models of a same structure and are respectively denoted as CNN_1 and CNN_2. The first feature extraction module and the first classification module that are included in the sketch classification model are respectively denoted as CNN11 and CNN12. The first feature analysis model is specifically a sketch identifier D_1. The second feature extraction module and the second classification module that are included in the real graph classification model are respectively denoted as CNN21 and CNN22. The second feature analysis model is specifically a real graph identifier D_2. The classification training method of this embodiment may be implemented through the following steps. The flowchart is shown in FIG. 6, including:
    • Step 301. Determine the sketch classification model CNN_1 and the real graph classification model CNN_2, and the sketch identifier D_1 and the real graph identifier D_2, specifically including determining a structure of each model and an initial value of the fixed parameter.
    • Step 302. Select the training set. The training set includes image pairs of a plurality of categories. The image pair of each category includes sketches and real graphs of a same category, specifically $\{(S_i, I_i)\}_{i=1}^{N}$, where $S_i$ is a sketch image of category $i$ and $I_i$ is a real graph image of category $i$.
    • Step 303. Determine the category of the sketch in the training set according to the sketch classification model CNN_1 to obtain the category of the sketch, and determine the category of the real graph in the training set according to the real graph classification model CNN_2 to obtain the category of the real graph.
  • In this process, the sketch identifier D_1 identifies the feature of the sketch extracted by the sketch feature extraction module CNN11 in the sketch classification model CNN_1 to obtain a sketch feature identification result 11. The sketch identifier D_1 identifies the feature of the real graph extracted by the real graph feature extraction module CNN21 in the real graph classification model CNN_2 to obtain a real graph feature identification result 12.
  • The real graph identifier D_2 identifies the feature of the real graph extracted by the real graph feature extraction module CNN21 in the real graph classification model CNN_2 to obtain a real graph feature identification result 21. The real graph identifier D_2 identifies the feature of the sketch extracted by the sketch feature extraction module CNN11 in the sketch classification model CNN_1 to obtain a sketch feature identification result 22.
  • Step 304. First fix the sketch classification model CNN_1 and the real graph identifier D_2, adjust a fixed parameter value of the real graph classification model CNN_2 and a fixed parameter value of the sketch identifier D_1, to ensure that the adjustment on the real graph classification model CNN_2 refers to useful information of the sketch classification model CNN_1 in the classification process.
  • Specifically, the classification training apparatus may calculate a function value of an adversarial function $GL_{S_i}$ of the sketch identifier D_1 according to the following formula 1 and the sketch feature identification result 11 and the real graph feature identification result 12 that are obtained in step 303, and adjusts the fixed parameter value of the sketch identifier D_1 according to the function value.

    $$GL_{S_i} = -\mathbb{E}_{S_i \sim S}\left[\log \mathrm{D\_1}\left(\mathrm{CNN\_1}(S_i)\right)\right] - \mathbb{E}_{I_i \sim I}\left[\log\left(1 - \mathrm{D\_1}\left(\mathrm{CNN\_2}(I_i)\right)\right)\right] \tag{1}$$
  • A function value of a loss function of the real graph classification model CNN_2 is calculated according to the following formula 2 and the category of the real graph and the real graph feature identification result 12 that are determined by the real graph classification model CNN_2 in step 303, and the fixed parameter value of the real graph classification model CNN_2 is adjusted according to the function value. The loss function may specifically include a loss function related to the real graph classification module CNN22 in the real graph classification model CNN_2, such as a cross-entropy loss function $CL_{I_i}$, and a loss function by which the sketch identifier D_1 identifies the feature of the real graph. The loss function by which the sketch identifier D_1 identifies the feature of the real graph may be obtained according to the real graph feature identification result 12. The related loss function $CL_{I_i}$ may be obtained according to the following formula 3 and the category of the real graph determined by the real graph classification model CNN_2 in step 303.

    $$-\mathbb{E}_{I_i \sim I}\left[\log \mathrm{D\_1}\left(\mathrm{CNN\_2}(I_i)\right)\right] + CL_{I_i} \tag{2}$$

    $$CL_{I_i} = -\frac{1}{n}\sum_{d=1}^{n}\left[ y_i^d \ln C_I\left(\mathrm{CNN\_2}(I_i)\right) + \left(1 - y_i^d\right)\ln\left(1 - C_I\left(\mathrm{CNN\_2}(I_i)\right)\right) \right] \tag{3}$$

    wherein $-\mathbb{E}_{I_i \sim I}\left[\log \mathrm{D\_1}\left(\mathrm{CNN\_2}(I_i)\right)\right]$ may indicate the loss function by which the sketch identifier D_1 identifies the feature of the real graph.
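  • For illustration, formulas 1 to 3 (with the signs as reconstructed above) may be sketched as follows; here cnn_1 and cnn_2 stand for the feature extraction parts CNN11 and CNN21, c_i for the real graph classification module CNN22, and d_1 for the sketch identifier, which is assumed to output a probability in (0, 1). These call signatures are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def step_304_losses(cnn_1, cnn_2, c_i, d_1, sketches, real_graphs, real_labels):
    # CNN_1 and D_2 are fixed in step 304; D_1 and CNN_2 are adjusted.
    sketch_feature = cnn_1(sketches).detach()
    real_feature = cnn_2(real_graphs)

    # Formula 1: adversarial loss GL_S of the sketch identifier D_1.
    gl_s = (-torch.log(d_1(sketch_feature)).mean()
            - torch.log(1 - d_1(real_feature.detach())).mean())

    # Formula 3: cross-entropy loss CL_I of the real graph classification module.
    probabilities = torch.sigmoid(c_i(real_feature))                  # C_I(CNN_2(I_i))
    targets = F.one_hot(real_labels, probabilities.shape[1]).float()  # y_i^d
    cl_i = F.binary_cross_entropy(probabilities, targets)

    # Formula 2: loss of the real graph classification model, combining the
    # D_1 term with CL_I; it is used to adjust CNN_2 (and CNN22).
    loss_cnn_2 = -torch.log(d_1(real_feature)).mean() + cl_i
    return gl_s, loss_cnn_2
```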
  • Step 305. Fix the real graph classification model CNN_2 and the sketch identifier D_1 again, adjust a fixed parameter value of the sketch classification model CNN_1 and a fixed parameter value of the real graph identifier D_2, to ensure that the adjustment on the sketch classification model CNN_1 refers to useful information of the real graph classification model CNN_2 in the classification process.
  • Specifically, the classification training apparatus may calculate a function value of an adversarial function $GL_{I_i}$ of the real graph identifier D_2 according to the following formula 4 and the real graph feature identification result 21 and the sketch feature identification result 22 that are obtained in step 303, and adjusts the fixed parameter value of the real graph identifier D_2 according to the function value.

    $$GL_{I_i} = -\mathbb{E}_{I_i \sim I}\left[\log \mathrm{D\_2}\left(\mathrm{CNN\_2}(I_i)\right)\right] - \mathbb{E}_{S_i \sim S}\left[\log\left(1 - \mathrm{D\_2}\left(\mathrm{CNN\_1}(S_i)\right)\right)\right] \tag{4}$$
  • A function value of a loss function of the sketch classification model CNN_1 is calculated according to the following formula 5 and the category of the sketch and the sketch feature identification result 22 that are determined by the sketch classification model CNN_1 in step 303, and the fixed parameter value of the sketch classification model CNN_1 is adjusted according to the function value. The loss function may specifically include a loss function related to the sketch classification module CNN12 in the sketch classification model CNN_1, such as a cross-entropy loss function $CL_{S_i}$, and a loss function by which the real graph identifier D_2 identifies the feature of the sketch. The loss function by which the real graph identifier D_2 identifies the feature of the sketch may be obtained according to the sketch feature identification result 22. The related loss function $CL_{S_i}$ may be obtained according to the following formula 6 and the category of the sketch.

    $$-\mathbb{E}_{S_i \sim S}\left[\log \mathrm{D\_2}\left(\mathrm{CNN\_1}(S_i)\right)\right] + CL_{S_i} \tag{5}$$

    $$CL_{S_i} = -\frac{1}{n}\sum_{d=1}^{n}\left[ y_i^d \ln C_s\left(\mathrm{CNN\_1}(S_i)\right) + \left(1 - y_i^d\right)\ln\left(1 - C_s\left(\mathrm{CNN\_1}(S_i)\right)\right) \right] \tag{6}$$

    wherein $-\mathbb{E}_{S_i \sim S}\left[\log \mathrm{D\_2}\left(\mathrm{CNN\_1}(S_i)\right)\right]$ may indicate the loss function by which the real graph identifier D_2 identifies the feature of the sketch.
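  • Step 305 mirrors step 304 with the roles of the two sides swapped; a correspondingly brief sketch of formulas 4 to 6, under the same illustrative conventions as above:

```python
import torch
import torch.nn.functional as F

def step_305_losses(cnn_1, cnn_2, c_s, d_2, sketches, sketch_labels, real_graphs):
    # CNN_2 and D_1 are fixed in step 305; D_2 and CNN_1 are adjusted.
    real_feature = cnn_2(real_graphs).detach()
    sketch_feature = cnn_1(sketches)

    # Formula 4: adversarial loss GL_I of the real graph identifier D_2.
    gl_i = (-torch.log(d_2(real_feature)).mean()
            - torch.log(1 - d_2(sketch_feature.detach())).mean())

    # Formula 6: cross-entropy loss CL_S of the sketch classification module.
    probabilities = torch.sigmoid(c_s(sketch_feature))                  # C_s(CNN_1(S_i))
    targets = F.one_hot(sketch_labels, probabilities.shape[1]).float()  # y_i^d
    cl_s = F.binary_cross_entropy(probabilities, targets)

    # Formula 5: loss of the sketch classification model, combining the
    # D_2 term with CL_S; it is used to adjust CNN_1 (and CNN12).
    loss_cnn_1 = -torch.log(d_2(sketch_feature)).mean() + cl_s
    return gl_i, loss_cnn_1
```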
  • Step 306. Determine whether the adjustment on the fixed parameter value of the sketch classification model CNN_1 and the real graph classification model CNN_2 meets a preset condition after performing step 301 to step 305. If yes, the procedure ends, and otherwise, for the sketch classification model CNN_1 and the real graph classification model CNN_2 after the adjustment, and the sketch identifier D_1 and the real graph identifier D_2 after the adjustment, return to perform step 303 to step 305.
  • An embodiment of this application further provides a classification training apparatus. A schematic structural diagram of the classification training apparatus is shown in FIG. 7 and the classification training apparatus may specifically include:
    a model determining unit 10, configured to determine a sketch classification model, the sketch classification model including a first feature extraction module and a first classification module; and determine a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module belonging to a real graph classification model.
  • Specifically, the model determining unit 10 is specifically configured to: determine an initial model of a sketch classification and an initial model of a real graph classification that have a same structure and determine a sketch whose category has been marked and a real graph whose category has been marked; determine a category of the sketch whose category has been marked according to the initial model of the sketch classification and determine a category of the real graph whose category has been marked according to the initial model of the real graph classification, to obtain an initial classification result; calculate a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification according to the initial classification result; and adjust a fixed parameter value in the initial model of the sketch classification according to the function value of the third loss function and adjust a fixed parameter value in the initial model of the real graph classification according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model. A training set unit 11 is configured to select a training set. The training set includes sketches of a plurality of categories. The training set further includes real graphs of corresponding categories.
  • A processing unit 12 is configured to: determine, according to the sketch classification model determined by the model determining unit 10, the category of the sketch in the training set selected by the training set unit 11, to obtain a first category processing result; and analyze, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module, to obtain a feature analysis result of a second sketch.
  • A function value calculation unit 13 is configured to calculate a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch that are obtained by the processing unit 12. The first loss function includes a loss function related to the first classification module and a loss function by which the second feature analysis model analyzes the feature of the sketch.
  • An adjustment unit 14 is configured to adjust a first fixed parameter value in the sketch classification model according to the function value of the first loss function calculated by the function value calculation unit 13.
  • Further, the training set selected by the training set unit 11 may further include a real graph of a corresponding category. The model determining unit 10 may further determine the real graph classification model. The real graph classification model includes a second feature extraction module and a second classification module. The processing unit 12 is further configured to analyze, according to the second feature analysis model, the feature of the real graph extracted by the second feature extraction module when the real graph classification model determines the category of the real graph, to obtain a feature analysis result of a first real graph. The function value calculation unit 13 is further configured to calculate a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch. The adjustment unit 14 is further configured to adjust a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function.
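  • The following is a non-authoritative sketch of one update of the second feature analysis model under the second adversarial loss function, assuming the model ends in a sigmoid so its output can be read as the probability that a feature comes from the real graph branch; a standard GAN-style binary cross-entropy is used here as one concrete instance of such a loss.

```python
import torch
import torch.nn.functional as F

def update_second_feature_analysis_model(d2, real_feats, sketch_feats, opt_d2):
    """One update of the second feature analysis model (real graph identifier).
    The adversarial loss rewards it for scoring the feature analysis result of
    the first real graph high and the feature analysis result of the second
    sketch low (an illustrative formulation)."""
    real_score = d2(real_feats.detach())      # features from the second feature extraction module
    sketch_score = d2(sketch_feats.detach())  # features from the first feature extraction module
    loss = F.binary_cross_entropy(real_score, torch.ones_like(real_score)) + \
           F.binary_cross_entropy(sketch_score, torch.zeros_like(sketch_score))
    opt_d2.zero_grad()
    loss.backward()
    opt_d2.step()
    return loss.item()
```

    The first feature analysis model described in the following paragraphs can be updated symmetrically, with the roles of the real graph features and the sketch features swapped.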
  • Further, the model determining unit 10 is further configured to determine the first feature analysis model analyzing an output result of the first feature extraction module. The processing unit 12 is further configured to: determine the category of the real graph in the training set according to the real graph classification model to obtain a second category processing result, and analyze, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module, to obtain the feature analysis result of the second real graph. The function value calculation unit 13 is further configured to calculate the function value of the second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph. The second loss function includes a loss function related to the second classification module and a loss function by which the first feature analysis model analyzes the feature of the real graph. The adjustment unit 14 is further configured to adjust the second fixed parameter value in the real graph classification model according to the function value of the second loss function.
  • Further, the processing unit 12 is further configured to analyze, according to the first feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a first sketch. The function value calculation unit 13 is further configured to calculate a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph. The adjustment unit 14 is further configured to adjust a fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  • It can be learnt that in the apparatus according to this embodiment, the training set unit 11 selects the training set. The processing unit 12 determines the category of the sketch in the training set according to the sketch classification model to obtain the first category processing result, and analyzes, according to the second feature analysis model, the feature of the sketch extracted by the first feature extraction module to obtain the analysis result of the second sketch. Then the function value calculation unit 13 obtains the function value of the first loss function according to the first category processing result and the analysis result of the second sketch. Finally, the adjustment unit 14 adjusts the first fixed parameter value of the sketch classification model according to the function value of the first loss function. The processing unit 12 may further determine the category of the real graph in the training set according to the real graph classification model to obtain the second category processing result, and analyze, according to the first feature analysis model, the feature of the real graph extracted by the second feature extraction module to obtain the analysis result of the second real graph. Then the function value calculation unit 13 obtains the function value of the second loss function according to the second category processing result and the analysis result of the second real graph. Finally, the adjustment unit 14 adjusts the second fixed parameter value of the real graph classification model according to the function value of the second loss function. In this way, when the fixed parameter value of one classification model (such as the sketch classification model) is adjusted, reference is made not only to the deviation of that model in classifying its corresponding image, but also to useful information from the other classification model (such as the real graph classification model) in the classification process, that is, to a feature analysis model (such as the second feature analysis model) that analyzes the feature extracted by the feature extraction module of the other classification model. This ensures that the sketch classification model and the real graph classification model classify more accurately after the adjustment.
  • Referring to FIG. 8, in a specific embodiment, the classification training apparatus may further include a determining unit 15 and a classification unit 16 in addition to the structure shown in FIG. 7.
  • The determining unit 15 is configured to determine whether the adjustment of the adjustment unit 14 on the first fixed parameter value meets a preset stopping condition. If no, the processing unit 12 is notified to obtain the first category processing result and the feature analysis result of the second sketch for the sketch classification model whose first fixed parameter value is adjusted.
  • The preset stopping condition may include but is not limited to any one of the following conditions:
  • a first difference between the first fixed parameter value currently adjusted and the first fixed parameter value adjusted last time is less than a first threshold; the number of times the first fixed parameter value has been adjusted reaches a preset number of times; and the like.
  • Further, the determining unit 15 is configured to determine whether the adjustment of the adjustment unit 14 on the second fixed parameter value meets the preset stopping condition. If no, the processing unit 12 is notified to obtain the second category processing result and the feature analysis result of the second real graph for the real graph classification model whose second fixed parameter value is adjusted. The preset stopping condition herein may include but is not limited to any one of the following conditions: the first difference between the second fixed parameter value currently adjusted and the second fixed parameter value adjusted last time is less than the first threshold; the number of times the second fixed parameter value has been adjusted reaches the preset number of times; and the like. A hedged sketch of such a check follows below.
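  • A minimal sketch of one way the determining unit 15 might evaluate the preset stopping condition, assuming the current and previous parameter values are available as lists of tensors; the threshold and the preset number of times are illustrative placeholders.

```python
def meets_stopping_condition(curr_params, prev_params, num_adjustments,
                             first_threshold=1e-4, preset_times=10000):
    """Either condition suffices: the change since the last adjustment is below
    the first threshold, or the number of adjustments reaches the preset number
    of times. curr_params/prev_params are lists of parameter tensors."""
    small_change = prev_params is not None and max(
        (c - p).abs().max().item() for c, p in zip(curr_params, prev_params)
    ) < first_threshold
    return small_change or num_adjustments >= preset_times
```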
  • The classification unit 16 is configured to: obtain a to-be-classified sketch and classify the to-be-classified sketch according to the sketch classification model adjusted by the adjustment unit 14 to obtain a category of the to-be-classified sketch.
  • The classification unit 16 may further obtain the to-be-classified sketch and obtain each real graph stored in the classification training apparatus; then classify the to-be-classified sketch according to the sketch classification model adjusted by the adjustment unit 14 to obtain the category of the to-be-classified sketch, and classify each stored real graph separately according to the real graph classification model after the adjustment to obtain the categories of the real graphs; and finally select a real graph whose category is the same as that of the to-be-classified sketch.
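  • The retrieval behaviour of the classification unit 16 can be sketched as follows, reusing the feature-extractor/classifier convention assumed in the earlier pretraining sketch; the calling convention and the storage format of the real graphs are assumptions.

```python
import torch

@torch.no_grad()
def retrieve_real_graphs(sketch_model, real_model, sketch, stored_real_graphs):
    """Classify the to-be-classified sketch with the adjusted sketch
    classification model, classify each stored real graph with the adjusted
    real graph classification model, and return the real graphs whose category
    matches that of the sketch."""
    sketch_logits = sketch_model["classifier"](sketch_model["features"](sketch.unsqueeze(0)))
    sketch_category = sketch_logits.argmax(dim=1).item()
    matches = []
    for real in stored_real_graphs:
        real_logits = real_model["classifier"](real_model["features"](real.unsqueeze(0)))
        if real_logits.argmax(dim=1).item() == sketch_category:
            matches.append(real)
    return sketch_category, matches
```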
  • An embodiment of this application further provides a server. A schematic structural diagram of the server is shown in FIG. 9. The server may vary considerably in configuration and performance, and may include one or more central processing units (CPU) 20 (for example, one or more processors), one or more memories 21, and one or more storage media 22 (for example, one or more mass storage devices) storing application programs 221 or data 222. The memories 21 and the storage media 22 may provide temporary or persistent storage for the application programs 221 or the data 222. The programs stored in the storage media 22 may include one or more modules (not shown in the figure), and each module may include a series of computer readable instructions for the server. Further, the central processing units 20 may be configured to communicate with the storage media 22 and to execute, on the server, the series of computer readable instructions in the storage media 22. When the computer readable instructions are executed, the one or more processors included in the central processing units 20 may be enabled to perform a classification training method. The storage media 22 may be non-volatile storage media, and the non-volatile storage media may be non-volatile readable storage media.
  • Specifically, the application programs 221 stored in the storage media 22 include a classification training application program. The program may include the model determining unit 10, the training set unit 11, the processing unit 12, the function value calculation unit 13, the adjustment unit 14, the determining unit 15, and the classification unit 16 of the classification training apparatus, which are not described herein again. Further, the central processing units 20 may be configured to communicate with the storage media 22 and to perform, on the server, a series of operations corresponding to the classification training application program stored in the storage media 22.
  • The server may further include one or more power supplies 23, one or more wired or wireless network interfaces 24, one or more input/output interfaces 25, and/or one or more operating systems 223, such as Windows Server™, Mac OS X™, Unix™, Linux™, and FreeBSD™.
  • In the foregoing method embodiments, the steps performed by the classification training apparatus may be based on the structure of the server shown in FIG. 9. Each unit or module included in the classification training apparatus may be wholly or partially implemented through software, hardware, or a combination thereof.
  • An embodiment of this application further provides a storage medium. The storage medium stores computer readable instructions. The computer readable instructions are loaded by a processor to perform the classification training method according to the embodiments of this application.
  • An embodiment of this application further provides a server, including one or more processors and memories. The memories store computer readable instructions. The one or more processors are configured to implement each computer readable instruction. The computer readable instructions are configured to be loaded by the one or more processors to perform the classification training method according to the embodiments of this application.
  • A person of ordinary skill in the art may understand that all or some of the steps in the methods of the foregoing embodiments may be implemented by computer readable instructions instructing relevant hardware. The computer readable instructions may be stored in a computer readable storage medium. The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
  • The technical features of the foregoing embodiments may be combined arbitrarily. For conciseness, not all possible combinations of the technical features in the foregoing embodiments are described; however, as long as the combinations of the technical features do not conflict with each other, they should be considered as falling within the scope of this specification.
  • The foregoing embodiments briefly describe the classification training method, server, and storage medium according to the embodiments of this application. The specification explains the principles and implementations of this application by using specific examples. The description of the foregoing embodiments is merely intended to help understand the methods and core ideas of this application and should not be understood as a limitation on the patent scope of the present disclosure. In addition, any change that a person of ordinary skill in the art makes to the specific embodiments and application ranges according to the idea of this application shall fall within the protection scope of this application. Accordingly, the content of this specification should not be understood as a limitation on this application.

Claims (20)

  1. A classification training method, comprising:
    determining, by a server, a sketch classification model, the sketch classification model comprising a first feature extraction module and a first classification module; and determining a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module being a real graph classification model;
    selecting, by the server, a training set, the training set comprising sketches of a plurality of categories;
    determining, by the server, a category of a sketch in the training set according to the sketch classification model to obtain a first category processing result; and analyzing, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a second sketch;
    calculating, by the server, a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch; and
    adjusting, by the server, a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  2. The method according to claim 1, wherein the training set further comprises a real graph of a corresponding category, and the method further comprises:
    determining, by the server, a real graph classification model, the real graph classification model comprising a second feature extraction module and a second classification module;
    analyzing, by the server according to the second feature analysis model, a feature of the real graph extracted by the second feature extraction module in a case that the real graph classification model determines a category of the real graph, to obtain a feature analysis result of a first real graph;
    calculating, by the server, a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch; and
    adjusting, by the server, a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function.
  3. The method according to claim 2, wherein the determining, by the server, the sketch classification model and the real graph classification model comprises:
    determining, by the server, an initial model of a sketch classification and an initial model of a real graph classification, and determining a sketch whose category has been marked and a real graph whose category has been marked;
    determining, by the server, a category of the sketch whose category has been marked according to the initial model of the sketch classification and a category of the real graph whose category has been marked according to the initial model of the real graph classification, to obtain an initial classification result; and
    calculating, by the server, a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification according to the initial classification result, and adjusting the initial model of the sketch classification according to the function value of the third loss function and adjusting a fixed parameter value in the initial model of the real graph classification according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
  4. The method according to claim 2, further comprising:
    determining, by the server, a first feature analysis model analyzing the output result of the first feature extraction module;
    determining, by the server, a category of a real graph in the training set according to the real graph classification model to obtain a second category processing result; and analyzing, according to the first feature analysis model, a feature of the real graph extracted by the second feature extraction module to obtain a feature analysis result of a second real graph;
    calculating, by the server, a function value of a second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph; and
    adjusting, by the server, a second fixed parameter value in the real graph classification model according to the function value of the second loss function.
  5. The method according to claim 4, further comprising:
    analyzing, by the server according to the first feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of the first sketch;
    calculating, by the server, a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph; and
    adjusting, by the server, the fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  6. The method according to any one of claims 2 to 5, further comprising:
    returning, by the server for the sketch classification model whose first fixed parameter value is adjusted, the first category processing result obtained by determining, according to the sketch classification model, the category of the sketch in the training set to continue execution in a case that adjustment on the first fixed parameter value does not meet a preset stopping condition, until the adjustment on the first fixed parameter value meets the preset stopping condition.
  7. The method according to claim 6, wherein the preset stopping condition comprises any one of the following conditions:
    a first difference between a first fixed parameter value currently adjusted and a first fixed parameter value adjusted last time is less than a first threshold and the number of adjustment times of the first fixed parameter value reaches the preset number of times.
  8. The method according to any one of claims 1 to 3, further comprising:
    obtaining, by the server, a to-be-classified sketch and classifying the to-be-classified sketch according to the sketch classification model after the adjustment to obtain a category of the to-be-classified sketch.
  9. One or more storage media, storing computer readable instructions, the computer readable instructions being loaded by a processor to perform the following operations:
    determining a real graph classification model, the real graph classification model comprising a second feature extraction module and a second classification module;
    analyzing, according to a second feature analysis model, a feature of the real graph extracted by the second feature extraction module in a case that the real graph classification model determines a category of the real graph, to obtain a feature analysis result of a first real graph;
    calculating a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch; and
    adjusting a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function.
  10. The storage media according to claim 9, wherein the determining a sketch classification model and a real graph classification model comprises:
    determining an initial model of a sketch classification and an initial model of a real graph classification, and determining a sketch whose category has been marked and a real graph whose category has been marked;
    determining a category of the sketch whose category has been marked according to the initial model of the sketch classification and a category of the real graph whose category has been marked according to the initial model of the real graph classification, to obtain an initial classification result; and
    calculating a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification according to the initial classification result, and adjusting the initial model of the sketch classification according to the function value of the third loss function and adjusting a fixed parameter value in the initial model of the real graph classification according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
  11. The storage media according to claim 10, wherein the computer readable instructions are loaded by the processor to perform the following operations:
    determining a first feature analysis model analyzing an output result of the first feature extraction module;
    determining a category of a real graph in the training set according to the real graph classification model to obtain a second category processing result; and analyzing, according to the first feature analysis model, a feature of the real graph extracted by the second feature extraction module to obtain a feature analysis result of a second real graph;
    calculating a function value of a second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph; and
    adjusting a second fixed parameter value in the real graph classification model according to the function value of the second loss function.
  12. The storage media according to claim 10, wherein the computer readable instructions are loaded by the processor to perform the following operations:
    analyzing, according to the first feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of the first sketch;
    calculating a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph; and
    adjusting a fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  13. A server, comprising one or more processors and memories, the memories storing computer readable instructions, the processors being configured to implement the computer readable instructions, and the computer readable instructions being loaded by the one or more processors to perform the following operations:
    determining a sketch classification model, the sketch classification model comprising a first feature extraction module and a first classification module; and determining a second feature analysis model analyzing an output result of a second feature extraction module, the second feature extraction module being a real graph classification model;
    selecting a training set, the training set comprising sketches of a plurality of categories;
    determining a category of a sketch in the training set according to the sketch classification model to obtain a first category processing result; and analyzing, according to the second feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of a second sketch;
    calculating a function value of a first loss function of the sketch classification model according to the first category processing result and the feature analysis result of the second sketch; and
    adjusting a first fixed parameter value in the sketch classification model according to the function value of the first loss function.
  14. The server according to claim 13, wherein the training set further comprises a real graph of a corresponding category, and the computer readable instructions are loaded by the one or more processors to perform the following operations:
    determining a real graph classification model, the real graph classification model comprising a second feature extraction module and a second classification module;
    analyzing, according to the second feature analysis model, a feature of the real graph extracted by the second feature extraction module in a case that the real graph classification model determines a category of the real graph, to obtain a feature analysis result of a first real graph;
    calculating a function value of a second adversarial loss function of the second feature analysis model according to the feature analysis result of the first real graph and the feature analysis result of the second sketch; and
    adjusting a fixed parameter value of the second feature analysis model according to the function value of the second adversarial loss function.
  15. The server according to claim 14, wherein the determining a sketch classification model and a real graph classification model comprises:
    determining an initial model of a sketch classification and an initial model of a real graph classification, and determining a sketch whose category has been marked and a real graph whose category has been marked;
    determining a category of the sketch whose category has been marked according to the initial model of the sketch classification and a category of the real graph whose category has been marked according to the initial model of the real graph classification, to obtain an initial classification result; and
    calculating a function value of a third loss function related to the initial model of the sketch classification and a function value of a fourth loss function related to the initial model of the real graph classification according to the initial classification result, and adjusting the initial model of the sketch classification according to the function value of the third loss function and adjusting a fixed parameter value in the initial model of the real graph classification according to the function value of the fourth loss function, to obtain the sketch classification model and the real graph classification model.
  16. The server according to claim 14, wherein the computer readable instructions are loaded by the one or more processors to perform the following operations:
    determining a first feature analysis model analyzing the output result of the first feature extraction module;
    determining a category of a real graph in the training set according to the real graph classification model to obtain a second category processing result; and analyzing, according to the first feature analysis model, a feature of the real graph extracted by the second feature extraction module to obtain a feature analysis result of a second real graph;
    calculating a function value of a second loss function of the real graph classification model according to the second category processing result and the feature analysis result of the second real graph; and
    adjusting a second fixed parameter value in the real graph classification model according to the function value of the second loss function.
  17. The server according to claim 16, wherein the computer readable instructions are loaded by the one or more processors to perform the following operations:
    analyzing, according to the first feature analysis model, a feature of the sketch extracted by the first feature extraction module to obtain a feature analysis result of the first sketch;
    calculating a function value of a first adversarial loss function of the first feature analysis model according to the feature analysis result of the first sketch and the feature analysis result of the second real graph; and
    adjusting a fixed parameter value of the first feature analysis model according to the function value of the first adversarial loss function.
  18. The server according to any one of claims 13 to 17, wherein the computer readable instructions are loaded by the one or more processors to perform the following operations:
    returning, for the sketch classification model whose first fixed parameter value is adjusted, the first category processing result obtained by determining, according to the sketch classification model, the category of the sketch in the training set to continue execution in a case that adjustment on the first fixed parameter value does not meet a preset stopping condition, until the adjustment on the first fixed parameter value meets the preset stopping condition.
  19. The server according to claim 18, wherein the preset stopping condition comprises any one of the following conditions:
    a first difference between a first fixed parameter value currently adjusted and a first fixed parameter value adjusted last time is less than a first threshold and the number of adjustment times of the first fixed parameter value reaches the preset number of times.
  20. The server according to any one of claims 13 to 15, wherein the computer readable instructions are loaded by the one or more processors to perform the following operation:
    obtaining a to-be-classified sketch and classifying the to-be-classified sketch according to the sketch classification model after the adjustment to obtain a category of the to-be-classified sketch.