CN113673570A - Training method, device and equipment for electronic device picture classification model

Training method, device and equipment for electronic device picture classification model

Info

Publication number
CN113673570A
CN113673570A
Authority
CN
China
Prior art keywords
model
electronic device
training
discriminator
training sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110827473.9A
Other languages
Chinese (zh)
Inventor
陈晓炬
王邦军
杨怀宇
李磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Xurui Software Technology Co ltd
Original Assignee
Nanjing Xurui Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Xurui Software Technology Co ltd filed Critical Nanjing Xurui Software Technology Co ltd
Priority to CN202110827473.9A priority Critical patent/CN113673570A/en
Publication of CN113673570A publication Critical patent/CN113673570A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques


Abstract

The embodiment of the application provides a method, a device and equipment for training an electronic device picture classification model. The method comprises the following steps: acquiring a first training sample and a second training sample; inputting the first training sample and the second training sample into an original model to perform first classification training, updating the parameters of a class predictor and the parameters of a discriminator, and keeping the parameters of a feature extractor unchanged to obtain a first model, the original model being obtained through the first training sample; inputting the first training sample and the second training sample into the first model to perform second classification training, updating the parameters of the feature extractor, and keeping the parameters of the class predictor and the parameters of the discriminator unchanged to obtain a second model; and when a preset model training stopping condition is met, determining the trained class predictor as the electronic device picture classification model. The embodiment of the application can solve the problem that the classification efficiency of the existing electronic device picture classification model is low.

Description

Training method, device and equipment for electronic device picture classification model
Technical Field
The application belongs to the field of image recognition, and particularly relates to a training method, a training device and training equipment for an electronic device picture classification model.
Background
In the liquid crystal panel industry, the classification of panel image quality inspection defects is very important. The manufacturing of a liquid crystal panel involves different processes, each process corresponds to different manufacturing stations, and each station can handle different products; the shape and arrangement of the electronic devices differ between products, so the captured pictures differ as well. In the prior art, there is a method for identifying an electronic device picture by using an image identification model.
When a model trained on a certain product is used at the same manufacturing station, it often performs poorly on another product type and its recognition accuracy is low, so the model needs to be retrained on the other product before the pictures can be recognized separately, which makes classification inefficient.
Disclosure of Invention
The embodiment of the application provides a training method, a training device and training equipment for an electronic device picture classification model, which can solve the problem of low classification efficiency of the existing electronic device picture classification model.
In a first aspect, an embodiment of the present application provides a method for training an electronic device image classification model, including:
acquiring a first training sample and a second training sample, wherein the first training sample comprises a plurality of first electronic device pictures marked with class labels, and the second training sample comprises a plurality of second electronic device pictures not marked with the class labels;
inputting the first training sample and the second training sample into an original model to perform first classification training, updating the parameters of a class predictor and the parameters of a discriminator, and keeping the parameters of a feature extractor unchanged to obtain a first model; the original model is obtained through the first training sample and comprises the feature extractor, the class predictor and the discriminator;
inputting the first training sample and the second training sample into the first model to perform second classification training, updating the parameters of the feature extractor, and keeping the parameters of the class predictor and the parameters of the discriminator unchanged to obtain a second model;
and when the preset model training stopping condition is met, determining the trained class predictor as the electronic device picture classification model.
Further, in one embodiment, the discriminator comprises:
the first discriminator and the second discriminator are provided with different initial parameters.
Further, in an embodiment, before inputting the first training sample and the second training sample into the original model for the first classification training, updating the parameters of the class predictor and the parameters of the discriminator, and keeping the parameters of the feature extractor unchanged to obtain the first model, the method further includes:
and inputting the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and updating the parameters of the feature extractor, the class predictor and the discriminator to obtain an original model.
Further, in one embodiment, the preset model training stop condition includes:
the loss function values of the first model and the second model meet preset conditions;
the loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
In a second aspect, an embodiment of the present application provides a method for classifying an electronic device picture by using an electronic device picture classification model, where the electronic device picture classification model is trained by a training method of the electronic device picture classification model, and the method includes:
acquiring an electronic device picture;
and inputting the electronic device picture into the electronic device picture classification model for classification, and outputting the electronic device type corresponding to the electronic device picture.
In a third aspect, an embodiment of the present application provides a training apparatus for an electronic device image classification model, including:
the acquisition module is used for acquiring a first training sample and a second training sample, wherein the first training sample comprises a plurality of first electronic device pictures marked with class labels, and the second training sample comprises a plurality of second electronic device pictures which are not marked with the class labels;
the updating module is used for inputting the first training sample and the second training sample into the original model to perform first classification training, updating the parameters of the class predictor and the parameters of the discriminator, and keeping the parameters of the feature extractor unchanged to obtain a first model; the original model is obtained through the first training sample and comprises the feature extractor, the class predictor and the discriminator;
the updating module is also used for inputting the first training sample and the second training sample into the first model to carry out second classification training, updating the parameters of the feature extractor, and keeping the parameters of the class predictor and the parameters of the discriminator unchanged to obtain a second model;
and the determining module is used for determining the trained class predictor as the electronic device picture classification model after meeting the preset model training stopping condition.
Further, in one embodiment, the discriminator comprises:
the first discriminator and the second discriminator are provided with different initial parameters.
Further, in an embodiment, the update module is further configured to:
before inputting the first training sample and the second training sample into the original model to perform the first classification training, input the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and update the parameters of the feature extractor, the class predictor and the discriminator to obtain the original model.
Further, in one embodiment, the preset model training stop condition includes:
the loss function values of the first model and the second model meet preset conditions;
the loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
In a fourth aspect, an embodiment of the present application provides an apparatus for classifying an electronic device picture using an electronic device picture classification model, where the electronic device picture classification model is trained by a training apparatus for the electronic device picture classification model, and the apparatus includes:
the acquisition module is used for acquiring an electronic device picture;
and the classification module is used for inputting the electronic device pictures into the electronic device picture classification model for classification and outputting the electronic device types corresponding to the electronic device pictures.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program implementing the above method when executed by the processor.
In a sixth aspect, an embodiment of the present application provides a computer-readable storage medium, on which an implementation program for information transfer is stored, and when the implementation program is executed by a processor, the method is implemented.
According to the training method, device and equipment for the electronic device picture classification model provided by the embodiment of the application, the electronic device picture classification model is trained with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures. An adversarial training mechanism is adopted during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the embodiments are briefly described below; those skilled in the art can derive other drawings from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a training method for an electronic device picture classification model according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating a method for classifying an electronic device picture using an electronic device picture classification model according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an apparatus for training an electronic device image classification model according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of an apparatus for classifying an electronic device picture using an electronic device picture classification model according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application will be described in detail below, and in order to make objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by illustrating examples thereof.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
When a model trained on a certain product is used at the same station, it performs poorly on another product type. After a factory changes over to a new product, the amount of labeled data available is small, so the model cannot adapt quickly and its recognition accuracy is low; the model then needs to be retrained on the other product before the pictures can be recognized separately, which makes classification inefficient.
In order to solve the problems in the prior art, the embodiment of the application provides a training method, a device and equipment for an electronic device picture classification model. The embodiment of the application trains the electronic device picture classification model with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures, and adopts an adversarial training mechanism during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency. A method for training an electronic device picture classification model provided in an embodiment of the present application is described first below.
Fig. 1 shows a flowchart of a training method for an electronic device picture classification model according to an embodiment of the present application. As shown in fig. 1, the method may include the steps of:
s100, a first training sample and a second training sample are obtained.
The first training sample comprises a plurality of first electronic device pictures marked with class labels, and the second training sample comprises a plurality of second electronic device pictures which are not marked with class labels.
In an application scenario of the liquid crystal panel, the first training sample may correspond to an original product type produced by a site, and the second training sample may correspond to a new product type produced by the site.
And S120, inputting the first training sample and the second training sample into the original model to perform first classification training, updating the parameters of the class predictor and the parameters of the discriminator, and keeping the parameters of the feature extractor unchanged to obtain a first model.
The original model is obtained through the first training sample and comprises a feature extractor, a class predictor and a discriminator. Because the original model is obtained only through the first training sample, using it directly to identify the second training sample gives low identification precision, owing to the difference between the original product type and the new product type; therefore, the second training sample is introduced to continue training on the basis of the original model and improve the model's identification precision for the new product type. The parameters of the class predictor and the discriminator are updated while the parameters of the feature extractor are kept unchanged, so as to increase the difference loss between the discriminators.
And S140, inputting the first training sample and the second training sample into the first model to perform second classification training, updating the parameters of the feature extractor, and keeping the parameters of the class predictor and the parameters of the discriminator unchanged to obtain a second model.
The parameters of the feature extractor are updated while the parameters of the class predictor and the discriminator are kept unchanged, so as to reduce the difference loss between the discriminators and improve the classification precision of the model.
And S160, determining the trained class predictor as the electronic device picture classification model after the preset model training stopping condition is met.
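For illustration only, the alternation of steps S120 to S160 could be organized as the following sketch in PyTorch. This is a minimal sketch, not part of the original disclosure: the optimizer choice and learning rates, the loader names source_loader and target_loader, the helpers set_requires_grad and stopping_condition_met, and the loss helpers first_model_loss and second_model_loss (placeholders for the loss functions detailed later in this description) are all assumptions; G, F, D1 and D2 stand for the feature extractor, the class predictor and the two discriminators (one possible definition is sketched further below).

```python
import itertools
import torch

# Assumed to exist: G (feature extractor), F (class predictor), D1 and D2 (discriminators) as
# torch.nn.Module instances, plus data loaders source_loader (labeled first training sample)
# and target_loader (unlabeled second training sample).
opt_FD = torch.optim.Adam(
    itertools.chain(F.parameters(), D1.parameters(), D2.parameters()), lr=1e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)

def set_requires_grad(module, flag):
    for p in module.parameters():
        p.requires_grad = flag

for (xs, ys), xt in zip(source_loader, target_loader):    # S100: first (labeled) / second (unlabeled) samples
    # S120: first classification training - update F, D1, D2 and keep G fixed,
    # which increases the difference loss between the two discriminators (first model).
    set_requires_grad(G, False)
    loss_first = first_model_loss(G, F, D1, D2, xs, ys, xt)
    opt_FD.zero_grad(); loss_first.backward(); opt_FD.step()

    # S140: second classification training - update G and keep F, D1, D2 fixed,
    # which reduces the difference loss between the two discriminators (second model).
    set_requires_grad(G, True)
    loss_second = second_model_loss(G, F, D1, D2, xs, ys, xt)
    opt_G.zero_grad(); loss_second.backward(); opt_G.step()

    if stopping_condition_met(loss_first, loss_second):    # S160: preset model training stopping condition
        break
# After training stops, the class predictor F (together with G) is used as the picture classification model.
```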
The embodiment of the application trains the electronic device picture classification model with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures, and adopts an adversarial training mechanism during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency.
In one embodiment, the discriminator may include:
the first discriminator and the second discriminator are provided with different initial parameters.
The discriminator is used to identify the electronic device type corresponding to a training sample, and also to identify whether the training sample belongs to the first training sample or the second training sample. By making the discriminators learn not only the electronic device class but also which sample set a training sample comes from, and by setting up two discriminators that compete with each other, the precision of the electronic device picture classification model is improved.
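A minimal sketch of how the feature extractor G, the class predictor F and the two discriminators D1 and D2 could be realized in PyTorch is given below; the network sizes, the module names and the small convolutional backbone are illustrative assumptions rather than the disclosed architecture. Each discriminator outputs 2K probabilities, the first K for the source-domain classes and the last K for the target-domain classes, and D1 and D2 receive different initial parameters simply by being two separately constructed instances.

```python
import torch
import torch.nn as nn

K = 10  # assumed number of electronic device classes

class FeatureExtractor(nn.Module):  # G
    def __init__(self, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim))

    def forward(self, x):
        return self.net(x)

class ClassPredictor(nn.Module):  # F: K class probabilities, f(x) = F(G(x))
    def __init__(self, feat_dim=256, num_classes=K):
        super().__init__()
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, feat):
        return torch.softmax(self.head(feat), dim=1)

class Discriminator(nn.Module):  # D1 / D2: 2K outputs (first K: source classes, last K: target classes)
    def __init__(self, feat_dim=256, num_classes=K):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * num_classes))

    def forward(self, feat):
        return torch.softmax(self.head(feat), dim=1)

G, F = FeatureExtractor(), ClassPredictor()
D1, D2 = Discriminator(), Discriminator()  # two instances, hence different random initial parameters
```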
In one embodiment, before S120, the method may further include:
and S110, inputting the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and updating the parameters of the feature extractor, the class predictor and the discriminator to obtain an original model.
The original model may be selected as a DANN network, and the training stopping condition of the original model may be set such that the loss function value thereof meets a preset condition. The corresponding loss function of the original model may be:
l = l_sc(F) + λ_dsc1·l_dsc(D1) + λ_dsc2·l_dsc(D2)
After passing through the feature extractor G, the output of the class predictor F can be expressed as:
f(x) = F(G(x)) ∈ R
where x denotes an input picture and R denotes the numerical space of the output classes.
l_sc(F) denotes the classification loss of the class predictor F on the source domain data, defined as follows:
l_sc(F) = E_{(x_s, y_s)~P}[ l_CE(f(x_s), y_s) ]
l_CE(f(x), y) = -<y, log f(x)>
E denotes the expectation (first-order origin moment); y denotes the one-dimensional one-hot vector of the true label, with K elements in total, where each position corresponds to a specific electronic device type: the position of the type the data belongs to is 1 and the rest are 0. f(x) denotes the output of the class predictor F, which is also a one-dimensional vector with K elements giving the predicted probability of each class. P denotes the data space of the source domain, x_s denotes source domain data, i.e. a first electronic device picture, and y_s denotes the true class corresponding to the source domain data, i.e. the class label of the first electronic device picture, a one-dimensional vector with K elements. l_dsc(D1) and l_dsc(D2) denote the classification losses of the source domain data on the two discriminators, respectively; taking l_dsc(D1) as an example, its definition is:
l_dsc(D1) = E_{(x_s, y_s)~P}[ l_CE(D1(G(x_s)), [y_s, 0]) ]
The output D1(G(x_s)) of discriminator D1 is a one-dimensional vector with 2K elements: the first K elements correspond to the K classes of the source domain and the last K elements to the K classes of the target domain. 0 denotes a zero vector with K elements. l_dsc(D2) is defined in the same way:
l_dsc(D2) = E_{(x_s, y_s)~P}[ l_CE(D2(G(x_s)), [y_s, 0]) ]
with only the discriminator replaced. λ_dsc1 and λ_dsc2 are hyperparameters and may be chosen as 1.0.
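As a hedged sketch, the original-model (pretraining) loss reconstructed above could be computed as follows, assuming the modules sketched earlier, a batch xs of first electronic device pictures with integer labels ys, and K classes; the helper names ce, one_hot and original_model_loss are illustrative, not from the disclosure.

```python
import torch

def ce(prob, target_prob, eps=1e-8):
    # l_CE(p, y) = -<y, log p>, averaged over the batch; targets may be one-hot or soft vectors.
    return -(target_prob * (prob + eps).log()).sum(dim=1).mean()

def one_hot(labels, num_classes):
    return torch.eye(num_classes, device=labels.device)[labels]

def original_model_loss(G, F, D1, D2, xs, ys, num_classes, lam_dsc1=1.0, lam_dsc2=1.0):
    feat = G(xs)
    ys_1hot = one_hot(ys, num_classes)                              # y_s, K elements
    ys_2k = torch.cat([ys_1hot, torch.zeros_like(ys_1hot)], dim=1)  # [y_s, 0], 2K elements
    l_sc = ce(F(feat), ys_1hot)    # classification loss of F on the source domain data
    l_dsc1 = ce(D1(feat), ys_2k)   # classification loss of the source data on discriminator D1
    l_dsc2 = ce(D2(feat), ys_2k)   # classification loss of the source data on discriminator D2
    return l_sc + lam_dsc1 * l_dsc1 + lam_dsc2 * l_dsc2
```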
In one embodiment, the presetting of the model training stopping condition may include:
the loss function values of the first model and the second model meet preset conditions; the loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
Wherein the loss function of the first model may be:
l = l_F + l_D1 + l_D2 − λ_d·l_d
l_F is the classification loss of the class predictor on the source domain data, i.e. the first electronic device pictures, plus the SSL regularization losses:
l_F = l_sc(F) + λ_svat·l_svat(F) + λ_tvat·l_tvat(F) + λ_te·l_te(F)
l_sc(F) is defined above; l_te(F) represents the information entropy loss of the class predictor on the target domain data, i.e. the second electronic device pictures:
l_te(F) = E_{x_t~Q}[ l_E(f(x_t)) ]
l_E(f(x)) = -∑_k f(x)[k]·log f(x)[k]
l_svat(F) and l_tvat(F) are the virtual adversarial training (VAT) regularization losses on the source domain data and the target domain data, which may take the form:
l_svat(F) = E_{x_s~P}[ max_{|r|≤ε} KL(f(x_s) || f(x_s + r)) ]
l_tvat(F) = E_{x_t~Q}[ max_{|r|≤ε} KL(f(x_t) || f(x_t + r)) ]
λ_svat, λ_tvat and λ_te are hyperparameters; λ_svat and λ_tvat are set to 1.0 and λ_te is set to 0.1. Q denotes the data space of the target domain, r denotes a value in the numerical space R of the output classes, and |r| ≤ ε means that r is a small perturbation.
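The class-predictor loss l_F could then be sketched as below, reusing the ce and one_hot helpers from the previous sketch. The entropy term follows l_E above; the VAT terms are simplified to a single random perturbation instead of the usual power-iteration estimate, and all function names are assumptions.

```python
def entropy_loss(prob, eps=1e-8):
    # l_E(f(x)) = -sum_k f(x)[k] * log f(x)[k], averaged over the batch
    return -(prob * (prob + eps).log()).sum(dim=1).mean()

def vat_loss_simplified(G, F, x, eps_norm=1.0):
    # Simplified VAT term: KL(f(x) || f(x + r)) for a single random perturbation r with |r| <= eps_norm
    # (full VAT would estimate the worst-case r by power iteration).
    with torch.no_grad():
        p_clean = F(G(x))
    r = torch.randn_like(x)
    r = eps_norm * r / (r.flatten(1).norm(dim=1).view(-1, 1, 1, 1) + 1e-8)
    p_pert = F(G(x + r))
    return (p_clean * ((p_clean + 1e-8).log() - (p_pert + 1e-8).log())).sum(dim=1).mean()

def predictor_loss(G, F, xs, ys_1hot, xt, lam_svat=1.0, lam_tvat=1.0, lam_te=0.1):
    l_sc = ce(F(G(xs)), ys_1hot)   # classification loss on the source domain
    l_te = entropy_loss(F(G(xt)))  # entropy loss on the target domain
    return (l_sc
            + lam_svat * vat_loss_simplified(G, F, xs)
            + lam_tvat * vat_loss_simplified(G, F, xt)
            + lam_te * l_te)
```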
l_D1 and l_D2 are the classification losses of the two discriminators on the source domain data and the target domain data:
l_D1 = λ_dsc1·l_dsc(D1) + λ_dtc1·l_dtc(D1)
l_D2 = λ_dsc2·l_dsc(D2) + λ_dtc2·l_dtc(D2)
λ_dsc1, λ_dtc1, λ_dsc2 and λ_dtc2 are hyperparameters; λ_dsc1 and λ_dsc2 are set to 1.0 as above, and λ_dtc1 and λ_dtc2 are set to 1.0.
Taking discriminator D1 as an example, l_dsc(D1) is defined above, and l_dtc(D1) is the classification loss of the discriminator on the target domain data:
l_dtc(D1) = E_{x_t~Q}[ l_CE(D1(G(x_t)), [0, ŷ_t]) ]
where ŷ_t is the "false label" of the target domain data introduced below. l_d represents the difference between the two discriminators, which may be measured as:
l_d = E_{x_t~Q}[ d(D1(G(x_t)), D2(G(x_t))) ]
d(p1, p2) = (1/2K)·∑_k |p1[k] − p2[k]|
λ_d is a hyperparameter, set to 1.0.
For given target domain data x_t, the true label is unknown, so a "false label" is used, namely the predicted value of the class predictor F for the data x_t:
ŷ_t = f(x_t) = F(G(x_t))
Note that one-hot coding is not used here to represent the "false label" of the target domain: if the one-hot of the prediction output were used, the target domain data would collapse onto the most dominant class, meaning the model would over-identify the dominant class and the accuracy of the remaining classes would suffer. The soft output value of the class predictor F is therefore used directly, and the corresponding 2K-dimensional label used on the discriminators is:
[0, ŷ_t]
where 0 denotes a one-dimensional zero vector containing K elements.
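Putting the pieces together, the discriminator losses l_D1 and l_D2 and the discrepancy term l_d could be sketched as follows, reusing the ce helper from the earlier sketch; the mean-absolute-difference form of l_d follows the hedged reconstruction given above, and all names are illustrative.

```python
def discriminator_losses(G, F, D1, D2, xs, ys_1hot, xt,
                         lam_dsc=1.0, lam_dtc=1.0, lam_d=1.0):
    feat_s, feat_t = G(xs), G(xt)
    ys_2k = torch.cat([ys_1hot, torch.zeros_like(ys_1hot)], dim=1)  # [y_s, 0]
    with torch.no_grad():
        pseudo = F(feat_t)                                          # soft "false label" y_hat_t = f(x_t)
    yt_2k = torch.cat([torch.zeros_like(pseudo), pseudo], dim=1)    # [0, y_hat_t]

    l_D1 = lam_dsc * ce(D1(feat_s), ys_2k) + lam_dtc * ce(D1(feat_t), yt_2k)
    l_D2 = lam_dsc * ce(D2(feat_s), ys_2k) + lam_dtc * ce(D2(feat_t), yt_2k)
    l_d = (D1(feat_t) - D2(feat_t)).abs().mean()                    # difference between the two discriminators
    return l_D1, l_D2, lam_d * l_d
```

With these helpers, a hypothetical first_model_loss for the training-loop sketch given earlier could combine predictor_loss and discriminator_losses into the first-model objective l_F + l_D1 + l_D2 − λ_d·l_d, i.e. the step that increases the discrepancy between the two discriminators.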
The loss function of the second model may be:
l = λ_dsa1·l_dsa1(G) + λ_dta1·l_dta1(G) + λ_dsa2·l_dsa2(G) + λ_dta2·l_dta2(G) + λ_d·l_d
l_dsa1(G) and l_dsa2(G) respectively represent the source domain alignment losses of the two discriminators, and l_dta1(G) and l_dta2(G) respectively represent their target domain alignment losses. The source domain alignment loss and the target domain alignment loss are described below taking discriminator D1 as an example; the definitions for discriminator D2 are analogous. The alignment losses may be defined by swapping the source half and the target half of the 2K-dimensional discriminator labels, so that the feature extractor is trained to confuse the discriminators:
l_dsa1(G) = E_{(x_s, y_s)~P}[ l_CE(D1(G(x_s)), [0, y_s]) ]
l_dta1(G) = E_{x_t~Q}[ l_CE(D1(G(x_t)), [ŷ_t, 0]) ]
ŷ_t is the "false label" of the target domain data and y_s is the true label of the source domain data; λ_dsa1, λ_dta1, λ_dsa2 and λ_dta2 are hyperparameters, all set to 0.1.
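Under the same assumptions, and following the hedged reconstruction of the alignment losses above, the second-model loss used to update the feature extractor could be sketched as follows; it could serve as the second_model_loss placeholder in the earlier training-loop sketch.

```python
def feature_extractor_loss(G, F, D1, D2, xs, ys_1hot, xt,
                           lam_align=0.1, lam_d=1.0):
    feat_s, feat_t = G(xs), G(xt)
    with torch.no_grad():
        pseudo = F(feat_t)                                               # "false label" of the target data
    # Swapped 2K-dimensional targets: source features pushed toward the target half and vice versa,
    # so that the feature extractor learns domain-aligned features that confuse the discriminators.
    ys_swapped = torch.cat([torch.zeros_like(ys_1hot), ys_1hot], dim=1)  # [0, y_s]
    yt_swapped = torch.cat([pseudo, torch.zeros_like(pseudo)], dim=1)    # [y_hat_t, 0]

    loss = 0.0
    for D in (D1, D2):
        l_dsa = ce(D(feat_s), ys_swapped)   # source domain alignment loss
        l_dta = ce(D(feat_t), yt_swapped)   # target domain alignment loss
        loss = loss + lam_align * (l_dsa + l_dta)
    l_d = (D1(feat_t) - D2(feat_t)).abs().mean()
    return loss + lam_d * l_d               # also shrink the discrepancy between D1 and D2
```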
The embodiment of the application trains the electronic device picture classification model with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures, and adopts an adversarial training mechanism during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency. That is to say, when a new product is put into production, a classification model for the new product can be trained without labeling its pictures, so the product changeover can be handled quickly.
A training method of an electronic device picture classification model provided in an embodiment of the present application has been introduced above. Based on the trained electronic device picture classification model, an embodiment of the present application further provides a method for classifying an electronic device picture by using the electronic device picture classification model, where the model is trained by the training method described above. Fig. 2 shows a schematic flow chart of the method for classifying an electronic device picture by using the electronic device picture classification model. As shown in fig. 2, the method may include the following steps:
s210, obtaining an electronic device picture.
S220, inputting the electronic device picture into the electronic device picture classification model for classification, and outputting the electronic device type corresponding to the electronic device picture.
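A short usage sketch of S210 and S220 is given below; the preprocessing transform, the input size and the class_names list are assumptions, and G and F denote the trained feature extractor and class predictor.

```python
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),   # assumed input size
    transforms.ToTensor(),
])

def classify_picture(path, G, F, class_names):
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)  # S210: acquire the electronic device picture
    G.eval(); F.eval()
    with torch.no_grad():
        probs = F(G(img))                                           # S220: classify with the trained model
    return class_names[probs.argmax(dim=1).item()]                  # corresponding electronic device type
```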
The electronic device picture classification model obtained by training in the embodiment of the application is used for classifying the electronic device pictures, and the classification efficiency and precision are high.
Fig. 1-2 illustrate a method provided by an embodiment of the present application, and the following describes an apparatus provided by an embodiment of the present application with reference to fig. 3-5.
Fig. 3 is a schematic structural diagram of a training apparatus for an electronic device picture classification model according to an embodiment of the present application, and each module in the apparatus shown in fig. 3 has a function of implementing each step in fig. 1, and can achieve its corresponding technical effect. As shown in fig. 3, the apparatus may include:
an obtaining module 310 is configured to obtain a first training sample and a second training sample.
The first training sample may include a plurality of first electronic device pictures labeled with category labels, and the second training sample may include a plurality of second electronic device pictures not labeled with category labels.
And the updating module 320 is configured to input the first training sample and the second training sample into the original model to perform the first classification training, update the parameters of the class predictor and the parameters of the discriminator, and keep the parameters of the feature extractor unchanged to obtain the first model.
The original model is derived from the first training sample and may include a feature extractor, a class predictor, and a discriminator.
The updating module 320 is further configured to input the first training sample and the second training sample into the first model to perform a second classification training, update the parameter of the feature extractor, and keep the parameter of the class predictor and the parameter of the discriminator unchanged to obtain a second model.
The determining module 330 is configured to determine the trained class predictor as an electronic device picture classification model after a preset model training stopping condition is met.
The embodiment of the application trains the electronic device picture classification model with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures, and adopts an adversarial training mechanism during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency.
In one embodiment, the discriminator may include:
the first discriminator and the second discriminator are provided with different initial parameters.
In an embodiment, the update module 320 may be further specifically configured to:
before inputting the first training sample and the second training sample into the original model to perform the first classification training, input the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and update the parameters of the feature extractor, the class predictor and the discriminator to obtain the original model.
In one embodiment, the presetting of the model training stopping condition may include:
and the loss function values of the first model and the second model accord with preset conditions.
The loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
The embodiment of the application trains the electronic device picture classification model with a first training sample, which comprises first electronic device pictures and the electronic device class labels corresponding to them, and a second training sample, which comprises unlabeled second electronic device pictures, and adopts an adversarial training mechanism during training: first, the parameters of the class predictor and the discriminators are updated while the parameters of the feature extractor are kept unchanged, which increases the difference loss between the discriminators; then, the parameters of the feature extractor are updated while the parameters of the class predictor and the discriminators are kept unchanged, which reduces the difference loss between the discriminators and improves the classification precision of the model. Moreover, the second training sample does not need to be labeled, which saves labeling time and improves model training efficiency. That is to say, when a new product is put into production, a classification model for the new product can be trained without labeling its pictures, so the product changeover can be handled quickly.
Fig. 4 is a schematic structural diagram of an apparatus for classifying an electronic device picture using an electronic device picture classification model according to an embodiment of the present application, where the electronic device picture classification model is trained by the training apparatus described above. Each module in the apparatus shown in fig. 4 has the function of implementing a corresponding step in fig. 2 and can achieve its corresponding technical effect. As shown in fig. 4, the apparatus may include:
the obtaining module 410 is used for obtaining an electronic device picture.
The classifying module 420 is configured to input the electronic device picture into the electronic device picture classifying model for classification, and output an electronic device type corresponding to the electronic device picture.
The electronic device picture classification model obtained by training in the embodiment of the application is used for classifying the electronic device pictures, and the classification efficiency and precision are high.
Fig. 5 shows a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 5, the apparatus may include a processor 501 and a memory 502 storing computer program instructions.
Specifically, the processor 501 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
Memory 502 may include mass storage for data or instructions. By way of example, and not limitation, memory 502 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. In one example, memory 502 can include removable or non-removable (or fixed) media, or memory 502 is non-volatile solid-state memory. The memory 502 may be internal or external to the electronic device.
In one example, the memory 502 may be a read-only memory (ROM). In one example, the ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory, or a combination of two or more of these.
The processor 501 reads and executes the computer program instructions stored in the memory 502 to implement the method in the embodiment shown in fig. 1-2, and achieve the corresponding technical effect achieved by the embodiment shown in fig. 1-2 executing the method, which is not described herein again for brevity.
In one example, the electronic device can also include a communication interface 503 and a bus 510. As shown in fig. 5, the processor 501, the memory 502, and the communication interface 503 are connected via a bus 510 to complete communication therebetween.
The communication interface 503 is mainly used for implementing communication between modules, apparatuses, units and/or devices in the embodiments of the present application.
Bus 510 comprises hardware, software, or both coupling the components of the electronic device to each other. By way of example, and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front-Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of these. Bus 510 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
The electronic device may perform the method in the embodiments of the present application, thereby achieving the corresponding technical effects of the methods described in fig. 1-2.
In addition, in combination with the methods in the above embodiments, an embodiment of the present application may be implemented by providing a computer storage medium. The computer storage medium has computer program instructions stored thereon; when the computer program instructions are executed by a processor, any of the methods in the above embodiments is implemented.
It is to be understood that the present application is not limited to the particular arrangements and instrumentality described above and shown in the attached drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications, and additions or change the order between the steps after comprehending the spirit of the present application.
The functional blocks shown in the above structural block diagrams may be implemented as hardware, software, firmware, or a combination thereof. When implemented in hardware, they may be, for example, an electronic circuit, an application-specific integrated circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted over a transmission medium or communication link by a data signal carried in a carrier wave. A "machine-readable medium" may include any medium that can store or transfer information. Examples of a machine-readable medium include electronic circuits, semiconductor memory devices, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, radio frequency (RF) links, and so forth. The code segments may be downloaded via computer networks such as the Internet or an intranet.
It should also be noted that the exemplary embodiments mentioned in this application describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Aspects of the present application are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware for performing the specified functions or acts, or combinations of special purpose hardware and computer instructions.
As described above, only the specific embodiments of the present application are provided, and it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the module and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It should be understood that the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the present application, and these modifications or substitutions should be covered within the scope of the present application.

Claims (12)

1. A training method for an electronic device picture classification model is characterized by comprising the following steps:
acquiring a first training sample and a second training sample, wherein the first training sample comprises a plurality of first electronic device pictures marked with class labels, and the second training sample comprises a plurality of second electronic device pictures not marked with the class labels;
inputting the first training sample and the second training sample into an original model to perform first classification training, updating parameters of a class predictor and parameters of a discriminator, and keeping the parameters of a feature extractor unchanged to obtain a first model; the original model is obtained through the first training sample, and comprises the feature extractor, the class predictor and the discriminator;
inputting the first training sample and the second training sample into the first model to perform second classification training, updating the parameters of the feature extractor, and keeping the parameters of the class predictor and the parameters of the discriminator unchanged to obtain a second model;
and when a preset model training stopping condition is met, determining the trained class predictor as the electronic device picture classification model.
2. The method of training an electronic device picture classification model of claim 1, wherein the discriminator comprises:
a first discriminator and a second discriminator, the first discriminator and the second discriminator being provided with different initial parameters.
3. The method for training the image classification model of the electronic device according to claim 1, wherein before the first training sample and the second training sample are input into the original model for the first classification training, the parameters of the class predictor and the parameters of the discriminator are updated, and the parameters of the feature extractor are kept unchanged to obtain the first model, the method further comprises:
and inputting the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and updating the parameters of the feature extractor, the class predictor and the discriminator to obtain the original model.
4. The method for training the image classification model of the electronic device as claimed in claim 1, wherein the preset model training stopping condition includes:
the loss function values of the first model and the second model meet preset conditions;
the loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
5. A method of classifying an electronic device picture using an electronic device picture classification model, the electronic device picture classification model being trained by the method of claim 1, the method comprising:
acquiring an electronic device picture;
and inputting the electronic device picture into the electronic device picture classification model for classification, and outputting the electronic device type corresponding to the electronic device picture.
6. A training apparatus for an electronic device picture classification model, comprising:
the acquisition module is used for acquiring a first training sample and a second training sample, wherein the first training sample comprises a plurality of first electronic device pictures marked with class labels, and the second training sample comprises a plurality of second electronic device pictures not marked with the class labels;
the updating module is used for inputting the first training sample and the second training sample into an original model to perform first classification training, updating the parameters of the class predictor and the parameters of the discriminator, and keeping the parameters of the feature extractor unchanged to obtain a first model; the original model is obtained through the first training sample, and comprises the feature extractor, the class predictor and the discriminator;
the updating module is further configured to input the first training sample and the second training sample into the first model to perform second classification training, update the parameter of the feature extractor, and keep the parameter of the class predictor and the parameter of the discriminator unchanged to obtain a second model;
and the determining module is used for determining the trained class predictor as the electronic device picture classification model after meeting the preset model training stopping condition.
7. The apparatus for training an electronic device picture classification model according to claim 6, wherein the discriminator comprises:
a first discriminator and a second discriminator, the first discriminator and the second discriminator being provided with different initial parameters.
8. The apparatus for training an electronic device picture classification model as claimed in claim 6, wherein the updating module is further configured to:
before inputting the first training sample and the second training sample into an original model to perform first classification training, updating parameters of a class predictor and parameters of a discriminator, keeping parameters of a feature extractor unchanged to obtain a first model, inputting the first training sample into the feature extractor, the class predictor and the discriminator to perform original classification training, and updating the parameters of the feature extractor, the class predictor and the discriminator to obtain the original model.
9. The apparatus for training an electronic device picture classification model as claimed in claim 6, wherein the preset model training stop condition comprises:
the loss function values of the first model and the second model meet preset conditions;
the loss function value is obtained by comparing the output result of the class predictor with the output result of the discriminator.
10. An apparatus for classifying an electronic device picture using an electronic device picture classification model, wherein the electronic device picture classification model is trained by the apparatus of claim 6, the apparatus comprising:
the acquisition module is used for acquiring an electronic device picture;
and the classification module is used for inputting the electronic device picture into the electronic device picture classification model for classification and outputting the electronic device type corresponding to the electronic device picture.
11. An electronic device, comprising: memory, processor and computer program stored on the memory and executable on the processor, which when executed by the processor implements the method of any one of claims 1 to 5.
12. A computer-readable storage medium, on which an implementation program of information transfer is stored, which when executed by a processor implements the method of any one of claims 1 to 5.
CN202110827473.9A 2021-07-21 2021-07-21 Training method, device and equipment for electronic device picture classification model Pending CN113673570A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110827473.9A CN113673570A (en) 2021-07-21 2021-07-21 Training method, device and equipment for electronic device picture classification model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110827473.9A CN113673570A (en) 2021-07-21 2021-07-21 Training method, device and equipment for electronic device picture classification model

Publications (1)

Publication Number Publication Date
CN113673570A (en) 2021-11-19

Family

ID=78540002

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110827473.9A Pending CN113673570A (en) 2021-07-21 2021-07-21 Training method, device and equipment for electronic device picture classification model

Country Status (1)

Country Link
CN (1) CN113673570A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062753A (en) * 2017-12-29 2018-05-22 重庆理工大学 The adaptive brain tumor semantic segmentation method in unsupervised domain based on depth confrontation study
CN109902798A (en) * 2018-05-31 2019-06-18 华为技术有限公司 The training method and device of deep neural network
CN110348484A (en) * 2019-06-10 2019-10-18 天津大学 A method of classify for polarity electronic device
US20190354801A1 (en) * 2018-05-16 2019-11-21 Nec Laboratories America, Inc. Unsupervised cross-domain distance metric adaptation with feature transfer network
CN111835784A (en) * 2020-07-22 2020-10-27 苏州思必驰信息科技有限公司 Data generalization method and system for replay attack detection system
US10839269B1 (en) * 2020-03-20 2020-11-17 King Abdulaziz University System for fast and accurate visual domain adaptation
CN111968666A (en) * 2020-08-20 2020-11-20 南京工程学院 Hearing aid voice enhancement method based on depth domain self-adaptive network
CN111985554A (en) * 2020-08-18 2020-11-24 创新奇智(西安)科技有限公司 Model training method, bracelet identification method and corresponding device
WO2020238734A1 (en) * 2019-05-27 2020-12-03 腾讯科技(深圳)有限公司 Image segmentation model training method and apparatus, computer device, and storage medium
CN112215255A (en) * 2020-09-08 2021-01-12 深圳大学 Training method of target detection model, target detection method and terminal equipment
CN112801918A (en) * 2021-03-11 2021-05-14 苏州科达科技股份有限公司 Training method of image enhancement model, image enhancement method and electronic equipment
CN112926642A (en) * 2021-02-22 2021-06-08 山东大学 Multi-source domain self-adaptive intelligent mechanical fault diagnosis method and system
CN113076994A (en) * 2021-03-31 2021-07-06 南京邮电大学 Open-set domain self-adaptive image classification method and system

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062753A (en) * 2017-12-29 2018-05-22 重庆理工大学 The adaptive brain tumor semantic segmentation method in unsupervised domain based on depth confrontation study
US20190354801A1 (en) * 2018-05-16 2019-11-21 Nec Laboratories America, Inc. Unsupervised cross-domain distance metric adaptation with feature transfer network
US20210012198A1 (en) * 2018-05-31 2021-01-14 Huawei Technologies Co., Ltd. Method for training deep neural network and apparatus
CN109902798A (en) * 2018-05-31 2019-06-18 华为技术有限公司 The training method and device of deep neural network
WO2020238734A1 (en) * 2019-05-27 2020-12-03 腾讯科技(深圳)有限公司 Image segmentation model training method and apparatus, computer device, and storage medium
CN110348484A (en) * 2019-06-10 2019-10-18 天津大学 A method of classify for polarity electronic device
US10839269B1 (en) * 2020-03-20 2020-11-17 King Abdulaziz University System for fast and accurate visual domain adaptation
CN111835784A (en) * 2020-07-22 2020-10-27 苏州思必驰信息科技有限公司 Data generalization method and system for replay attack detection system
CN111985554A (en) * 2020-08-18 2020-11-24 创新奇智(西安)科技有限公司 Model training method, bracelet identification method and corresponding device
CN111968666A (en) * 2020-08-20 2020-11-20 南京工程学院 Hearing aid voice enhancement method based on depth domain self-adaptive network
CN112215255A (en) * 2020-09-08 2021-01-12 深圳大学 Training method of target detection model, target detection method and terminal equipment
CN112926642A (en) * 2021-02-22 2021-06-08 山东大学 Multi-source domain self-adaptive intelligent mechanical fault diagnosis method and system
CN112801918A (en) * 2021-03-11 2021-05-14 苏州科达科技股份有限公司 Training method of image enhancement model, image enhancement method and electronic equipment
CN113076994A (en) * 2021-03-31 2021-07-06 南京邮电大学 Open-set domain self-adaptive image classification method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HAOTIAN WANG;WENJING YANG;ZHIPENG LIN;YUE YU: ""TMDA: Task-Specific Multi-source Domain Adaptation via Clustering Embedded Adversarial Training"", 《2019 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM)》, 30 January 2020 (2020-01-30) *
MARKUS WULFMEIER; ALEX BEWLEY; INGMAR POSNER: ""Addressing appearance change in outdoor robotics with adversarial domain adaptation"", 《2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS)》, 14 December 2017 (2017-12-14) *
WEI XIA;JING HUANG;JOHN H.L. HANSEN: ""Cross-lingual Text-independent Speaker Verification Using Unsupervised Adversarial Discriminative Domain Adaptation"", 《ICASSP 2019 - 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)》, 17 April 2019 (2019-04-17) *
孙冬梅;张飞飞;毛启容: "Label-guided generative adversarial network domain adaptation method for facial expression recognition", Computer Engineering (计算机工程), no. 05, 15 May 2020 (2020-05-15) *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination