CN109389220A - Processing method, device, electronic equipment and the storage medium of neural network model - Google Patents
- Publication number
- CN109389220A CN109389220A CN201811143319.4A CN201811143319A CN109389220A CN 109389220 A CN109389220 A CN 109389220A CN 201811143319 A CN201811143319 A CN 201811143319A CN 109389220 A CN109389220 A CN 109389220A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The application relates to a processing method, device, electronic equipment and storage medium for a neural network model. In the embodiments of the application, each sample data has exactly one annotation label; that is, the sample data form a single-label dataset. By the method of the embodiments of the application, a target multi-label classification model can be trained using a single-label dataset, so that when the labels of data that inherently carry multiple labels need to be predicted, the multiple labels of the data can be obtained from the target multi-label classification model. Compared with the related art, the application does not require staff to manually annotate, one by one, whether sample data belongs to each classification label in the label system, which reduces labor cost and improves efficiency.
Description
Technical field
This application relates to the field of computer technology, and more particularly to a processing method, device, electronic equipment and storage medium for a neural network model.
Background technique
Currently, deep learning has been widely applied in related fields such as video and image processing, speech recognition and natural language processing. Convolutional neural networks, an important branch of deep learning, possess strong fitting ability and end-to-end global optimization ability, so the prediction accuracy of image classification tasks has been substantially improved since convolutional neural networks were applied.
A single-label classification method based on convolutional neural networks outputs one classification label for an input image. However, in real application scenarios a picture frequently contains multiple objects, and it is desirable to output multiple classification labels for the picture. For example, if an image contains a cat, a dog and a bottle, it is desirable to output the three classification labels cat, dog and bottle for the image, but the single-label classification method based on convolutional neural networks cannot solve this problem; staff can only observe the image manually and annotate it manually to obtain the multiple classification labels of the image.
However, when there are many classification labels in the label system, staff need to manually annotate, one by one, whether an image belongs to each classification label in the label system, which takes a long time and incurs high labor cost.
Summary of the invention
To overcome the problems in the related art, the application provides a processing method, device, electronic equipment and storage medium for a neural network model.
According to a first aspect of the embodiments of the application, a processing method of a neural network model is provided, the method comprising:
obtaining sample data, an annotation label of the sample data, and a preset label classification model;
training the preset label classification model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
determining, using the weak multi-label classification model and the annotation label of the sample data, first probabilities that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
training the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
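At a high level, the four steps above can be sketched as follows. The helper names (`train`, `predict_second_probabilities`, `merge_probabilities`) and their stub bodies are hypothetical stand-ins for the procedures detailed in the embodiments, not functions defined by the application:

```python
# Sketch of the first-aspect pipeline (hypothetical helper names; the
# concrete training and merging procedures are detailed in the embodiments).

def train(model, samples, targets):
    """Stand-in for one training stage; returns an 'updated' model."""
    return {"base": model, "targets": targets}

def predict_second_probabilities(model, sample, preset_labels):
    """Stand-in for the weak model's per-label predictions."""
    return {label: 0.5 for label in preset_labels}  # placeholder scores

def merge_probabilities(second_probs, annotation_label, third_prob, threshold):
    """Threshold/replace step producing the first probabilities."""
    kept = {l: p for l, p in second_probs.items() if p >= threshold}
    kept[annotation_label] = third_prob
    return kept

def build_target_model(samples, annotations, preset_model, preset_labels,
                       third_prob=1.0, threshold=0.5):
    # Step 2: train the preset model on the single-label data.
    weak_model = train(preset_model, samples, annotations)
    # Step 3: first probabilities for every sample.
    first_probs = [
        merge_probabilities(
            predict_second_probabilities(weak_model, s, preset_labels),
            annotations[i], third_prob, threshold)
        for i, s in enumerate(samples)
    ]
    # Step 4: retrain the weak model on the merged targets.
    return train(weak_model, samples, first_probs)
```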
In an optional implementation, the determining, using the weak multi-label classification model and the annotation label of the sample data, the first probabilities that the sample data belongs to each preset label in the preset label set comprises:
predicting, using the weak multi-label classification model, second probabilities that the sample data belongs to each preset label in the preset label set;
setting the probability that the sample data belongs to the annotation label as a third probability;
obtaining, from the second probabilities and the third probability, the first probabilities that the sample data belongs to each preset label.
In an optional implementation, the merging of the second probabilities with the third probability to obtain the first probabilities that the sample data belongs to each preset label comprises:
in the preset label set, replacing, with the third probability, the second probability that the sample data belongs to the preset label identical to the annotation label;
determining the post-replacement probabilities that the sample data belongs to each preset label as the first probabilities.
In an optional implementation, the replacing, with the third probability, the second probability that the sample data belongs to the preset label identical to the annotation label comprises:
removing, from the multiple second probabilities, the second probabilities lower than a preset threshold;
if the preset labels corresponding to the remaining second probabilities include the annotation label, replacing, with the third probability, the second probability that the sample data belongs to the annotation label.
In an optional implementation, the determining the post-replacement probabilities that the sample data belongs to each preset label as the first probabilities comprises:
determining the remaining post-replacement second probabilities as the first probabilities;
alternatively, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, determining the remaining second probabilities and the third probability as the first probabilities.
According to a second aspect of the embodiments of the application, a processing device of a neural network model is provided, the device comprising:
an obtaining module, configured to obtain sample data, an annotation label of the sample data, and a preset label classification model;
a first training module, configured to train the preset label classification model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
a determining module, configured to determine, using the weak multi-label classification model and the annotation label of the sample data, first probabilities that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
a second training module, configured to train the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
In an optional implementation, the determining module comprises:
a predicting unit, configured to predict, using the weak multi-label classification model, second probabilities that the sample data belongs to each preset label in the preset label set;
a setting unit, configured to set the probability that the sample data belongs to the annotation label as a third probability;
a merging unit, configured to obtain, from the second probabilities and the third probability, the first probabilities that the sample data belongs to each preset label.
In an optional implementation, the merging unit comprises:
a replacing subunit, configured to replace, in the preset label set, with the third probability, the second probability that the sample data belongs to the preset label identical to the annotation label;
a determining subunit, configured to determine the post-replacement probabilities that the sample data belongs to each preset label as the first probabilities.
In an optional implementation, the replacing subunit is specifically configured to: remove, from the multiple second probabilities, the second probabilities lower than a preset threshold; and if the preset labels corresponding to the remaining second probabilities include the annotation label, replace, with the third probability, the second probability that the sample data belongs to the annotation label.
In an optional implementation, the determining subunit is specifically configured to: determine the remaining post-replacement second probabilities as the first probabilities; or, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, determine the remaining second probabilities and the third probability as the first probabilities.
According to a third aspect of the embodiments of the application, electronic equipment is provided, the electronic equipment comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the processing method of the neural network model described in the first aspect.
According to a fourth aspect of the embodiments of the application, a non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to execute the processing method of the neural network model described in the first aspect.
According to a fifth aspect of the embodiments of the application, a computer program product is provided; when the instructions in the computer program product are executed by a processor of a terminal, the terminal is enabled to execute the processing method of the neural network model described in the first aspect.
The technical solutions provided by the embodiments of the application can include the following beneficial effects:
In the embodiments of the application, sample data, the annotation label of the sample data, and a preset label classification model are obtained; the preset label classification model is trained using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model; first probabilities that the sample data belongs to each preset label in a preset label set are determined using the weak multi-label classification model and the annotation label of the sample data, the preset label set including the annotation label; and the weak multi-label classification model is trained using the sample data and the first probabilities to obtain a target multi-label classification model.
In the embodiments of the application, each sample data has exactly one annotation label; that is, the sample data form a single-label dataset. By the method of the embodiments of the application, a target multi-label classification model can be trained using a single-label dataset, so that when the labels of data that inherently carry multiple labels need to be predicted, the multiple labels of the data can be obtained from the target multi-label classification model. Compared with the related art, the application does not require staff to manually annotate, one by one, whether sample data belongs to each classification label in the label system, which reduces labor cost and improves efficiency.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the application.
Detailed description of the invention
The accompanying drawings herein are incorporated into and form part of this specification; they show embodiments consistent with the application and, together with the specification, serve to explain the principles of the application.
Fig. 1 is a flowchart of a processing method of a neural network model according to an exemplary embodiment.
Fig. 2 is a block diagram of a processing device of a neural network model according to an exemplary embodiment.
Fig. 3 is a block diagram of electronic equipment according to an exemplary embodiment.
Fig. 4 is a block diagram of electronic equipment according to an exemplary embodiment.
Specific embodiment
Exemplary embodiments are described in detail here, and examples thereof are illustrated in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numbers in different drawings indicate the same or similar elements. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the application; on the contrary, they are merely examples of devices and methods consistent with some aspects of the application as detailed in the appended claims.
Fig. 1 is a flowchart of a processing method of a neural network model according to an exemplary embodiment. As shown in Fig. 1, the method includes the following steps.
In step S101, sample data, the annotation label of the sample data, and a preset label classification model are obtained.
In the embodiments of the application, a technician can set one or more sample data locally in advance. For any one sample data, the technician can annotate its annotation label, then compose the sample data and its annotation label into a corresponding table entry and store it in a correspondence between sample data and annotation labels; the same operation is performed for each of the other sample data.
In this way, in the embodiments of the application, the stored sample data can be obtained from the correspondence between sample data and annotation labels, the annotation label corresponding to the obtained sample data can be looked up in that correspondence, and the found label is taken as the annotation label of the obtained sample data.
Secondly, a technician can set a preset label classification model, such as inception-v3, locally in advance, so that in this step the preset label classification model set locally in advance can be directly obtained.
In step S102, the preset label classification model is trained using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model.
In the embodiments of the application, training the preset label classification model generally requires multiple rounds, and after each round of training the loss function of the resulting latest classification model needs to be updated.
Wherein, the loss function may refer to the following formula:
Loss_n = p · log(p_n′) + α · (1 − p) · log(1 − p_n′),  p ∈ {0, 1}
In the above formula, p denotes the probability that the sample data belongs to the annotation label; since the annotation label of the sample data is annotated manually by a technician, this probability is often high, for example 100%.
p_n′ denotes the probability, predicted using the latest classification model obtained by each round of training, that the sample data belongs to each preset label; n is the number of the preset label.
#classes denotes the number of distinct annotation labels remaining after deduplicating the annotation labels of the sample data.
A technician can observe in advance the preset labels that each sample data may belong to; η denotes the average of the numbers of preset labels that each sample data may belong to.
iters denotes the number of training rounds required during training of the preset label classification model, and iters_done denotes the number of training rounds already completed during that training.
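As an illustration, the per-label loss above can be implemented directly. The sketch below writes it as a negative log-likelihood so that training minimizes it, and treats α as a given weight; the quantities #classes, η, iters and iters_done presumably enter the computation of α, but since the text does not reproduce that expression, α is left as a parameter here:

```python
import math

def weighted_bce_loss(p_true, p_pred, alpha):
    """Per-sample loss of the patent's per-label form, written as a
    negative log-likelihood (smaller is better):
        Loss_n = -[p*log(p_n') + alpha*(1-p)*log(1-p_n')],  p in {0, 1}
    p_true: list of 0/1 targets (p), one per preset label.
    p_pred: list of predicted probabilities (p_n').
    alpha:  weighting factor on the negative (p = 0) labels.
    """
    eps = 1e-12  # numerical guard against log(0)
    total = 0.0
    for p, q in zip(p_true, p_pred):
        total -= p * math.log(q + eps) + alpha * (1 - p) * math.log(1 - q + eps)
    return total

# A sample annotated "cat" among the preset labels (cat, dog, tiger, lion):
targets = [1, 0, 0, 0]
preds = [0.9, 0.2, 0.4, 0.1]
loss = weighted_bce_loss(targets, preds, alpha=0.25)
```

Down-weighting the p = 0 terms with α < 1 fits the single-label setting: a label absent from the annotation is not necessarily a true negative, so its penalty is softened.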
In step S103, the first probabilities that the sample data belongs to each preset label in a preset label set are determined using the weak multi-label classification model and the annotation label of the sample data; wherein the preset label set includes the annotation label.
This step can be realized by the following process:
1031. The second probabilities that the sample data belongs to each preset label in the preset label set are predicted using the weak multi-label classification model.
The sample data is input into the weak multi-label classification model, which then outputs the second probabilities that the sample data belongs to each preset label.
1032. The probability that the sample data belongs to the annotation label is set as a third probability.
In the embodiments of the application, the annotation label of the sample data is annotated manually in advance, and the accuracy of manual annotation is often high; thus the possibility that the sample data belongs to the annotation label is often large, and accordingly the probability that the sample data belongs to the annotation label is often large as well, for example 100%.
1033. The first probabilities that the sample data belongs to each preset label are obtained from the second probabilities and the third probability.
For example, the second probabilities are merged with the third probability to obtain the first probabilities that the sample data belongs to each preset label.
This can be realized by the following process:
11) In the preset label set, the second probability that the sample data belongs to the preset label identical to the annotation label is replaced with the third probability.
In the embodiments of the application, there are multiple preset labels, and the annotation labels are often only a small part of the preset labels.
In this way, for any one annotation label of the sample data: the third probability that the sample data belongs to the annotation label is often greater than the second probability, predicted by the weak multi-label classification model, that the sample data belongs to the preset label identical to that annotation label, and the annotation label is annotated manually in advance with often high accuracy. Therefore, in order to improve the accuracy of the first probability that the sample data belongs to the preset label identical to the annotation label, and in turn the classification accuracy of the trained target multi-label classification model, the preset label identical to the annotation label can be looked up among the preset labels, and the second probability that the sample data belongs to that preset label can then be replaced with the third probability. The same operation is performed for each other annotation label.
In another embodiment, for any one preset label: if the second probability, predicted by the weak multi-label classification model, that the sample data belongs to the preset label is lower than a preset threshold, the sample data is deemed not to belong to that preset label; if the second probability is higher than or equal to the preset threshold, the sample data is deemed to belong to that preset label.
In this way, in order to improve the classification accuracy of the trained target multi-label classification model, the second probabilities lower than the preset threshold can be removed from the multiple second probabilities; the preset threshold can be 50%, 51%, 52%, etc.
If the preset labels corresponding to the remaining second probabilities include the annotation label, the second probability that the sample data belongs to the annotation label is replaced with the third probability.
For example, the annotation label of the sample data is cat, the third probability that the sample belongs to cat is 100%, the preset labels include cat, dog, tiger and lion, the second probabilities predicted by the weak multi-label classification model are 75% for cat, 23% for dog, 55% for tiger and 88% for lion, and the preset threshold is 50%.
The second probability 23% that the sample data belongs to dog is less than the preset threshold 50%, so it can be removed; the remaining second probabilities are 75% for cat, 55% for tiger and 88% for lion. The second probability 75% for cat is then replaced with the third probability 100% that the sample belongs to cat.
12) The post-replacement probabilities that the sample data belongs to each preset label are determined as the first probabilities.
In one embodiment, the remaining post-replacement second probabilities can be determined as the first probabilities. For example, in the above example, the third probability 100% for cat, the second probability 55% for tiger and the second probability 88% for lion are taken as the first probabilities.
Alternatively, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, the remaining second probabilities and the third probability are determined as the first probabilities.
For example, the annotation label of the sample data is cat, the third probability that the sample data belongs to cat is 100%, the preset labels include cat, dog, tiger and lion, the second probabilities predicted by the weak multi-label classification model are 25% for cat, 23% for dog, 55% for tiger and 88% for lion, and the preset threshold is 50%.
The second probability 23% for dog and the second probability 25% for cat are both less than the preset threshold 50%, so both can be removed; the remaining second probabilities are 55% for tiger and 88% for lion. The third probability 100% that the sample belongs to cat, the second probability 55% for tiger and the second probability 88% for lion are then determined as the first probabilities.
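The two worked examples follow one small procedure: threshold the second probabilities, then ensure the annotation label carries the third probability. A minimal sketch (the function name and the dict representation of probabilities are illustrative, not from the application):

```python
def first_probabilities(second_probs, annotation_label, third_prob, threshold):
    """Merge the second probabilities with the third probability.

    second_probs: dict mapping each preset label to the second probability
                  predicted by the weak multi-label classification model.
    annotation_label: the (single) annotation label of the sample data.
    third_prob: probability that the sample belongs to its annotation label.
    threshold: preset threshold; labels scoring below it are removed.
    """
    # Remove second probabilities lower than the preset threshold.
    remaining = {l: p for l, p in second_probs.items() if p >= threshold}
    # The annotation label ends up carrying the third probability, whether
    # its second probability survived the threshold or was removed.
    remaining[annotation_label] = third_prob
    return remaining

# First worked example: cat's second probability (75%) survives and is replaced.
ex1 = first_probabilities(
    {"cat": 0.75, "dog": 0.23, "tiger": 0.55, "lion": 0.88},
    annotation_label="cat", third_prob=1.0, threshold=0.5)

# Second worked example: cat's second probability (25%) is removed, and the
# third probability is added alongside the remaining second probabilities.
ex2 = first_probabilities(
    {"cat": 0.25, "dog": 0.23, "tiger": 0.55, "lion": 0.88},
    annotation_label="cat", third_prob=1.0, threshold=0.5)
```

Both branches of the embodiment thus yield the same set of first probabilities for these inputs: 100% for cat, 55% for tiger and 88% for lion.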
In step S104, the weak multi-label classification model is trained using the sample data and the first probabilities to obtain the target multi-label classification model.
In the embodiments of the application, training the weak multi-label classification model generally requires multiple rounds, and after each round of training the loss function of the resulting latest classification model needs to be updated.
Wherein, the loss function may refer to the following formula:
In the above formula, p denotes the probability that the sample data belongs to the annotation label; since the annotation label of the sample data is annotated manually by a technician, this probability is often high, for example 100%.
p_n′ denotes the probability, predicted using the latest classification model obtained by each round of training, that the sample data belongs to each preset label; n is the number of the preset label.
A technician can observe in advance the preset labels that each sample data may belong to; η denotes the average of the numbers of preset labels that each sample data may belong to.
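The second-stage loss formula itself is not reproduced in the text above, only its variables. As an assumption, the sketch below writes the per-sample loss of this stage as binary cross-entropy against the merged first probabilities, treating labels removed by the threshold as targets of 0:

```python
import math

def soft_target_loss(first_probs, predictions, preset_labels):
    """One-sample loss for the second training stage, sketched as
    cross-entropy against the merged first probabilities (an assumed
    form; the text's exact second-stage formula is not given here).
    Labels absent from first_probs (removed by the threshold) are
    treated as negatives with target 0.
    """
    eps = 1e-12  # numerical guard against log(0)
    total = 0.0
    for label in preset_labels:
        target = first_probs.get(label, 0.0)  # soft target in [0, 1]
        q = predictions[label]                # current model output
        total -= target * math.log(q + eps) + (1 - target) * math.log(1 - q + eps)
    return total

labels = ["cat", "dog", "tiger", "lion"]
first = {"cat": 1.0, "tiger": 0.55, "lion": 0.88}   # from step S103
preds = {"cat": 0.8, "dog": 0.1, "tiger": 0.5, "lion": 0.9}
loss = soft_target_loss(first, preds, labels)
```

Unlike the first stage, the targets here are soft: the weak model's surviving second probabilities supervise labels that the single annotation never covered, which is what lets a single-label dataset train a multi-label model.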
In the embodiments of the application, sample data, the annotation label of the sample data, and a preset label classification model are obtained; the preset label classification model is trained using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model; first probabilities that the sample data belongs to each preset label in a preset label set are determined using the weak multi-label classification model and the annotation label of the sample data, the preset label set including the annotation label; and the weak multi-label classification model is trained using the sample data and the first probabilities to obtain a target multi-label classification model.
In the embodiments of the application, each sample data has exactly one annotation label; that is, the sample data form a single-label dataset. By the method of the embodiments of the application, a target multi-label classification model can be trained using a single-label dataset, so that when the labels of data that inherently carry multiple labels need to be predicted, the multiple labels of the data can be obtained from the target multi-label classification model. Compared with the related art, the application does not require staff to manually annotate, one by one, whether sample data belongs to each classification label in the label system, which reduces labor cost and improves efficiency.
Fig. 2 is a processing device of a neural network model according to an exemplary embodiment. Referring to Fig. 2, the device comprises:
an obtaining module 11, configured to obtain sample data, an annotation label of the sample data, and a preset label classification model;
a first training module 12, configured to train the preset label classification model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
a determining module 13, configured to determine, using the weak multi-label classification model and the annotation label of the sample data, first probabilities that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
a second training module 14, configured to train the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
In an optional implementation, the determining module 13 comprises:
a predicting unit, configured to predict, using the weak multi-label classification model, second probabilities that the sample data belongs to each preset label in the preset label set;
a setting unit, configured to set the probability that the sample data belongs to the annotation label as a third probability;
a merging unit, configured to obtain, from the second probabilities and the third probability, the first probabilities that the sample data belongs to each preset label.
In an optional implementation, the merging unit comprises:
a replacing subunit, configured to replace, in the preset label set, with the third probability, the second probability that the sample data belongs to the preset label identical to the annotation label;
a determining subunit, configured to determine the post-replacement probabilities that the sample data belongs to each preset label as the first probabilities.
In an optional implementation, the replacing subunit is specifically configured to: remove, from the multiple second probabilities, the second probabilities lower than a preset threshold; and if the preset labels corresponding to the remaining second probabilities include the annotation label, replace, with the third probability, the second probability that the sample data belongs to the annotation label.
In an optional implementation, the determining subunit is specifically configured to: determine the remaining post-replacement second probabilities as the first probabilities; or, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, determine the remaining second probabilities and the third probability as the first probabilities.
In the embodiments of the application, sample data, the annotation label of the sample data, and a preset label classification model are obtained; the preset label classification model is trained using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model; first probabilities that the sample data belongs to each preset label in a preset label set are determined using the weak multi-label classification model and the annotation label of the sample data, the preset label set including the annotation label; and the weak multi-label classification model is trained using the sample data and the first probabilities to obtain a target multi-label classification model.
In the embodiments of the application, each sample data has exactly one annotation label; that is, the sample data form a single-label dataset. By the method of the embodiments of the application, a target multi-label classification model can be trained using a single-label dataset, so that when the labels of data that inherently carry multiple labels need to be predicted, the multiple labels of the data can be obtained from the target multi-label classification model. Compared with the related art, the application does not require staff to manually annotate, one by one, whether sample data belongs to each classification label in the label system, which reduces labor cost and improves efficiency.
Regarding the device in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
The executing subject of the application can be electronic equipment. As shown in Fig. 3, the electronic equipment 300 can be a mobile phone, computer, digital broadcasting terminal, messaging device, game console, tablet device, medical device, fitness equipment, personal digital assistant, etc. As shown in Fig. 4, the electronic equipment 400 can be a server, etc.
Referring to Fig. 3, the electronic equipment 300 may include one or more of the following components: a processing component 302, a memory 304, a power component 306, a multimedia component 308, an audio component 310, an input/output (I/O) interface 312, a sensor component 314, and a communication component 316.
The processing component 302 generally controls the overall operations of the electronic equipment 300, such as operations associated with display, telephone calls, data communication, camera operations and recording operations. The processing component 302 may include one or more processors 320 to execute instructions to perform all or part of the steps of the methods described above. In addition, the processing component 302 may include one or more modules to facilitate interaction between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
The memory 304 is configured to store various types of data to support operation at the equipment 300. Examples of such data include instructions for any application or method operating on the electronic equipment 300, contact data, phonebook data, messages, pictures, videos, etc. The memory 304 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The power component 306 provides power to the various components of the electronic equipment 300. The power component 306 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the electronic equipment 300.
The multimedia component 308 includes a screen providing an output interface between the electronic equipment 300 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 308 includes a front camera and/or a rear camera. When the equipment 300 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a microphone (MIC). When the electronic equipment 300 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 304 or sent via the communication component 316. In some embodiments, the audio component 310 further includes a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 314 includes one or more sensors for providing status assessments of various aspects of the electronic equipment 300. For example, the sensor component 314 may detect the open/closed status of the equipment 300 and the relative positioning of components, such as the display and keypad of the electronic equipment 300. The sensor component 314 may also detect a change in position of the electronic equipment 300 or of a component of the electronic equipment 300, the presence or absence of user contact with the electronic equipment 300, the orientation or acceleration/deceleration of the electronic equipment 300, and a change in temperature of the electronic equipment 300. The sensor component 314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 314 may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the electronic equipment 300 and other devices. The electronic equipment 300 can access a wireless network based on a communication standard, such as WiFi, a carrier network (such as 2G, 3G, 4G, or 5G), or a combination thereof. In one exemplary embodiment, the communication component 316 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 316 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic equipment 300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above methods.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 304 including instructions, where the above instructions can be executed by the processor 320 of the electronic equipment 300 to complete the above methods. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Referring to Fig. 4, the electronic equipment 400 includes a processing component 422, which further includes one or more processors, and memory resources represented by a memory 432, for storing instructions executable by the processing component 422, such as an application program. The application program stored in the memory 432 may include one or more modules, each corresponding to a set of instructions. In addition, the processing component 422 is configured to execute the instructions, so as to perform the above methods.
The electronic equipment 400 may also include a power component 426 configured to perform power management of the electronic equipment 400, a wired or wireless network interface 450 configured to connect the electronic equipment 400 to a network, and an input/output (I/O) interface 458. The electronic equipment 400 can operate based on an operating system stored in the memory 432, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
An embodiment of the present application also provides a non-transitory computer-readable storage medium. When the instructions in the storage medium are executed by the processor of an electronic equipment, the electronic equipment is enabled to perform the processing method of a neural network model as described in Fig. 1.
An embodiment of the present application also provides a computer program product. When the instructions in the computer program product are executed by the processor of an electronic equipment, the electronic equipment is enabled to perform the processing method of a neural network model as described in Fig. 1.
Those skilled in the art, after considering the specification and practicing the invention disclosed here, will readily conceive of other embodiments of the present application. The present application is intended to cover any variations, uses, or adaptations of the present application that follow the general principles of the present application and include common knowledge or customary technical means in the technical field not disclosed in the present application. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the present application being indicated by the following claims.
It should be understood that the present application is not limited to the precise structures that have been described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present application is limited only by the appended claims.
A6. A processing apparatus of a neural network model, the apparatus including:
an obtaining module, for obtaining sample data, an annotation label of the sample data, and a preset labeling model;
a first training module, for training the preset labeling model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
a determining module, for determining, using the weak multi-label classification model and the annotation label of the sample data, a first probability that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
a second training module, for training the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
A7. The apparatus according to A6, the determining module including:
a prediction unit, for predicting, using the weak multi-label classification model, a second probability that the sample data belongs to each preset label in the preset label set;
a setting unit, for setting the probability that the sample data belongs to the annotation label as a third probability;
a merging unit, for merging the second probabilities and the third probability to obtain the first probability that the sample data belongs to each preset label.
A8. The apparatus according to A7, the merging unit including:
a replacement subunit, for replacing, in the preset label set, the second probability that the sample data belongs to the preset label identical to the annotation label with the third probability;
a determining subunit, for determining the probabilities after replacement, that the sample data belongs to each preset label, as the first probabilities.
A9. The apparatus according to A8, wherein the replacement subunit is specifically configured to:
remove, from the multiple second probabilities, the second probabilities lower than a preset threshold; and if the preset labels corresponding to the remaining second probabilities include the annotation label, replace the second probability that the sample data belongs to the annotation label with the third probability.
A10. The apparatus according to A9, wherein the determining subunit is specifically configured to:
determine the remaining second probabilities after replacement as the first probabilities; or, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, determine the remaining second probabilities and the third probability as the first probabilities.
Claims (10)
1. A processing method of a neural network model, characterized in that the method includes:
obtaining sample data, an annotation label of the sample data, and a preset labeling model;
training the preset labeling model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
determining, using the weak multi-label classification model and the annotation label of the sample data, a first probability that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
training the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
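Read together, the four steps of claim 1 can be sketched in code. This is a minimal illustration only: the example preset label set, the dummy `train`/`predict_proba` hooks, and the choice of 1.0 as the third probability are assumptions made for the sketch, not details recited in the claim.

```python
import numpy as np

# Hypothetical preset label set; per claim 1, the annotation label of
# each sample must be contained in it.
PRESET_LABELS = ["cat", "dog", "outdoor", "portrait"]

def train(model, inputs, targets):
    """Placeholder for one training pass of the (weak) labeling model.
    Any multi-label classifier with fit/predict_proba-style hooks fits
    here; the claim does not fix a concrete architecture."""
    model.fit(inputs, targets)
    return model

def one_hot(label):
    # Turn the single annotation label into a per-preset-label target vector.
    t = np.zeros(len(PRESET_LABELS))
    t[PRESET_LABELS.index(label)] = 1.0
    return t

def claim1_pipeline(preset_model, samples, annotation_labels):
    # Steps 1-2: train the preset labeling model on the single annotation
    # labels to obtain the weak multi-label classification model.
    targets = np.stack([one_hot(l) for l in annotation_labels])
    weak_model = train(preset_model, samples, targets)

    # Step 3: second probabilities from the weak model, then force each
    # sample's annotation-label entry to the third probability (1.0 here)
    # to obtain the first probabilities.
    second = weak_model.predict_proba(samples)   # shape (n, n_labels)
    first = np.array(second, dtype=float)
    for i, label in enumerate(annotation_labels):
        first[i, PRESET_LABELS.index(label)] = 1.0

    # Step 4: retrain on the first probabilities (soft multi-label
    # targets) to obtain the target multi-label classification model.
    return train(weak_model, samples, first)
```

The point of the second training pass is that the weak model's own confident predictions for the non-annotated labels survive into the targets, so a single-label dataset can still supervise a multi-label model.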
2. The method according to claim 1, characterized in that the determining, using the weak multi-label classification model and the annotation label of the sample data, a first probability that the sample data belongs to each preset label in a preset label set includes:
predicting, using the weak multi-label classification model, a second probability that the sample data belongs to each preset label in the preset label set;
setting the probability that the sample data belongs to the annotation label as a third probability;
merging the second probabilities and the third probability to obtain the first probability that the sample data belongs to each preset label.
3. The method according to claim 2, characterized in that the merging the second probabilities and the third probability to obtain the first probability that the sample data belongs to each preset label includes:
in the preset label set, replacing the second probability that the sample data belongs to the preset label identical to the annotation label with the third probability;
determining the probabilities after replacement, that the sample data belongs to each preset label, as the first probabilities.
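The merging of claims 2 and 3 amounts to overwriting one entry of the weak model's output vector. A small sketch, where the function name and the default third probability of 1.0 are illustrative assumptions rather than claim language:

```python
import numpy as np

def merge_probabilities(second_probs, preset_labels, annotation_label,
                        third_probability=1.0):
    """Merge per claims 2-3: keep the weak model's second probabilities,
    but replace the entry for the annotation label with the third
    probability, yielding the first probabilities."""
    first_probs = np.asarray(second_probs, dtype=float).copy()
    first_probs[preset_labels.index(annotation_label)] = third_probability
    return first_probs
```

For example, with second probabilities [0.4, 0.8, 0.1] over labels ["cat", "dog", "car"] and annotation label "cat", the first probabilities become [1.0, 0.8, 0.1]: the known label is pinned to certainty while the weak model's estimates for the other labels are retained.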
4. The method according to claim 3, characterized in that the replacing the second probability that the sample data belongs to the preset label identical to the annotation label with the third probability includes:
removing, from the multiple second probabilities, the second probabilities lower than a preset threshold;
if the preset labels corresponding to the remaining second probabilities include the annotation label, replacing the second probability that the sample data belongs to the annotation label with the third probability.
5. The method according to claim 4, characterized in that the determining the probabilities after replacement, that the sample data belongs to each preset label, as the first probabilities includes:
determining the remaining second probabilities after replacement as the first probabilities;
or, if the preset labels corresponding to the remaining second probabilities do not include the annotation label, determining the remaining second probabilities and the third probability as the first probabilities.
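Claims 4 and 5 refine the merge with a threshold step: low-confidence second probabilities are dropped first, and the annotation label is then either overwritten (if it survived) or added back alongside the survivors. A sketch under the assumptions of a dict-shaped result, an illustrative threshold of 0.5, and a third probability of 1.0 (none of which are fixed by the claims):

```python
def merge_with_threshold(second_probs, preset_labels, annotation_label,
                         threshold=0.5, third_probability=1.0):
    """Claims 4-5: remove second probabilities below the preset threshold,
    then replace the annotation label's surviving probability with the
    third probability; if the annotation label did not survive the
    removal, the third probability is included for it anyway (second
    branch of claim 5). Returns {preset label: first probability}."""
    # Removal step of claim 4: keep only confident second probabilities.
    first = {label: p for label, p in zip(preset_labels, second_probs)
             if p >= threshold}
    # Replacement (claim 4) or re-insertion (claim 5): either way the
    # annotation label ends up carrying the third probability.
    first[annotation_label] = third_probability
    return first
```

This guarantees the annotated label is never thresholded away, while labels the weak model is unsure about contribute nothing to the retraining targets.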
6. A processing apparatus of a neural network model, characterized in that the apparatus includes:
an obtaining module, for obtaining sample data, an annotation label of the sample data, and a preset labeling model;
a first training module, for training the preset labeling model using the sample data and the annotation label of the sample data to obtain a weak multi-label classification model;
a determining module, for determining, using the weak multi-label classification model and the annotation label of the sample data, a first probability that the sample data belongs to each preset label in a preset label set, the preset label set including the annotation label;
a second training module, for training the weak multi-label classification model using the sample data and the first probabilities to obtain a target multi-label classification model.
7. The apparatus according to claim 6, characterized in that the determining module includes:
a prediction unit, for predicting, using the weak multi-label classification model, a second probability that the sample data belongs to each preset label in the preset label set;
a setting unit, for setting the probability that the sample data belongs to the annotation label as a third probability;
a merging unit, for merging the second probabilities and the third probability to obtain the first probability that the sample data belongs to each preset label.
8. The apparatus according to claim 7, characterized in that the merging unit includes:
a replacement subunit, for replacing, in the preset label set, the second probability that the sample data belongs to the preset label identical to the annotation label with the third probability;
a determining subunit, for determining the probabilities after replacement, that the sample data belongs to each preset label, as the first probabilities.
9. An electronic equipment, characterized in that the electronic equipment includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the processing method of a neural network model according to any one of claims 1-5.
10. A non-transitory computer-readable storage medium, characterized in that, when the instructions in the storage medium are executed by the processor of a mobile terminal, the mobile terminal is enabled to perform the processing method of a neural network model according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811143319.4A CN109389220A (en) | 2018-09-28 | 2018-09-28 | Processing method, device, electronic equipment and the storage medium of neural network model |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109389220A true CN109389220A (en) | 2019-02-26 |
Family
ID=65418321
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811143319.4A Pending CN109389220A (en) | 2018-09-28 | 2018-09-28 | Processing method, device, electronic equipment and the storage medium of neural network model |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109389220A (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110147852A (en) * | 2019-05-29 | 2019-08-20 | 北京达佳互联信息技术有限公司 | Method, apparatus, equipment and the storage medium of image recognition |
US11263483B2 (en) | 2019-05-29 | 2022-03-01 | Beijing Dajia Internet Information Technology Co., Ltd. | Method and apparatus for recognizing image and storage medium |
CN110428052A (en) * | 2019-08-01 | 2019-11-08 | 江苏满运软件科技有限公司 | Construction method, device, medium and the electronic equipment of deep neural network model |
CN110458245A (en) * | 2019-08-20 | 2019-11-15 | 图谱未来(南京)人工智能研究院有限公司 | A kind of multi-tag disaggregated model training method, data processing method and device |
CN111143609A (en) * | 2019-12-20 | 2020-05-12 | 北京达佳互联信息技术有限公司 | Method and device for determining interest tag, electronic equipment and storage medium |
CN111143609B (en) * | 2019-12-20 | 2024-03-26 | 北京达佳互联信息技术有限公司 | Method and device for determining interest tag, electronic equipment and storage medium |
CN112036166A (en) * | 2020-07-22 | 2020-12-04 | 大箴(杭州)科技有限公司 | Data labeling method and device, storage medium and computer equipment |
CN112149733A (en) * | 2020-09-23 | 2020-12-29 | 北京金山云网络技术有限公司 | Model training method, model training device, quality determining method, quality determining device, electronic equipment and storage medium |
CN112149733B (en) * | 2020-09-23 | 2024-04-05 | 北京金山云网络技术有限公司 | Model training method, model quality determining method, model training device, model quality determining device, electronic equipment and storage medium |
CN112328823A (en) * | 2020-11-25 | 2021-02-05 | Oppo广东移动通信有限公司 | Training method and device for multi-label classification model, electronic equipment and storage medium |
CN113222050A (en) * | 2021-05-26 | 2021-08-06 | 北京有竹居网络技术有限公司 | Image classification method and device, readable medium and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109389220A (en) | Processing method, device, electronic equipment and the storage medium of neural network model | |
US10061762B2 (en) | Method and device for identifying information, and computer-readable storage medium | |
CN109389162B (en) | Sample image screening technique and device, electronic equipment and storage medium | |
CN104378441B (en) | schedule creation method and device | |
CN109447125B (en) | Processing method and device of classification model, electronic equipment and storage medium | |
CN110781957A (en) | Image processing method and device, electronic equipment and storage medium | |
CN109543066A (en) | Video recommendation method, device and computer readable storage medium | |
CN105528403B (en) | Target data identification method and device | |
CN109670077A (en) | Video recommendation method, device and computer readable storage medium | |
CN104077563A (en) | Human face recognition method and device | |
CN108563683A (en) | Label addition method, device and terminal | |
CN110069624A (en) | Text handling method and device | |
CN111046927B (en) | Method and device for processing annotation data, electronic equipment and storage medium | |
CN107181849A (en) | The way of recording and device | |
CN109409414B (en) | Sample image determines method and apparatus, electronic equipment and storage medium | |
CN110764627B (en) | Input method and device and electronic equipment | |
CN113936697B (en) | Voice processing method and device for voice processing | |
CN111275089B (en) | Classification model training method and device and storage medium | |
CN112784151A (en) | Method and related device for determining recommendation information | |
CN113609380B (en) | Label system updating method, searching device and electronic equipment | |
US11922725B2 (en) | Method and device for generating emoticon, and storage medium | |
CN110968246A (en) | Intelligent Chinese handwriting input recognition method and device | |
CN109145151B (en) | Video emotion classification acquisition method and device | |
CN105912398A (en) | Memory detection method and device | |
CN109447124A (en) | Image classification method, device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190226 |