CN109886415A - Data processing method, device, computer equipment and storage medium - Google Patents
Info
- Publication number
- CN109886415A CN109886415A CN201910012742.9A CN201910012742A CN109886415A CN 109886415 A CN109886415 A CN 109886415A CN 201910012742 A CN201910012742 A CN 201910012742A CN 109886415 A CN109886415 A CN 109886415A
- Authority
- CN
- China
- Prior art keywords
- data
- model
- processed results
- submodel
- pretreatment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Abstract
This application relates to a data processing method, device, computer equipment and storage medium. The method includes: obtaining the data to be processed; inputting the data to be processed into a data processing model; obtaining the preprocessing results respectively output by each preprocessing submodel in the data processing model; counting the prediction probability corresponding to each preprocessing result; and generating a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result. According to these prediction probabilities, the consistency of the multiple preprocessing results can be verified, and the processing result corresponding to the data to be processed is generated from them, which improves the accuracy of model data processing.
Description
Technical field
This application relates to the field of computer technology, and in particular to a data processing method, device, computer equipment and storage medium.
Background technique
With the development of computer technology, machine learning techniques have emerged. In machine learning, a model must first be established and trained by providing it with training data; the trained model is then used to make predictions on unknown data. Machine learning is the core of artificial intelligence and has been widely applied in fields such as recognition and classification.
However, in traditional machine learning techniques, in order to improve the accuracy with which a model handles input data, a large amount of training data is often fed to the model during training so that the training data can cover a variety of situations. Even so, a single model may still make errors when processing data, and its processing results can only be spot-checked manually, so the accuracy is low.
Summary of the invention
Based on this, in view of the above technical problems, it is necessary to provide a data processing method, device, computer equipment and storage medium that can improve the accuracy of model data processing.
A data processing method, the method comprising:
obtaining the data to be processed;
inputting the data to be processed into a data processing model;
obtaining the preprocessing results respectively output by each preprocessing submodel in the data processing model;
counting the prediction probability corresponding to each preprocessing result; and
generating a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result.
A data processing device, the device comprising:
a data acquisition module, configured to obtain the data to be processed;
a data input module, configured to input the data to be processed into a data processing model;
a result obtaining module, configured to obtain the preprocessing results respectively output by each preprocessing submodel in the data processing model;
a probability statistics module, configured to count the prediction probability corresponding to each preprocessing result; and
a result generation module, configured to generate a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result.
A computer equipment, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor performs the following steps when executing the computer program:
obtaining the data to be processed;
inputting the data to be processed into a data processing model;
obtaining the preprocessing results respectively output by each preprocessing submodel in the data processing model;
counting the prediction probability corresponding to each preprocessing result; and
generating a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result.
A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, performs the following steps:
obtaining the data to be processed;
inputting the data to be processed into a data processing model;
obtaining the preprocessing results respectively output by each preprocessing submodel in the data processing model;
counting the prediction probability corresponding to each preprocessing result; and
generating a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result.
According to the above data processing method, device, computer equipment and storage medium, the data to be processed is obtained and input into the multiple preprocessing submodels in the data processing model, so that the data is processed by the multiple preprocessing submodels simultaneously; the preprocessing results respectively output by the preprocessing submodels are obtained, and the prediction probability corresponding to each preprocessing result is counted. According to these prediction probabilities, the consistency of the multiple preprocessing results can be verified, and a processing result corresponding to the data to be processed is generated from them, which improves the accuracy of model data processing.
Detailed description of the invention
Fig. 1 is a diagram of the application environment of a data processing method in one embodiment;
Fig. 2 is a schematic flowchart of a data processing method in one embodiment;
Fig. 3 is a schematic flowchart of the step of training initial submodels in one embodiment;
Fig. 4 is a schematic flowchart of the step of training initial submodels in another embodiment;
Fig. 5 is a schematic flowchart of the step of constructing a data processing model in one embodiment;
Fig. 6 is a schematic structural diagram of an architectural source model in one embodiment;
Fig. 7 is a schematic flowchart of the step of generating a processing result in one embodiment;
Fig. 8 is a schematic flowchart of the step of generating an uncertainty exception notification in one embodiment;
Fig. 9 is a schematic diagram of training data in one embodiment;
Fig. 10 is a schematic diagram of data processing in one embodiment;
Fig. 11 is a structural block diagram of a data processing device in one embodiment;
Fig. 12 is an internal structure diagram of computer equipment in one embodiment.
Specific embodiment
In order to make the objects, technical solutions and advantages of this application clearer, this application is further elaborated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain this application and are not intended to limit it.
The data processing method provided by this application can be applied in the application environment shown in Fig. 1. The application environment may include a terminal 102 and a server 104, where the terminal 102 communicates with the server 104 through a network. The method can be applied either to the terminal 102 or to the server 104. The terminal 102 can be, but is not limited to, various industrial computers, personal computers, laptops, smartphones and tablet computers. The server 104 can be implemented as an independent server or as a server cluster composed of multiple servers.
In one embodiment, as shown in Fig. 2, a data processing method is provided. Taking its application to the terminal in Fig. 1 as an example, the method comprises the following steps:
Step 202: obtain the data to be processed.
The data to be processed is the data that is input into the model for processing when the model is used.
Specifically, the terminal obtains a data processing instruction triggered by a user and parses it to obtain the storage address of the data to be processed. The terminal accesses the memory space corresponding to the storage address and extracts the stored data to be processed from it.
In one embodiment, the terminal obtains an entered data identifier, generates a data acquisition request according to the data identifier, and sends the request to the server through the network. The server receives the data acquisition request, extracts the data to be processed from a database according to the data identifier in the request, and sends the data to the terminal through the network.
In one embodiment, the terminal is equipped with an image acquisition device. After the terminal obtains a data processing instruction, it starts the image acquisition device and uses the image data collected by the device as the data to be processed.
Step 204: input the data to be processed into the data processing model.
The data processing model is a model composed of multiple preprocessing submodels and is used to process the input data to be processed.
Specifically, after the terminal gets the data to be processed, it triggers a data input instruction and, according to that instruction, inputs the acquired data into the data processing model.
Step 206: obtain the preprocessing results respectively output by each preprocessing submodel in the data processing model.
A preprocessing result is the result produced by a preprocessing submodel in the data processing model when it processes the data to be processed.
Specifically, the data processing model is composed of multiple preprocessing submodels. After the terminal inputs the data to be processed into the data processing model, each preprocessing submodel processes the data and outputs its own preprocessing result. The terminal obtains, from the data processing model, each preprocessing submodel's result for the data to be processed.
Step 208: count the prediction probability corresponding to each preprocessing result.
The prediction probability is the probability with which a given preprocessing result occurs among all the preprocessing results output by the preprocessing submodels; it can be the ratio of the number of times that result occurs to the total number of preprocessing results.
Specifically, the terminal reads the preprocessing results output by each preprocessing submodel and counts how many times each distinct result appears. The terminal then divides each count by the total number of preprocessing results and uses the resulting ratio as the prediction probability of that result.
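As a concrete illustration (not part of the patent itself), the ratio-based counting of step 208 can be sketched in Python; the function name is a hypothetical stand-in:

```python
from collections import Counter

def prediction_probabilities(preprocessing_results):
    """Map each distinct preprocessing result to the fraction of
    submodels that produced it (its prediction probability)."""
    counts = Counter(preprocessing_results)
    total = len(preprocessing_results)
    return {result: n / total for result, n in counts.items()}

# Four submodels voted: three said "cat", one said "dog".
probs = prediction_probabilities(["cat", "cat", "cat", "dog"])
print(probs)  # {'cat': 0.75, 'dog': 0.25}
```

The ratio directly measures how consistent the submodels are with one another on this input.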
Step 210: generate a processing result corresponding to the data to be processed according to the prediction probability corresponding to each preprocessing result.
The processing result is the output of the data processing model for the input data to be processed.
Specifically, after counting the prediction probability corresponding to each preprocessing result, the terminal obtains a preset probability condition, compares each prediction probability with the condition one by one, and screens out the preprocessing results whose prediction probabilities satisfy the condition.
When the terminal screens out a preprocessing result whose prediction probability satisfies the preset probability condition, the terminal completes the processing of the data to be processed and takes the screened-out preprocessing result as the processing result for the input data.
The preset probability condition is a preconfigured condition for screening specific preprocessing results from all the preprocessing results. For example, it can require the prediction probability to be greater than or equal to a preset probability threshold.
In one embodiment, terminal is ranked up by anticipation probability of the sort algorithm to each pre-processed results, sequence
After the completion, pre-processed results corresponding to highest anticipation probability are chosen.Sort algorithm can be bubble sort, selected and sorted,
At least one of merger sequence.
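Continuing the hypothetical sketch, the screening in step 210 can be either a threshold comparison against the preset probability condition or a pick-the-maximum after sorting; both variants are shown below, assuming the prediction probabilities are already available as a dict:

```python
def screen_by_threshold(probs, threshold):
    """Keep only the results whose prediction probability satisfies
    the preset probability condition (>= threshold)."""
    return {r: p for r, p in probs.items() if p >= threshold}

def best_result(probs):
    """Pick the result with the highest prediction probability
    (equivalent to sorting and taking the top entry)."""
    return max(probs, key=probs.get)

probs = {"cat": 0.75, "dog": 0.25}
print(screen_by_threshold(probs, 0.5))  # {'cat': 0.75}
print(best_result(probs))               # cat
```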
In one embodiment, the terminal generates a processing result notification according to each preprocessing result and the prediction probability corresponding to it, and displays the processing result notification on a display screen.
In one embodiment, the data processing model can be any type of model, and the data to be processed can be any type of data corresponding to that model. For example, when the data processing model is a recognition model or a classification model, the data to be processed can be image data, and the processing result of the model is the recognized product defect, product type, quantity and the like. When the data processing model is a trajectory planning model, the data to be processed can be image data or pose data of an object, and the processing result is a motion path trajectory of the object.
In this embodiment, the data to be processed is obtained and input into the multiple preprocessing submodels in the data processing model, so that it is processed by the multiple preprocessing submodels simultaneously; the preprocessing results respectively output by each submodel are obtained, and the prediction probability corresponding to each preprocessing result is counted. According to these prediction probabilities, the consistency of the multiple preprocessing results can be verified, and a processing result corresponding to the data to be processed is generated from them, improving the accuracy of model data processing.
As shown in Fig. 3, in one embodiment, the method further includes a step of training initial submodels before step 202. The step specifically comprises the following steps:
Step 302: obtain multiple different initial submodels and training data.
An initial submodel is an initial model whose parameters have not yet been adjusted. The training data is the set of data samples used to train the initial submodels.
Specifically, before the model is used, a machine learning model for data processing must first be trained. The terminal obtains a model training instruction triggered by a user, parses it, and obtains the storage address of the initial submodels and the storage address of the training data. According to these addresses, the terminal reads the multiple initial submodels and the training data from the memory space and loads them into memory. The multiple initial submodels can be models of different types, models of the same type with different initial model parameters, or a mixture of both.
Step 304: train each initial submodel with the training data to obtain multiple preprocessing submodels.
A preprocessing submodel is the model obtained when the training of an initial submodel is completed.
Specifically, the terminal inputs the training data into each initial submodel. Each initial submodel is trained on the input training data and adjusts its model parameters until a training stop condition is met, yielding multiple preprocessing submodels. The training data may include label data: the initial submodel outputs an initial result from the training data, a prediction error is determined from the initial result and the label data, and if the prediction error is greater than or equal to a preset error threshold, the model parameters are adjusted in the direction that minimizes the prediction error. This training process is iterated until the prediction error falls below the preset error threshold, at which point training stops.
In one embodiment, the training stop condition can be the number of iterations of the initial submodel during training. When the number of iterations is greater than or equal to a preset iteration threshold, training stops and the preprocessing submodel is obtained.
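As an illustrative sketch (not from the patent), the two stop conditions above, an error threshold and an iteration cap, can be combined in a minimal training loop; the one-parameter linear model and learning rate here are hypothetical stand-ins for an arbitrary submodel:

```python
def train_submodel(samples, lr=0.01, error_threshold=1e-4, max_iters=10000):
    """Fit y = w * x by gradient steps, stopping when the prediction
    error drops below the preset error threshold or the iteration
    count reaches the preset iteration threshold."""
    w = 0.0
    for _ in range(max_iters):
        # Prediction error against the label data (mean squared error).
        error = sum((w * x - y) ** 2 for x, y in samples) / len(samples)
        if error < error_threshold:
            break
        # Adjust the parameter in the direction that minimizes the error.
        grad = sum(2 * (w * x - y) * x for x, y in samples) / len(samples)
        w -= lr * grad
    return w

w = train_submodel([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(round(w, 2))  # 2.0
```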
Step 306: construct the data processing model from the multiple preprocessing submodels.
Specifically, after training each initial submodel, the terminal obtains multiple preprocessing submodels. Based on these preprocessing submodels, the terminal sets up a preprocessing submodel cluster and uses the resulting cluster as the data processing model.
In this embodiment, multiple different initial submodels and training data are obtained, and the same training data is input into the multiple different initial submodels for training to obtain multiple preprocessing submodels, which improves the reliability of model training; the data processing model is constructed from the multiple preprocessing submodels, which improves the efficiency of obtaining the data processing model.
As shown in Fig. 4, in another embodiment, the method further includes a step of training initial submodels before step 202. The step specifically comprises the following steps:
Step 402: obtain multiple identical initial submodels and training data.
Specifically, the terminal obtains a model training instruction and extracts from it the storage address of the initial submodel, the storage address of the training data, and training parameters. According to these addresses, the terminal reads the initial submodel and the training data from the memory space. The terminal then duplicates the initial submodel according to a preset submodel quantity in the training parameters, obtaining multiple identical initial submodels matching that quantity.
Step 404: extract from the training data multiple training sample sets in one-to-one correspondence with the multiple initial submodels.
Specifically, for each initial submodel, the terminal extracts part of the data from the training data in a predetermined manner and builds a training sample set from the extracted data. The terminal can randomly select part of the data from the training data and construct the training sample set from the randomly selected data.
In one embodiment, the terminal divides the training data into multiple training data subsets matching the number of initial submodels and uses the subsets as the training sample sets in one-to-one correspondence with the initial submodels. For example, if there are 5 initial submodels and the training data contains 10000 pictures, the terminal divides them into 5 training data subsets according to the partition 1st-2000th, 2001st-4000th, 4001st-6000th, 6001st-8000th and 8001st-10000th, and uses the 5 subsets as the training sample sets corresponding one-to-one to the 5 initial submodels.
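The contiguous partition in the example above can be sketched as follows (an illustrative sketch, with a hypothetical function name; integers stand in for the pictures):

```python
def partition_training_data(data, num_submodels):
    """Split the training data into contiguous, equal-sized subsets,
    one per initial submodel (e.g. 10000 pictures over 5 submodels
    gives ranges 1-2000, 2001-4000, ..., 8001-10000)."""
    size = len(data) // num_submodels
    return [data[i * size:(i + 1) * size] for i in range(num_submodels)]

pictures = list(range(1, 10001))  # stand-ins for 10000 pictures
subsets = partition_training_data(pictures, 5)
print([(s[0], s[-1]) for s in subsets])
# [(1, 2000), (2001, 4000), (4001, 6000), (6001, 8000), (8001, 10000)]
```

A random shuffle before slicing would give the randomly selected variant mentioned earlier instead of the contiguous one.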
Step 406: train the corresponding initial submodel with each training sample set to obtain multiple preprocessing submodels.
Specifically, for each initial submodel, the terminal obtains the training sample set corresponding to it and trains the initial submodel on that set. After training the multiple initial submodels, the terminal obtains multiple preprocessing submodels.
In one embodiment, when training an initial submodel, the terminal divides the training data to be input into multiple groups. The terminal first inputs one group of training data into the initial submodel, obtains the initial result it outputs, compares the initial result with the label data to calculate a prediction error, and adjusts the model parameters according to the prediction error; it then inputs another group of training data into the adjusted initial submodel and repeats the process until the prediction error converges, obtaining the preprocessing submodel.
Step 408: construct the data processing model from the multiple preprocessing submodels.
Specifically, after training the multiple identical initial submodels, the terminal obtains multiple preprocessing submodels, constructs a submodel cluster from them, and uses the resulting cluster as the data processing model.
In this embodiment, multiple identical initial submodels and training data are obtained, and multiple training sample sets in one-to-one correspondence with the initial submodels are extracted from the training data. For the multiple identical initial submodels, different training sample sets are input for training in the manner of a controlled variable, yielding multiple preprocessing submodels and improving the reliability of model training; the data processing model is constructed from the multiple preprocessing submodels, improving the efficiency of obtaining the data processing model.
As shown in Fig. 5, in one embodiment, step 204 further includes a step of constructing the data processing model. The step specifically comprises the following steps:
Step 502: extract multiple different model substructures from a trained architectural source model.
A model substructure is a substructure randomly extracted from the architectural source model that can still perform the function of the model normally. The structure of the architectural source model contains multiple layers of links and can process the data to be processed.
Specifically, the terminal stores a trained architectural source model in advance. The terminal parses the data processing instruction it received, obtains the storage address of the architectural source model, and reads the model from the memory space according to that address. After reading the architectural source model, the terminal performs multiple extractions from it, each extraction yielding one model substructure. In each extraction, the terminal randomly removes some links from the architectural source model, thereby obtaining multiple different model substructures.
In one embodiment, the structure of the architectural source model is shown in Fig. 6. The architectural source model can be a neural network model including an input layer, a hidden layer and an output layer. The terminal can remove certain links between any two layers to obtain multiple different model substructures. For example, in Fig. 6 the terminal can remove the link from x1 to h1, or the link from h2 to o2.
In one embodiment, before obtaining the data to be processed, the terminal first obtains the architectural source model through training. The terminal obtains an initial submodel and training data according to a model training instruction, trains the initial submodel on the training data, and obtains the architectural source model after training.
Step 504: construct the data processing model using each model substructure as a preprocessing submodel in the data processing model.
Specifically, after the terminal obtains multiple different model substructures by extraction, it uses each model substructure as a preprocessing submodel, constructs a submodel cluster from the multiple preprocessing submodels, and obtains the data processing model.
Step 506: input the data to be processed into each preprocessing submodel in the data processing model respectively.
Specifically, after constructing the data processing model, the terminal replicates the data to be processed according to the number of preprocessing submodels, obtaining identical copies matching that number. The terminal inputs each copy into one preprocessing submodel of the data processing model.
In one embodiment, the models stored in advance in the terminal are respectively identified as a first model, a second model and/or a third model. The first model and the second model are data processing models: each preprocessing submodel of the first model is obtained by training multiple different initial submodels, while each preprocessing submodel of the second model is obtained by training multiple identical initial submodels. The third model is the architectural source model. After the terminal gets the data to be processed, it obtains the model identifier of the stored model. When the identifier is that of the first model or the second model, the terminal inputs the data to be processed into each preprocessing submodel of the first or second model; when the identifier is that of the third model, the terminal extracts multiple different model substructures from the third model, uses each substructure as a preprocessing submodel to construct a data processing model, and inputs the data to be processed into each preprocessing submodel of that model. Each preprocessing submodel of the data processing model can thus come from one of the first model, the second model and/or the third model.
In this embodiment, multiple different model substructures are obtained by extraction from a trained architectural source model, each model substructure is used as a preprocessing submodel to construct the data processing model, and the data to be processed is input into each preprocessing submodel of the data processing model. By randomly extracting model substructures from the architectural source model and using them as preprocessing submodels, the reliability of the selected preprocessing submodels is ensured.
As shown in Fig. 7, in one embodiment, step 210 further includes a step of generating the processing result. The step specifically comprises the following steps:
Step 702: from the preprocessing results, screen out the preprocessing results whose prediction probabilities satisfy the preset probability condition.
Specifically, each preprocessing submodel can obtain multiple candidate preprocessing results and a candidate probability corresponding to each candidate preprocessing result. The terminal reads a stored default transition probability and, for each preprocessing submodel, can take the candidate preprocessing result whose candidate probability is greater than or equal to the default transition probability as that submodel's preprocessing result. The terminal counts the prediction probability corresponding to each preprocessing result, obtains the preset probability condition, compares each prediction probability with the condition one by one, and screens out the preprocessing results whose prediction probabilities satisfy the condition.
Step 704: calculate the uncertainty of the screened-out preprocessing result.
The uncertainty is a quantitative evaluation of how uncertain the screened-out preprocessing result is; the lower the uncertainty, the higher the confidence in the preprocessing result.
Specifically, based on Bayesian theory (Bayesians), the terminal can attach an uncertainty to the screened-out preprocessing result. The terminal obtains a preset uncertainty calculation method and, according to that method and the obtained preprocessing results or candidate preprocessing results, calculates the uncertainty of the screened-out preprocessing result.
Step 706: generate the processing result corresponding to the data to be processed according to the uncertainty and the screened-out preprocessing result.
Specifically, the terminal takes the screened-out preprocessing result as the processing result for the input data and, when displaying the processing result, displays the uncertainty corresponding to it. The terminal can also obtain the prediction probability of the screened-out preprocessing result and display the processing result together with its prediction probability and uncertainty.
In one embodiment, based on frequentist theory (Frequentists), the terminal does not calculate the uncertainty but outputs a definite processing result, that is, the screened-out preprocessing result and its prediction probability.
In one embodiment, when the calculated uncertainty is less than a preset uncertainty threshold, the terminal displays the screened-out preprocessing result and the calculated uncertainty.
The terminal calculates the uncertainty based on Bayesian theory to provide a quantitative confidence evaluation for the screened-out preprocessing result. When the uncertainty is greater than or equal to the preset uncertainty threshold, the terminal displays an uncertainty exception notification. The uncertainty exception notification may remind the user to check whether the input data to be processed was ever used to train the initial submodels or the architectural source model, and to supplement the training data; it may also remind the user to check whether the preset uncertainty threshold is reasonable and to reset the uncertainty threshold if necessary.
Table 1:
For example, suppose the data processing model includes 4 preprocessing submodels, the data to be processed is image data, and each preprocessing submodel obtains two candidate preprocessing results, where a candidate preprocessing result is the recognized animal species. The candidate preprocessing results of each submodel can be as shown in Table 1, where A, B and C denote three classes of candidate preprocessing results, the numbers in the table are the candidate probabilities, and the class-C preprocessing results are obtained by converting the class-A or class-B candidate preprocessing results according to the default transition probability. The terminal takes the class-C candidate preprocessing result with candidate probability 1 as the submodel's preprocessing result. For example, in the class-A candidate preprocessing results of preprocessing submodel one, the candidate probability that the result is cat is 0.9 and the candidate probability that it is dog is 0.1, and the default transition probability can be 0.5. Since 0.9 is greater than 0.5, after the class-A candidates are converted to class-C candidates, the candidate probability of cat is 1 and that of dog is 0. The terminal therefore takes cat as the preprocessing result of preprocessing submodel one.
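The conversion of a submodel's soft candidate probabilities into a hard class-C result, as in the cat/dog example above, can be sketched like this (an illustrative sketch with a hypothetical function name):

```python
def to_class_c(candidates, transition_probability=0.5):
    """Convert candidate probabilities to a hard (class-C) result:
    the candidate at or above the default transition probability gets
    probability 1 and the rest get 0; if no candidate clears the
    threshold, the soft probabilities are kept as-is."""
    winner = max(candidates, key=candidates.get)
    if candidates[winner] >= transition_probability:
        return {label: (1 if label == winner else 0) for label in candidates}
    return candidates

# Preprocessing submodel one from the Table 1 example.
print(to_class_c({"cat": 0.9, "dog": 0.1}))  # {'cat': 1, 'dog': 0}
```

Applying this per submodel and then counting the hard results gives the prediction probabilities used for screening.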
Suppose that in two uses of the preprocessing submodels, the candidate preprocessing results and candidate probabilities obtained are, respectively, the class-A candidates and the class-B candidates in Table 1; then the screened-out preprocessing result is cat and the prediction probability is 75%. Suppose the terminal calculates an uncertainty of 0.2 for the class-A candidates and 0.1 for the class-B candidates. For either the class-A or the class-B candidates, based on frequentist theory the terminal outputs a definite processing result: "the screened-out preprocessing result is cat and the prediction probability is 75%". Based on Bayesian theory, for the class-A candidates the terminal outputs "the screened-out preprocessing result is cat, the prediction probability is 75% and the uncertainty is 0.2"; for the class-B candidates it outputs "the screened-out preprocessing result is cat, the prediction probability is 75% and the uncertainty is 0.1". Based on frequentist theory, the class-A and class-B candidates are indistinguishable; based on Bayesian theory, the certainty of the class-B candidates is greater than that of the class-A candidates.
In one embodiment, when the terminal calculates the uncertainty based on Bayesian theory, it may first convert the class-A candidate preprocessed results or the class-B candidate preprocessed results into class-C candidate preprocessed results, and then calculate the uncertainty according to formula 1:

Uncertainty = 1 − (number of occurrences of the most frequent preprocessed result) / (total number of preprocessed results)   (1)

For example, after the terminal converts the class-A candidate preprocessed results into class-C candidate preprocessed results, the preprocessed results of the preprocessing submodels are "cat, cat, cat, dog"; the most frequent preprocessed result is cat, its number of occurrences is 3, and the total number of preprocessed results is 4. According to formula 1, the uncertainty is 1 − 3/4 = 0.25. When the uncertainty is calculated by formula 1 with two candidates, the minimum uncertainty is 0 and the maximum uncertainty is 0.5.
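Formula 1 amounts to one minus the vote share of the majority result. A minimal sketch, reproducing the "cat, cat, cat, dog" example:

```python
from collections import Counter

def frequency_uncertainty(results):
    """Formula 1: 1 - (count of the most frequent result) / (total results)."""
    counts = Counter(results)
    most_common_count = counts.most_common(1)[0][1]
    return 1 - most_common_count / len(results)

# Example from the text: the submodels output "cat, cat, cat, dog"
print(frequency_uncertainty(["cat", "cat", "cat", "dog"]))  # 0.25
```

With four submodels and two candidates, the value ranges from 0 (all agree) to 0.5 (a 2/2 split), matching the stated bounds.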
In one embodiment, the terminal calculates the uncertainty according to formula 2:

Uncertainty = final uncertainty − average uncertainty of the preprocessing submodels   (2)

where the terminal calculates the average uncertainty of the preprocessing submodels by formula 3:

Average uncertainty of the preprocessing submodels = (sum of the uncertainties of the preprocessing submodels) / (number of preprocessing submodels)   (3)
The terminal calculates the uncertainty H of each preprocessing submodel by formula 4:

H = −Σᵢ pᵢ · log₂ pᵢ   (4)

where pᵢ is the candidate probability of the i-th candidate preprocessed result in the preprocessing submodel, and i is a positive integer ranging up to the number of candidate preprocessed results of the preprocessing submodel. For example, if the candidate preprocessed results of one preprocessing submodel include cat and dog, where the candidate probability of cat is 0.5 and the candidate probability of dog is 0.5, then the uncertainty of that preprocessing submodel is H = −[0.5·log₂(0.5) + 0.5·log₂(0.5)] = 1. When the candidate probability of cat is 0 and the candidate probability of dog is 1, H = −[0·log₂(0) + 1·log₂(1)] = 0, taking 0·log(0) as 0 by convention. The value range of the uncertainty is [0, 1]. After the terminal obtains the uncertainty of each preprocessing submodel, it adds the uncertainties of the preprocessing submodels to obtain their sum, calculates the ratio of the sum to the number of preprocessing submodels, and takes the resulting ratio as the average uncertainty of the preprocessing submodels.
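Formulas 3 and 4 can be sketched directly; this uses the base-2 logarithm implied by the worked example, in which H([0.5, 0.5]) = 1:

```python
import math

def submodel_entropy(probs):
    """Formula 4: H = -sum_i p_i * log2(p_i); 0*log(0) is taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

def average_entropy(all_submodel_probs):
    """Formula 3: the mean of the per-submodel uncertainties."""
    return sum(submodel_entropy(p) for p in all_submodel_probs) / len(all_submodel_probs)

print(submodel_entropy([0.5, 0.5]))  # 1.0
print(submodel_entropy([0.0, 1.0]))  # 0.0
# Class-A (cat, dog) probabilities of the four submodels in Table 1
print(round(average_entropy([(0.9, 0.1), (0.8, 0.2), (0.7, 0.3), (0.1, 0.9)]), 2))  # 0.64
```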
When the terminal calculates the final uncertainty, it first computes, according to the preprocessing submodels, the simple arithmetic mean p̄ᵢ of the candidate probabilities of each class of candidate preprocessed result, where i is a positive integer ranging up to the number of candidate preprocessed results of the preprocessing submodels, and then calculates the final uncertainty according to formula 5:

Final uncertainty H = −Σᵢ p̄ᵢ · log₂ p̄ᵢ   (5)

For example, in the class-A candidate preprocessed results of Table 1, across the preprocessing submodels the candidate probabilities that the candidate preprocessed result is cat are 0.9, 0.8, 0.7 and 0.1, with simple arithmetic mean 0.625; the candidate probabilities that the candidate preprocessed result is dog are 0.1, 0.2, 0.3 and 0.9, with simple arithmetic mean 0.375. The final uncertainty is then H = −(0.625 × log₂ 0.625 + 0.375 × log₂ 0.375) ≈ 0.95.
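Formulas 2, 3 and 5 combine into a single quantity: the entropy of the averaged probabilities minus the average per-submodel entropy, a disagreement measure used in ensemble-based Bayesian uncertainty estimation. A minimal sketch, using the base-2 logarithm implied by the formula-4 example (with that base, the class-A final uncertainty evaluates to roughly 0.95):

```python
import math

def entropy(probs):
    """H = -sum_i p_i * log2(p_i), with 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0) + 0.0

def final_uncertainty(per_submodel_probs):
    """Formula 5: entropy of the arithmetic-mean candidate probabilities."""
    n = len(per_submodel_probs)
    mean_probs = [sum(col) / n for col in zip(*per_submodel_probs)]
    return entropy(mean_probs)

def formula2_uncertainty(per_submodel_probs):
    """Formula 2: final uncertainty minus the average per-submodel uncertainty."""
    avg = sum(entropy(p) for p in per_submodel_probs) / len(per_submodel_probs)
    return final_uncertainty(per_submodel_probs) - avg

# Class-A example: (cat, dog) candidate probabilities of the four submodels
class_a = [(0.9, 0.1), (0.8, 0.2), (0.7, 0.3), (0.1, 0.9)]
print(round(final_uncertainty(class_a), 2))    # 0.95
print(round(formula2_uncertainty(class_a), 2)) # 0.32
```

The formula-2 value is high when the submodels each predict confidently but disagree with one another, which is exactly the situation the text flags as unreliable.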
Of the data processing models that the terminal constructs according to the first model, the second model and the third model, the data processing model obtained by the third model, which extracts multiple different model substructures from a trained architectural source model, is particularly well suited to obtaining the uncertainty based on Bayesian theory.

In this embodiment, the preprocessed results whose prejudged probabilities meet the preset probability condition are screened out from the preprocessed results, and the uncertainty of the screened preprocessed results is then calculated; the uncertainty reflects the credibility of the screened preprocessed results. When the processing result corresponding to the pending data is generated according to the preprocessed results, the uncertainty is included, which improves the accuracy of the processing result output by the model.
As shown in Fig. 8, in one embodiment, the method further includes, after step 208, the step of generating a processing-abnormal notice. The step specifically includes the following steps:

Step 802, obtain the pending data.

Step 804, input the pending data into the data processing model.

Step 806, obtain the preprocessed results respectively output by the preprocessing submodels in the data processing model.

Step 808, count the prejudged probability corresponding to each preprocessed result.

Step 810, when no processing result corresponding to the pending data is generated according to the prejudged probabilities, generate a processing-abnormal notice according to the preprocessed results.

The processing-abnormal notice is the notification information generated when the terminal fails to screen out a preprocessed result.

Specifically, when the terminal fails to screen out a preprocessed result whose prejudged probability meets the preset probability condition, it cannot generate a processing result corresponding to the pending data; the terminal then generates a processing-abnormal notice according to the preprocessed results and their corresponding prejudged probabilities.

Step 812, display the processing-abnormal notice.

Specifically, after the terminal generates the processing-abnormal notice, it sends the notice to the display screen of the terminal, and the display screen displays the processing-abnormal notice.
In one embodiment, the processing-abnormal notice can prompt the user to re-enter pending data, so that after the user inputs new pending data, the new pending data is processed to obtain a processing result. The processing-abnormal notice can also prompt the user to train the model again.

In this embodiment, when no processing result corresponding to the pending data is generated according to the prejudged probabilities, that is, when no preprocessed result is screened out, a processing-abnormal notice is generated according to the preprocessed results and displayed, so that newly input pending data can be received again, which improves the reliability of the data processing.
The method provided by the present application, which is based on Bayesian theory and calculates the uncertainty of the preprocessed result screened out from the preprocessed results of multiple preprocessing submodels, can be applied in various machine learning techniques, such as supervised learning, semi-supervised learning, reinforcement learning and imitation learning. Based on the present application, these machine learning techniques can solve various classification and regression problems in many fields, as described below:

1. Supervised learning

Take defect detection based on supervised learning as an example. Defect detection can be applied in various fields, such as defect detection of manufactured products (scratches, bubbles, completeness and the like) and AOI (Automated Optical Inspection).
When performing defect detection, the preprocessed results are the defects in the product. When the uncertainty value of the screened preprocessed result is low, the accuracy of the screened preprocessed result is high. When the screened preprocessed result judges the product to be defect-free but the uncertainty is high, the product may in fact have a defect. The reason may be that the defect was not labeled in the supervised learning, or rarely appears in the training data; this type of defect can be added to the training data of the supervised learning so that the model learns again. The reason may also be that the preset uncertainty threshold is set unreasonably, in which case the uncertainty threshold needs to be reset.
2. Semi-supervised learning

Semi-supervised learning (SSL) refers to performing model training using both unlabeled training data and labeled training data.

During training, some labeled training data is input to train the initial submodels or the initial architectural source model. After training, the pending data is input into each preprocessing submodel and the preprocessed results are screened. When the uncertainty of the screened preprocessed result is high, the unlabeled training data needs to be labeled manually and the model retrained.
For handling regression problems, the data processing model is applied to target-object localization or pose recognition, that is, to recognizing the pose of a target object. Fig. 9 is a schematic diagram of training data in one embodiment. Specifically, referring to Fig. 9, the uncertainty of the recognized position or pose of the target object in Fig. 9(a) is low; in Fig. 9(b), because there are occlusion relations between objects, the uncertainty of the recognized position or pose of the target object is high, and the training data may need to be labeled again.
3. Reinforcement learning and imitation learning

In reinforcement learning and imitation learning, an action (behavior) needs to be executed for the current state. The current policy provides an action option for the current state (reinforcement learning and imitation learning require the agent, for example a robot, to have exploration ability, though when executing a rollout the agent does not necessarily execute only that action option), and the behavior value function provides an expected return for the current state and the action option. When the behavior value function uses Bayesian inference over multiple models, it provides not only an expected return for the current state and the current action option but also the uncertainty of the expected return. If the uncertainty of the expected return is low, that is, the expected return of the trajectory after the current action option is executed in the current state is relatively certain, then the combination of the current state and the current action option has already been explored effectively in the agent's past learning process; in the stage where reinforcement learning or imitation learning requires the agent to explore, it is not recommended to select the current action option. If the uncertainty of the expected return is high, that is, the expected return of the trajectory after the current action option is executed in the current state is relatively unknown, then the combination of the current state and the current action option has not been fully explored in the agent's past learning process; in the stage where reinforcement learning or imitation learning requires the agent to explore, it is recommended to select the current action option.
Fig. 10 is a schematic diagram of data processing in one embodiment. Specifically, referring to Fig. 10, the data processing model can be an image recognition model composed of four preprocessing submodels. The pending data obtained by the terminal is image data; the image data is input into the four preprocessing submodels in the data processing model, and the four preprocessing submodels recognize the animal in the image. The terminal obtains the preprocessed result cat from three preprocessing submodels, with a prejudged probability of 75%, and the preprocessed result dog from one preprocessing submodel, with a prejudged probability of 25%. If the preset probability condition is that the prejudged probability is greater than or equal to 75%, the terminal can screen out the preprocessed result meeting the preset probability condition, namely cat, and take cat as the processing result of the data processing model for the image data. If the preset probability condition is that the prejudged probability equals 100%, the terminal cannot screen out a preprocessed result.
It should be understood that although the steps in the flowcharts of Figs. 2-5 and 7-8 are displayed in the order indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and the steps may be executed in other orders. Moreover, at least some of the steps in Figs. 2-5 and 7-8 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; the execution order of these sub-steps or stages is not necessarily sequential, and they may be executed in turn or alternately with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in Fig. 11, a data processing apparatus 1100 is provided, comprising: a data acquisition module 1102, a data input module 1104, a result acquisition module 1106, a probability statistics module 1108 and a result generation module 1110, wherein:

The data acquisition module 1102 is configured to obtain pending data.

The data input module 1104 is configured to input the pending data into the data processing model.

The result acquisition module 1106 is configured to obtain the preprocessed results respectively output by the preprocessing submodels in the data processing model.

The probability statistics module 1108 is configured to count the prejudged probability corresponding to each preprocessed result.

The result generation module 1110 is configured to generate, according to the prejudged probability corresponding to each preprocessed result, the processing result corresponding to the pending data.

In this embodiment, the pending data is obtained and input into the multiple preprocessing submodels in the data processing model, so that the pending data is processed by the multiple preprocessing submodels simultaneously; the preprocessed results respectively output by the preprocessing submodels are obtained, and the prejudged probability corresponding to each preprocessed result is counted. According to the prejudged probabilities corresponding to the preprocessed results, the consistency of the multiple preprocessed results can be verified, and the processing result corresponding to the pending data is generated according to the prejudged probabilities, which improves the accuracy of the model's data processing.
In one embodiment, the data processing apparatus 1100 further includes a model training module, the model training module being configured to obtain multiple different initial submodels and training data; train each initial submodel with the training data to obtain multiple preprocessing submodels; and construct the data processing model according to the multiple preprocessing submodels.

In this embodiment, multiple different initial submodels and training data are obtained, and the same training data is input into the multiple different initial submodels for training to obtain multiple preprocessing submodels, which improves the reliability of the model training; the data processing model is constructed according to the multiple preprocessing submodels, which improves the efficiency of obtaining the data processing model.

In another embodiment, the model training module is further configured to obtain multiple identical initial submodels and training data; extract, from the training data, multiple training sample sets in one-to-one correspondence with the multiple initial submodels; train the corresponding initial submodel with each training sample set to obtain multiple preprocessing submodels; and construct the data processing model according to the multiple preprocessing submodels.

In this embodiment, multiple identical initial submodels and training data are obtained, and multiple training sample sets in one-to-one correspondence with the multiple initial submodels are extracted from the training data. For the multiple identical initial submodels, in the manner of controlled variables, different training sample sets are input for training to obtain multiple preprocessing submodels, which improves the reliability of the model training; the data processing model is constructed according to the multiple preprocessing submodels, which improves the efficiency of obtaining the data processing model.
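Extracting one training sample set per identical initial submodel can be sketched as follows. The embodiment only requires one set per submodel and does not specify the sampling scheme, so drawing samples with replacement (bootstrap-style) and the fixed set size are assumptions for illustration.

```python
import random

def extract_sample_sets(training_data, num_submodels, set_size, seed=0):
    """Draw one training sample set per initial submodel from the shared
    training data. Sampling with replacement is an assumption; the
    embodiment only requires sets in one-to-one correspondence with the
    initial submodels."""
    rng = random.Random(seed)  # fixed seed for reproducibility of the sketch
    return [[rng.choice(training_data) for _ in range(set_size)]
            for _ in range(num_submodels)]

sets = extract_sample_sets(list(range(100)), num_submodels=4, set_size=50)
print(len(sets), len(sets[0]))  # 4 50
```

Each initial submodel is then trained on its own sample set, so the resulting preprocessing submodels differ only in the data they saw.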
In one embodiment, the data input module 1104 specifically includes a structure extraction module, a model construction module and an input module, wherein:

The structure extraction module is configured to extract multiple different model substructures from a trained architectural source model.

The model construction module is configured to take each model substructure as a preprocessing submodel in the data processing model and construct the data processing model.

The input module is configured to input the pending data respectively into each preprocessing submodel in the data processing model.

In this embodiment, multiple different model substructures are obtained by extraction from the trained architectural source model, each model substructure is taken as a preprocessing submodel to construct the data processing model, and the pending data is respectively input into each preprocessing submodel in the data processing model. By randomly extracting model substructures from the architectural source model and taking the model substructures as preprocessing submodels, the reliability of the selected preprocessing submodels is ensured.
In one embodiment, the result generation module 1110 is configured to screen out, from the preprocessed results, the preprocessed results whose prejudged probabilities meet the preset probability condition; calculate the uncertainty of the screened preprocessed results; and generate, according to the uncertainty and the screened preprocessed results, the processing result corresponding to the pending data.

In this embodiment, the preprocessed results whose prejudged probabilities meet the preset probability condition are screened out from the preprocessed results, and the uncertainty of the screened preprocessed results is then calculated; the uncertainty reflects the credibility of the screened preprocessed results. When the processing result corresponding to the pending data is generated according to the preprocessed results, the uncertainty is included, which improves the accuracy of the processing result output by the model.
In one embodiment, the data processing apparatus 1100 further includes a notice generation module and a notice display module, wherein:

The notice generation module is configured to generate a processing-abnormal notice according to the preprocessed results when no processing result corresponding to the pending data is generated according to the prejudged probabilities.

The notice display module is configured to display the processing-abnormal notice.

In this embodiment, when no processing result corresponding to the pending data is generated according to the prejudged probabilities, that is, when no preprocessed result is screened out, a processing-abnormal notice is generated according to the preprocessed results and displayed, so that newly input pending data can be received again, which improves the reliability of the data processing.
For specific limitations on the data processing apparatus, reference may be made to the limitations on the data processing method above, which are not repeated here. Each module in the above data processing apparatus can be implemented in whole or in part by software, hardware or a combination thereof. The above modules may be embedded in or independent of a processor in the computer equipment in the form of hardware, or may be stored in a memory in the computer equipment in the form of software, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a computer equipment is provided. The computer equipment can be a terminal, and its internal structure diagram is shown in Fig. 12. The computer equipment includes a processor, a memory, a network interface, a display screen, an input unit and an image acquisition device connected by a system bus. The processor of the computer equipment is used to provide computing and control capabilities. The memory of the computer equipment includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer equipment is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements a data processing method. The display screen of the computer equipment can be a liquid crystal display or an electronic ink display; the input unit of the computer equipment can be a touch layer covering the display screen, a key, a trackball or a trackpad arranged on the housing of the computer equipment, or an external keyboard, trackpad or mouse. The image acquisition device is used to acquire image data, and the acquired image data will be used as pending data.

Those skilled in the art will understand that the structure shown in Fig. 12 is only a block diagram of part of the structure relevant to the solution of the present application and does not constitute a limitation on the computer equipment to which the solution of the present application is applied; a specific computer equipment may include more or fewer components than shown in the figure, combine certain components, or have a different component arrangement.
In one embodiment, a computer equipment is provided, including a memory, a processor and a computer program stored on the memory and executable on the processor. When the processor executes the computer program, the following steps are implemented: obtaining pending data; inputting the pending data into a data processing model; obtaining the preprocessed results respectively output by the preprocessing submodels in the data processing model; counting the prejudged probability corresponding to each preprocessed result; and generating, according to the prejudged probability corresponding to each preprocessed result, the processing result corresponding to the pending data.

In one embodiment, before the obtaining of the pending data, the following steps are further implemented when the processor executes the computer program: obtaining multiple different initial submodels and training data; training each initial submodel with the training data to obtain multiple preprocessing submodels; and constructing the data processing model according to the multiple preprocessing submodels.

In another embodiment, before the obtaining of the pending data, the following steps are further implemented when the processor executes the computer program: obtaining multiple identical initial submodels and training data; extracting, from the training data, multiple training sample sets in one-to-one correspondence with the multiple initial submodels; training the corresponding initial submodel with each training sample set to obtain multiple preprocessing submodels; and constructing the data processing model according to the multiple preprocessing submodels.

In one embodiment, inputting the pending data into the data processing model includes: extracting multiple different model substructures from a trained architectural source model; taking each model substructure as a preprocessing submodel in the data processing model and constructing the data processing model; and inputting the pending data respectively into each preprocessing submodel in the data processing model.

In one embodiment, generating the processing result corresponding to the pending data according to the prejudged probability corresponding to each preprocessed result includes: screening out, from the preprocessed results, the preprocessed results whose prejudged probabilities meet the preset probability condition; calculating the uncertainty of the screened preprocessed results; and generating, according to the uncertainty and the screened preprocessed results, the processing result corresponding to the pending data.

In one embodiment, after the counting of the prejudged probability corresponding to each preprocessed result, the following steps are further implemented when the processor executes the computer program: when no processing result corresponding to the pending data is generated according to the prejudged probabilities, generating a processing-abnormal notice according to the preprocessed results; and displaying the processing-abnormal notice.
In this embodiment, the pending data is obtained and input into the multiple preprocessing submodels in the data processing model, so that the pending data is processed by the multiple preprocessing submodels simultaneously; the preprocessed results respectively output by the preprocessing submodels are obtained, and the prejudged probability corresponding to each preprocessed result is counted. According to the prejudged probabilities corresponding to the preprocessed results, the consistency of the multiple preprocessed results can be verified, and the processing result corresponding to the pending data is generated according to the prejudged probabilities, which improves the accuracy of the model's data processing.

In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When the computer program is executed by a processor, the following steps are implemented: obtaining pending data; inputting the pending data into a data processing model; obtaining the preprocessed results respectively output by the preprocessing submodels in the data processing model; counting the prejudged probability corresponding to each preprocessed result; and generating, according to the prejudged probability corresponding to each preprocessed result, the processing result corresponding to the pending data.

In one embodiment, before the obtaining of the pending data, the following steps are further implemented when the computer program is executed by the processor: obtaining multiple different initial submodels and training data; training each initial submodel with the training data to obtain multiple preprocessing submodels; and constructing the data processing model according to the multiple preprocessing submodels.

In another embodiment, before the obtaining of the pending data, the following steps are further implemented when the computer program is executed by the processor: obtaining multiple identical initial submodels and training data; extracting, from the training data, multiple training sample sets in one-to-one correspondence with the multiple initial submodels; training the corresponding initial submodel with each training sample set to obtain multiple preprocessing submodels; and constructing the data processing model according to the multiple preprocessing submodels.

In one embodiment, inputting the pending data into the data processing model includes: extracting multiple different model substructures from a trained architectural source model; taking each model substructure as a preprocessing submodel in the data processing model and constructing the data processing model; and inputting the pending data respectively into each preprocessing submodel in the data processing model.

In one embodiment, generating the processing result corresponding to the pending data according to the prejudged probability corresponding to each preprocessed result includes: screening out, from the preprocessed results, the preprocessed results whose prejudged probabilities meet the preset probability condition; calculating the uncertainty of the screened preprocessed results; and generating, according to the uncertainty and the screened preprocessed results, the processing result corresponding to the pending data.

In one embodiment, after the counting of the prejudged probability corresponding to each preprocessed result, the following steps are further implemented when the computer program is executed by the processor: when no processing result corresponding to the pending data is generated according to the prejudged probabilities, generating a processing-abnormal notice according to the preprocessed results; and displaying the processing-abnormal notice.

In this embodiment, the pending data is obtained and input into the multiple preprocessing submodels in the data processing model, so that the pending data is processed by the multiple preprocessing submodels simultaneously; the preprocessed results respectively output by the preprocessing submodels are obtained, and the prejudged probability corresponding to each preprocessed result is counted. According to the prejudged probabilities corresponding to the preprocessed results, the consistency of the multiple preprocessed results can be verified, and the processing result corresponding to the pending data is generated according to the prejudged probabilities, which improves the accuracy of the model's data processing.
Those of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments can be implemented by instructing relevant hardware through a computer program, and the computer program can be stored in a non-volatile computer-readable storage medium; when executed, the computer program may include the processes of the embodiments of the above methods. Any reference to memory, storage, database or other media used in the embodiments provided in this application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments have been described; however, as long as there is no contradiction in the combination of these technical features, the combination should be considered to be within the scope of this specification.

The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the concept of the present application, and these all belong to the protection scope of the present application. Therefore, the protection scope of the present patent application shall be subject to the appended claims.
Claims (10)
1. A data processing method, the method comprising:
obtaining pending data;
inputting the pending data into a data processing model;
obtaining the pre-processed results respectively output by each pretreatment submodel in the data processing model;
counting the anticipation probability corresponding to each pre-processed result; and
generating, according to the anticipation probability corresponding to each pre-processed result, a processing result corresponding to the pending data.
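The method of claim 1 is, in effect, an ensemble vote: each pretreatment submodel produces a pre-processed result, and the share of submodels agreeing on a result serves as its anticipation probability. A minimal sketch of that flow follows; it is not the patented implementation, and the toy submodels and data are illustrative assumptions:

```python
from collections import Counter

def process(pending_data, submodels):
    """Sketch of the claimed flow: run every pretreatment submodel on the
    pending data, count how often each pre-processed result occurs (its
    anticipation probability), and return the most consistent result."""
    results = [model(pending_data) for model in submodels]
    counts = Counter(results)
    # Anticipation probability: fraction of submodels agreeing on a result.
    probs = {r: c / len(results) for r, c in counts.items()}
    best_result, best_prob = max(probs.items(), key=lambda kv: kv[1])
    return best_result, best_prob

# Toy submodels standing in for trained pretreatment submodels.
submodels = [lambda x: x % 3, lambda x: x % 3, lambda x: (x + 1) % 3]
result, prob = process(7, submodels)  # two of the three submodels agree
```

Consistency between submodels is thus quantified before any final result is committed, which is the property the description credits for improved accuracy.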
2. The method according to claim 1, wherein before obtaining the pending data, the method further comprises:
obtaining a plurality of different initial submodels and training data;
training each initial submodel with the training data to obtain a plurality of pretreatment submodels; and
constructing the data processing model from the plurality of pretreatment submodels.
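Claim 2 describes a heterogeneous ensemble: structurally different submodels trained on the same data. A hypothetical sketch, with toy predictor classes standing in for real initial submodels:

```python
class MeanPredictor:
    """Toy initial submodel: predicts the mean of its training data."""
    def fit(self, data):
        self.value = sum(data) / len(data)
    def __call__(self, _x):
        return self.value

class MedianPredictor:
    """A structurally different toy initial submodel: predicts the median."""
    def fit(self, data):
        ordered = sorted(data)
        self.value = ordered[len(ordered) // 2]
    def __call__(self, _x):
        return self.value

def build_model(initial_submodels, training_data):
    """Hypothetical sketch of claim 2: train several *different* initial
    submodels on the same training data; the trained list plays the role
    of the data processing model."""
    for submodel in initial_submodels:
        submodel.fit(training_data)
    return initial_submodels

model = build_model([MeanPredictor(), MedianPredictor()], [1, 2, 2, 9])
```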
3. The method according to claim 1, wherein before obtaining the pending data, the method further comprises:
obtaining a plurality of identical initial submodels and training data;
extracting, from the training data, a plurality of training sample sets in one-to-one correspondence with the plurality of initial submodels;
training each initial submodel with its corresponding training sample set to obtain a plurality of pretreatment submodels; and
constructing the data processing model from the plurality of pretreatment submodels.
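Claim 3 is the complementary strategy: identical submodels, each trained on its own sample set, much like bagging. A sketch under that bootstrap-sampling assumption (the sampling scheme and toy submodel are not specified by the claim):

```python
import random

class SampleMean:
    """Toy stand-in for one of the identical initial submodels."""
    def fit(self, sample):
        self.mean = sum(sample) / len(sample)

def train_bagged(training_data, n_models, sample_size, seed=0):
    """Hypothetical sketch of claim 3: draw one training sample set per
    initial submodel (sampling with replacement, bootstrap-style), train
    each submodel on its own set, and return the trained list as the
    ensemble making up the data processing model."""
    rng = random.Random(seed)
    submodels = []
    for _ in range(n_models):
        sample = [rng.choice(training_data) for _ in range(sample_size)]
        submodel = SampleMean()
        submodel.fit(sample)
        submodels.append(submodel)
    return submodels

ensemble = train_bagged(list(range(100)), n_models=5, sample_size=50)
```

Because each submodel sees a different sample, their disagreements on new data carry information, which the anticipation probabilities of claim 1 then exploit.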
4. The method according to claim 1, wherein inputting the pending data into the data processing model comprises:
extracting a plurality of different model substructures from a trained source model;
constructing the data processing model by using each model substructure as a pretreatment submodel in the data processing model; and
inputting the pending data into each pretreatment submodel of the data processing model respectively.
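One plausible (but assumed) reading of claim 4 is that the substructures are obtained by masking a different random subset of the source model's weights per submodel, in the spirit of fixed dropout masks; the claim itself does not fix the extraction mechanism. A sketch under that assumption:

```python
import random

def extract_substructures(weights, n_sub, keep_prob=0.8, seed=0):
    """Hypothetical reading of claim 4: derive several different model
    substructures from a single trained source model by keeping a
    different random subset of its weights in each substructure
    (a fixed dropout-style mask per submodel)."""
    rng = random.Random(seed)
    substructures = []
    for _ in range(n_sub):
        # 1 keeps the weight, 0 drops it; each substructure gets its own mask.
        mask = [1 if rng.random() < keep_prob else 0 for _ in weights]
        substructures.append([w * m for w, m in zip(weights, mask)])
    return substructures

source_weights = [0.5, -1.2, 0.3, 2.0, -0.7]
subnets = extract_substructures(source_weights, n_sub=3)
```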
5. The method according to claim 1, wherein generating, according to the anticipation probability corresponding to each pre-processed result, the processing result corresponding to the pending data comprises:
screening out, from the pre-processed results, those pre-processed results whose anticipation probabilities meet a predetermined probability condition;
calculating the uncertainty of the screened pre-processed results; and
generating the processing result corresponding to the pending data according to the uncertainty and the screened pre-processed results.
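A sketch of claim 5 with two assumed concrete choices, neither dictated by the claim: the "predetermined probability condition" is a simple threshold, and the uncertainty measure is the Shannon entropy of the result distribution (0 when all submodels agree):

```python
import math
from collections import Counter

def generate_result(preprocessed_results, threshold=0.5):
    """Hypothetical sketch of claim 5: keep the results whose anticipation
    probability meets the condition, and score disagreement between the
    submodels with Shannon entropy over the result distribution."""
    counts = Counter(preprocessed_results)
    probs = {r: c / len(preprocessed_results) for r, c in counts.items()}
    # Screen: keep only results meeting the predetermined probability condition.
    kept = {r: p for r, p in probs.items() if p >= threshold}
    # Entropy of the full distribution: 0 bits means full agreement.
    uncertainty = -sum(p * math.log2(p) for p in probs.values())
    return kept, uncertainty

kept, uncertainty = generate_result(["cat", "cat", "cat", "dog"])
```

A downstream step could then, for instance, accept the screened result only when the uncertainty falls below some limit, which matches the claim's use of both quantities.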
6. The method according to claim 1, wherein after counting the anticipation probability corresponding to each pre-processed result, the method further comprises:
when no processing result corresponding to the pending data is generated according to the anticipation probabilities, generating a processing-abnormality notice according to the pre-processed results; and
displaying the processing-abnormality notice.
7. A data processing device, wherein the device comprises:
a data acquisition module, configured to obtain pending data;
a data input module, configured to input the pending data into a data processing model;
a result obtaining module, configured to obtain the pre-processed results respectively output by each pretreatment submodel in the data processing model;
a probability statistics module, configured to count the anticipation probability corresponding to each pre-processed result; and
a result generation module, configured to generate, according to the anticipation probability corresponding to each pre-processed result, a processing result corresponding to the pending data.
8. The device according to claim 7, wherein the data input module comprises:
a structure extraction module, configured to extract a plurality of different model substructures from a trained processing model;
a model construction module, configured to construct the data processing model by using each model substructure as a pretreatment submodel in the data processing model; and
an input module, configured to input the pending data into each pretreatment submodel of the data processing model respectively.
9. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 6.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910012742.9A CN109886415A (en) | 2019-01-07 | 2019-01-07 | Data processing method, device, computer equipment and storage medium |
PCT/CN2020/070651 WO2020143610A1 (en) | 2019-01-07 | 2020-01-07 | Data processing method and apparatus, computer device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109886415A true CN109886415A (en) | 2019-06-14 |
Family
ID=66925695
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910012742.9A Pending CN109886415A (en) | 2019-01-07 | 2019-01-07 | Data processing method, device, computer equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109886415A (en) |
WO (1) | WO2020143610A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020143610A1 (en) * | 2019-01-07 | 2020-07-16 | 鲁班嫡系机器人(深圳)有限公司 | Data processing method and apparatus, computer device, and storage medium |
CN111079175A (en) * | 2019-11-26 | 2020-04-28 | 微民保险代理有限公司 | Data processing method, data processing device, computer readable storage medium and computer equipment |
WO2021129143A1 (en) * | 2019-12-28 | 2021-07-01 | 华为技术有限公司 | Multitask-based data analysis method, device and terminal equipment |
CN111984414A (en) * | 2020-08-21 | 2020-11-24 | 苏州浪潮智能科技有限公司 | Data processing method, system, equipment and readable storage medium |
CN111984414B (en) * | 2020-08-21 | 2022-05-24 | 苏州浪潮智能科技有限公司 | Data processing method, system, equipment and readable storage medium |
CN112272362A (en) * | 2020-09-11 | 2021-01-26 | 安徽中科新辰技术有限公司 | Method for realizing message notification sending service |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170262766A1 (en) * | 2016-03-08 | 2017-09-14 | Linkedin Corporation | Variable grouping for entity analysis |
CN107193836A (en) * | 2016-03-15 | 2017-09-22 | 腾讯科技(深圳)有限公司 | A kind of recognition methods and device |
CN107480774A (en) * | 2017-08-11 | 2017-12-15 | 山东师范大学 | Dynamic neural network model training method and device based on integrated study |
CN107527065A (en) * | 2017-07-25 | 2017-12-29 | 北京联合大学 | A kind of flower variety identification model method for building up based on convolutional neural networks |
CN108665457A (en) * | 2018-05-16 | 2018-10-16 | 腾讯科技(深圳)有限公司 | Image-recognizing method, device, storage medium and computer equipment |
CN109146076A (en) * | 2018-08-13 | 2019-01-04 | 东软集团股份有限公司 | model generating method and device, data processing method and device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107527091B (en) * | 2016-10-14 | 2021-05-25 | 腾讯科技(北京)有限公司 | Data processing method and device |
CN107463966B (en) * | 2017-08-17 | 2019-06-18 | 电子科技大学 | Radar range profile's target identification method based on dual-depth neural network |
CN107679491B (en) * | 2017-09-29 | 2020-05-19 | 华中师范大学 | 3D convolutional neural network sign language recognition method fusing multimodal data |
CN109886415A (en) * | 2019-01-07 | 2019-06-14 | 鲁班嫡系机器人(深圳)有限公司 | Data processing method, device, computer equipment and storage medium |
- 2019-01-07: CN CN201910012742.9A patent/CN109886415A/en, active, Pending
- 2020-01-07: WO PCT/CN2020/070651 patent/WO2020143610A1/en, active, Application Filing
Non-Patent Citations (4)
Title |
---|
LIU Boyun et al.: "State Assessment of Diesel Engines Based on Neural Network and D-S Evidence Theory", Vehicle Engine (《车用发动机》) * |
ZHANG Junying: "Binary Feedforward Artificial Neural Networks: Theory and Applications", Xidian University Press, 31 May 2001 * |
QIN Peng: "Research on Power Transformer Fault Diagnosis Based on Least Squares Support Vector Machine and D-S Evidence Theory", China Master's Theses Full-text Database, Engineering Science and Technology II * |
GAO Zhiqiang: "Deep Learning: From Introduction to Practice", China Railway Publishing House, 30 June 2018 * |
Also Published As
Publication number | Publication date |
---|---|
WO2020143610A1 (en) | 2020-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109886415A (en) | Data processing method, device, computer equipment and storage medium | |
JP6453968B2 (en) | Threshold change device | |
Amirkhani et al. | Exploiting experts’ knowledge for structure learning of Bayesian networks | |
Peel et al. | Detecting change points in the large-scale structure of evolving networks | |
Duval | Explainable artificial intelligence (XAI) | |
CN110472082A (en) | Data processing method, device, storage medium and electronic equipment | |
CN111523421A (en) | Multi-user behavior detection method and system based on deep learning and fusion of various interaction information | |
Song et al. | MSFYOLO: Feature fusion-based detection for small objects | |
RU2689818C1 (en) | Method of interpreting artificial neural networks | |
CN109656818A (en) | A kind of denseness system failure prediction method | |
CN112613349A (en) | Time sequence action detection method and device based on deep hybrid convolutional neural network | |
WO2022063076A1 (en) | Adversarial example identification method and apparatus | |
WO2020041859A1 (en) | System and method for building and using learning machines to understand and explain learning machines | |
CN116975743A (en) | Industry information classification method, device, computer equipment and storage medium | |
CN116501979A (en) | Information recommendation method, information recommendation device, computer equipment and computer readable storage medium | |
EP4064038B1 (en) | Automated generation and integration of an optimized regular expression | |
EP3821366A1 (en) | Systems, methods, and computer-readable media for improved table identification using a neural network | |
CN111638926A (en) | Method for realizing artificial intelligence in Django framework | |
Adeyiga et al. | A comparative analysis of selected clustering algorithms for criminal profiling | |
CN114692012A (en) | Electronic government affair recommendation method based on Bert neural collaborative filtering | |
Gomez et al. | Fuzzy sets in remote sensing classification | |
Fitrianah et al. | Fine-tuned mobilenetv2 and vgg16 algorithm for fish image classification | |
Adebayo | Towards Effective Tools for Debugging Machine Learning Models | |
Fang et al. | EPR: A neural network for automatic feature learning from code for defect prediction | |
CN115982646B (en) | Management method and system for multisource test data based on cloud platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
RJ01 | Rejection of invention patent application after publication ||
Application publication date: 20190614 |