CN109325541A - Method and apparatus for training models - Google Patents

Method and apparatus for training models

Info

Publication number
CN109325541A
CN109325541A (application CN201811157474.1A)
Authority
CN
China
Prior art keywords
machine learning
training
data
learning algorithm
data set
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811157474.1A
Other languages
Chinese (zh)
Inventor
李文峰
陈诗妮
宝腾飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201811157474.1A
Publication of CN109325541A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques

Abstract

Embodiments of the present application disclose a method and apparatus for training models. One specific embodiment of the method includes: acquiring a configuration file, where the configuration file includes a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information; selecting data from the data set indicated by the data set identifier as training data to generate a training data set; selecting, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier as the target machine learning framework; and, under the target machine learning framework, using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, training a classification model based on the training data set and the labels associated with the training data in the training data set. This embodiment allows users to select different machine learning frameworks and corresponding machine learning algorithms and parameters according to actual needs, extending the scope of application of model training.

Description

Method and apparatus for training models
Technical field
Embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for training models.
Background
As machine learning is gradually adopted across industries, general-purpose machine learning models often cannot satisfy users' customized AI (Artificial Intelligence) needs. Users typically want to use their own data to train customized models adapted to specific scenarios, such as image classification, object detection, speech recognition, and voiceprint recognition. Since different types of models may need to be trained under different types of machine learning frameworks, while existing model training systems usually support only a limited set of frameworks, existing model training systems generally cannot satisfy users' diverse model training needs.
Summary of the invention
Embodiments of the present application propose a method and apparatus for training models.
In a first aspect, an embodiment of the present application provides a method for training models, the method comprising: acquiring a configuration file, where the configuration file includes a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information; selecting data from the data set indicated by the data set identifier as training data to generate a training data set, where the data in the data set are associated with labels in advance; selecting, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier as the target machine learning framework; and, under the target machine learning framework, using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, training a classification model based on the training data set and the labels associated with the training data in the training data set.
In some embodiments, the configuration file further includes correspondence information characterizing the correspondence between the labels associated with the data set and target labels, where the target labels are the labels used for prediction during training; and training the classification model using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set, comprises: training the classification model using the above machine learning algorithm and parameters, based on the training data set, the labels associated with the training data in the training data set, and the correspondence information.
In some embodiments, the method further includes performing the following presentation operation: selecting data from the data set as test data to generate a test data set; performing class prediction on the test data in the test data set using the classification model to obtain prediction results; and, based on the labels associated with the test data in the test data set and the prediction results, evaluating the prediction performance of the classification model, generating an evaluation result, and presenting the evaluation result.
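The evaluation described here amounts to comparing the model's predictions against the labels associated with the test data. A minimal sketch follows; the choice of accuracy as the metric is an assumption for illustration, since the patent does not name a specific metric:

```python
def evaluate_classifier(predict, test_set):
    """Compute an evaluation result for a classification model.

    predict maps a test data item to a predicted label; test_set is a
    list of (data, label) pairs. Returns accuracy over the test set.
    """
    correct = sum(1 for data, label in test_set if predict(data) == label)
    return correct / len(test_set)

# Toy classifier on even/odd data, as a usage example.
test_set = [(1, "odd"), (2, "even"), (3, "odd"), (4, "odd")]
predict = lambda x: "even" if x % 2 == 0 else "odd"
print(evaluate_classifier(predict, test_set))  # 0.75
```

The returned score would then be presented to the user as the assessment result.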
In some embodiments, the presentation operation further includes the steps of: incrementing the number of training rounds; determining whether the number of training rounds has reached a target number of training rounds, and if so, ending the presentation operation; if the number of training rounds has not reached the target number, taking the classification model as an initial model, selecting data from the data set again as training data to generate a new training data set, and, under the target machine learning framework, using the above machine learning algorithm, training the initial model based on the regenerated training data set and the labels associated with its training data to obtain a classification model, and continuing to perform the presentation operation.
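Repeating the train-then-present cycle up to a target number of training rounds can be sketched as an outer loop; the round bookkeeping and function signatures below are illustrative assumptions, not structures specified by the patent:

```python
def run_training_rounds(train_round, present, target_rounds):
    """Repeat train-then-present until the target number of rounds is reached.

    train_round(model) retrains starting from the given initial model
    (None on the first round) and returns a new classification model;
    present(model) performs the evaluation/presentation operation.
    """
    model = None
    rounds = 0
    while rounds < target_rounds:
        model = train_round(model)
        present(model)
        rounds += 1  # increment the number of training rounds
    return model, rounds

model, rounds = run_training_rounds(
    train_round=lambda m: (m or 0) + 1,  # toy "model": counts retrainings
    present=lambda m: None,              # stand-in for the presentation step
    target_rounds=3,
)
print(rounds)  # 3
```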
In some embodiments, acquiring the configuration file includes: acquiring information configured by the user on a configuration interface and generating the configuration file based on that information, where the configuration interface includes a machine learning framework selection area, a machine learning algorithm selection area, and a parameter information selection area.
In some embodiments, show there is at least one machine learning algorithm title in machine learning algorithm chosen area Group, machine learning algorithm name group are divided according to the purposes of machine learning algorithm.
In a second aspect, an embodiment of the present application provides an apparatus for training models, the apparatus comprising: an acquisition unit configured to acquire a configuration file, where the configuration file includes a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information; a generation unit configured to select data from the data set indicated by the data set identifier as training data to generate a training data set, where the data in the data set are associated with labels in advance; a determination unit configured to select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier as the target machine learning framework; and a training unit configured to, under the target machine learning framework, train a classification model using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
In some embodiments, the configuration file further includes correspondence information characterizing the correspondence between the labels associated with the data set and target labels, where the target labels are the labels used for prediction during training; and the training unit is further configured to: train the classification model using the above machine learning algorithm and parameters, based on the training data set, the labels associated with the training data in the training data set, and the correspondence information.
In some embodiments, the apparatus further includes: a presentation unit configured to perform the following presentation operation: selecting data from the data set as test data to generate a test data set; performing class prediction on the test data in the test data set using the classification model to obtain prediction results; and, based on the labels associated with the test data in the test data set and the prediction results, evaluating the classification model, generating an evaluation result, and presenting the evaluation result.
In some embodiments, the presentation operation further includes the steps of: incrementing the number of training rounds; determining whether the number of training rounds has reached a target number of training rounds, and if so, ending the presentation operation; if the number of training rounds has not reached the target number, taking the classification model as an initial model, selecting data from the data set again as training data to generate a new training data set, and, under the target machine learning framework, using the above machine learning algorithm, training the initial model based on the regenerated training data set and the labels associated with its training data to obtain a classification model, and continuing to perform the presentation operation.
In some embodiments, the acquisition unit is further configured to: acquire information configured by the user on a configuration interface and generate the configuration file based on that information, where the configuration interface includes a machine learning framework selection area, a machine learning algorithm selection area, and a parameter information selection area.
In some embodiments, show there is at least one machine learning algorithm title in machine learning algorithm chosen area Group, machine learning algorithm name group are divided according to the purposes of machine learning algorithm.
In a third aspect, an embodiment of the present application provides an electronic device comprising: one or more processors; and a storage device on which one or more programs are stored; when the one or more programs are executed by the one or more processors, the one or more processors implement the method described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored; when the program is executed by a processor, the method described in any implementation of the first aspect is implemented.
The method and apparatus for training models provided by the embodiments of the present application acquire a configuration file, select data from the data set indicated by the data set identifier in the configuration file as training data to generate a training data set, select from a preset set of machine learning frameworks the framework indicated by the machine learning framework identifier in the configuration file as the target machine learning framework, and then, under the target machine learning framework, use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information to train a classification model based on the training data set and the labels associated with the training data in it. Users can thus select different machine learning frameworks and corresponding machine learning algorithms and parameters according to actual needs, which extends the scope of application of model training.
Brief description of the drawings
Other features, objects, and advantages of the present application will become more apparent by reading the following detailed description of non-restrictive embodiments with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which an embodiment of the present application may be applied;
Fig. 2 is a flowchart of one embodiment of the method for training models according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the method for training models according to the present application;
Fig. 4 is a flowchart of another embodiment of the method for training models according to the present application;
Fig. 5 is a structural schematic diagram of one embodiment of the apparatus for training models according to the present application;
Fig. 6 is a structural schematic diagram of a computer system adapted to implement the electronic device of the embodiments of the present application.
Detailed description of embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are used only to explain the related invention, not to limit it. It should also be noted that, for ease of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the method for training models or the apparatus for training models of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102, and 103, a network 104, and a server 105. The network 104 serves as a medium providing communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages, etc. Various communication client applications may be installed on the terminal devices 101, 102, 103, such as web browser applications and model training applications.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smartphones, tablet computers, laptop computers, and desktop computers. When the terminal devices 101, 102, 103 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server providing various services. For example, the server 105 may obtain a configuration file relevant to model training from the terminal devices 101, 102, 103, perform the corresponding model training operation according to the configuration file, and obtain a training result (such as a classification model).
It should be noted that the method for training models provided by the embodiments of the present application is generally performed by the server 105. Correspondingly, the apparatus for training models is generally disposed in the server 105.
It should be pointed out that server can be hardware, it is also possible to software.When server is hardware, may be implemented At the distributed server cluster that multiple servers generate, individual server also may be implemented into.It, can when server is software To be implemented as multiple softwares or software module (such as providing Distributed Services), single software or software also may be implemented into Module.It is not specifically limited herein.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks, and servers as required by the implementation.
With continued reference to Fig. 2, a flow 200 of one embodiment of the method for training models according to the present application is shown. The flow 200 of the method for training models includes the following steps:
Step 201: Acquire a configuration file.
In this embodiment, the executing subject of the method for training models (such as the server 105 shown in Fig. 1) may acquire the configuration file through a wired or wireless connection. The configuration file may include, for example, a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information. The parameter information may be used to indicate the parameters required by the machine learning algorithm indicated by the machine learning algorithm identifier, and may include, for example, parameter identifiers. It should be understood that an identifier may be represented by letters, numbers, or a combination thereof.
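As an illustration, such a configuration file might be serialized as JSON and validated on the server side. The patent does not specify a serialization format or field names, so everything below (`dataset_id`, `framework_id`, and so on) is a hypothetical sketch:

```python
import json

# A hypothetical configuration file holding the four pieces the text names:
# a data set identifier, a machine learning framework identifier, a machine
# learning algorithm identifier, and parameter information.
CONFIG_TEXT = """
{
  "dataset_id": "ds_001",
  "framework_id": "tensorflow",
  "algorithm_id": "cnn",
  "parameters": {"learning_rate": 0.01, "batch_size": 32}
}
"""

def load_config(text):
    """Parse the configuration file and check that the required fields exist."""
    config = json.loads(text)
    for field in ("dataset_id", "framework_id", "algorithm_id", "parameters"):
        if field not in config:
            raise ValueError(f"missing field: {field}")
    return config

config = load_config(CONFIG_TEXT)
print(config["framework_id"])  # tensorflow
```

A user could equally upload such a file through the configuration interface described below, or have the server generate it from interface selections.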
It should be noted that the data set, machine learning framework, machine learning algorithm, and parameters indicated respectively by the data set identifier, machine learning framework identifier, machine learning algorithm identifier, and parameter information in the configuration file may be stored in advance locally on the above executing subject. In addition, the data in the data set may be associated with labels in advance, where a label may be used to indicate the class of the associated data.
In this embodiment, the above executing subject may acquire a configuration file uploaded by the user through a terminal device (such as the terminal devices 101, 102, 103 shown in Fig. 1). The executing subject may provide the user with a configuration interface relevant to model training, through which the user may upload a pre-generated configuration file.
It should be emphasized that the above executing subject may be provided locally in advance with a set of machine learning frameworks, a set of machine learning algorithms, and parameters relevant to the machine learning algorithms, where the machine learning frameworks, machine learning algorithms, and parameters may all be provided with corresponding identifiers.
It should be understood that the machine learning frameworks may include but are not limited to TensorFlow, Caffe, Theano, Keras, Torch, MXNet, CNTK, PaddlePaddle, and so on. TensorFlow is a relatively high-level machine learning library with which users can easily design neural network structures. Caffe, whose full English name is Convolutional Architecture for Fast Feature Embedding, is a widely used open-source deep learning framework. Theano is a high-performance symbolic computation and deep learning library. Keras is a highly modular neural network library implemented in Python (an object-oriented interpreted programming language) that can run on both TensorFlow and Theano. Torch is a deep learning framework based on the Lua language with high scalability. MXNet is a deep learning framework supporting computation from a single machine to multiple GPUs (Graphics Processing Units) and multiple clusters. CNTK, through fine-grained component blocks, allows users to create new, complex model types without using low-level languages. PaddlePaddle is an open-source deep learning platform; the full English name of Paddle is Parallel Distributed Deep Learning. PaddlePaddle implements CPU (Central Processing Unit)/GPU single-machine and distributed modes, supports training on massive data with concurrent operation of hundreds of machines, and can easily handle large-scale data training.
The machine learning algorithms may include but are not limited to SDM (Supervised Descent Method), CNN (Convolutional Neural Network), RNN (Recurrent Neural Network), LSTM (Long Short-Term Memory), FastText, ULMFiT (Universal Language Model Fine-tuning), linear regression, logistic regression, decision trees, SVM (Support Vector Machine), naive Bayes, the K-nearest-neighbor algorithm, the K-means algorithm, the random forest algorithm, dimensionality reduction algorithms, Gradient Boost, the AdaBoost algorithm, and so on. It should be noted that the above algorithms are well-known techniques that are currently widely researched and applied, and are not described in detail here.
In practice, the machine learning framework identifier, machine learning algorithm identifier, and parameter identifiers in the parameter information in the configuration file may be the identifiers of the machine learning frameworks, machine learning algorithms, and parameters existing locally on the above executing subject.
In some optional implementations of this embodiment, the parameter information in the configuration file may also include parameter values in addition to parameter identifiers. These may be values set by the user according to actual needs.
In some optional implementations of this embodiment, the configuration file may also include correspondence information characterizing the correspondence between the labels associated with the above data set and target labels, where a target label may be a label used for prediction during training. As an example, suppose the labels associated with the above data set include "automobile", "motorcycle", "bicycle", and "tricycle", and the target labels include "motor vehicle" and "non-motor vehicle". The above correspondence information may then indicate the correspondence between the labels "automobile" and "motorcycle" and the target label "motor vehicle", and between the labels "bicycle" and "tricycle" and the target label "non-motor vehicle".
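The label-to-target-label correspondence in this example can be sketched as a lookup applied to each label before training; the dictionary representation below is an illustrative assumption, not a format specified by the patent:

```python
# Hypothetical correspondence information, following the motor vehicle /
# non-motor vehicle example: original data set labels -> target labels.
CORRESPONDENCE = {
    "automobile": "motor vehicle",
    "motorcycle": "motor vehicle",
    "bicycle": "non-motor vehicle",
    "tricycle": "non-motor vehicle",
}

def to_target_label(label):
    """Map a data set label to the target label used for prediction."""
    return CORRESPONDENCE[label]

labels = ["automobile", "bicycle", "motorcycle"]
print([to_target_label(l) for l in labels])
# ['motor vehicle', 'non-motor vehicle', 'motor vehicle']
```

During training, the target label produced by this mapping would serve as the model's expected output in place of the original label.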
In some optional implementations of this embodiment, the above configuration interface may include a machine learning framework selection area, a machine learning algorithm selection area, and a parameter selection area. The names of the machine learning frameworks in the above framework set may be displayed in the framework selection area. The names of the machine learning algorithms in the above algorithm set may be displayed in the algorithm selection area. The parameter names associated with the algorithm name displayed in the algorithm selection area may be displayed in the parameter selection area. The user may configure information by selecting the required information in the different areas of the above configuration interface. In addition, the configuration interface may also include input areas for entering the parameter values of the machine learning algorithm and other information (such as the above correspondence information). The above executing subject may thus acquire the information configured by the user on the configuration interface and generate the configuration file based on that information.
In some optional implementations of this embodiment, at least one machine learning algorithm name group may be displayed in the machine learning algorithm selection area. The algorithm name groups may be divided according to the purposes of the machine learning algorithms (such as face alignment, speech recognition, text recognition, image recognition, etc.). In this way, a user without a background in machine learning can quickly select the required algorithm. It should be noted that the algorithm name groups may be divided automatically by the above executing subject or divided manually; no specific limitation is imposed here.
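The grouping of algorithm names by purpose can be modeled as a purpose-to-names mapping rendered in the selection area; the particular group contents below merely echo examples from the text and are not a list from the patent:

```python
# Hypothetical algorithm name groups, divided by purpose as described.
ALGORITHM_GROUPS = {
    "image recognition": ["CNN", "random forest"],
    "speech recognition": ["RNN", "LSTM"],
    "text recognition": ["FastText", "naive Bayes"],
}

def algorithms_for(purpose):
    """Return the algorithm names displayed for a given purpose group."""
    return ALGORITHM_GROUPS.get(purpose, [])

print(algorithms_for("speech recognition"))  # ['RNN', 'LSTM']
```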
Step 202: Select data from the data set indicated by the data set identifier in the configuration file as training data, and generate a training data set.
In this embodiment, the above executing subject may select data from the data set indicated by the data set identifier in the configuration file as training data to generate a training data set. For example, the executing subject may select positive-example data and negative-example data from the data set as training data according to a set number of training data and a set ratio of positive to negative examples. As another example, the executing subject may select all the data in the data set as training data.
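Selection by a set size and positive/negative ratio could be sketched as follows; the sampling strategy, data representation, and fixed seed are illustrative assumptions rather than details from the patent:

```python
import random

def select_training_data(dataset, n_total, pos_ratio, seed=0):
    """Select n_total examples with roughly pos_ratio positives.

    dataset is a list of (data, label) pairs where the label is
    1 for a positive example and 0 for a negative example.
    """
    rng = random.Random(seed)
    positives = [ex for ex in dataset if ex[1] == 1]
    negatives = [ex for ex in dataset if ex[1] == 0]
    n_pos = int(n_total * pos_ratio)
    n_neg = n_total - n_pos
    chosen = rng.sample(positives, n_pos) + rng.sample(negatives, n_neg)
    rng.shuffle(chosen)
    return chosen

# Usage example on a toy data set of 50 positives and 50 negatives.
dataset = [(f"item{i}", i % 2) for i in range(100)]
train = select_training_data(dataset, n_total=10, pos_ratio=0.5)
print(len(train))  # 10
```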
Step 203: Select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier in the configuration file as the target machine learning framework.
In this embodiment, the above executing subject may select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier in the configuration file as the target machine learning framework.
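One plausible realization of this selection is a registry keyed by framework identifier; the identifiers and registry contents here are hypothetical, chosen to match the framework names listed earlier:

```python
# Hypothetical preset framework set, keyed by the identifiers that the
# configuration file's framework identifier would reference.
FRAMEWORK_REGISTRY = {
    "tensorflow": "TensorFlow",
    "caffe": "Caffe",
    "mxnet": "MXNet",
    "paddlepaddle": "PaddlePaddle",
}

def select_framework(framework_id):
    """Return the target framework, failing clearly for unknown identifiers."""
    try:
        return FRAMEWORK_REGISTRY[framework_id]
    except KeyError:
        raise ValueError(f"unknown framework identifier: {framework_id}")

print(select_framework("mxnet"))  # MXNet
```

Rejecting unknown identifiers up front keeps a misconfigured file from failing later, deep inside training.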
Step 204: Under the target machine learning framework, use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information to train a classification model based on the training data set and the labels associated with the training data in the training data set.
In this embodiment, after determining the target machine learning framework, the above executing subject may, under that framework, use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information to train a classification model based on the training data set and the labels associated with the training data in it.
As an example, the above executing subject may take the training data in the training data set as input and the labels associated with the training data as output to perform model training. During training, the executing subject may input training data from the training data set into the model being trained to obtain the corresponding prediction results, compare the prediction results with the labels associated with the training data, and determine from the comparison results whether the model has finished training. If it determines that training is complete, the executing subject may take the trained model as the classification model.
Here, the above executing subject may determine from the comparison results whether the model has reached a preset optimization objective; if so, it may determine that training is complete. The optimization objective may, for example, be that the accuracy of the prediction results is greater than a preset accuracy threshold. If the executing subject determines that the model has not reached the preset optimization objective, it may adjust the parameters of the model and continue training with other training data in the training data set.
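The train-until-accuracy-threshold control flow could look roughly like the framework-agnostic sketch below; the step/evaluate function interface, the threshold value, and the step cap are illustrative assumptions, not from the patent:

```python
def train_until_threshold(model_step, evaluate, accuracy_threshold=0.95,
                          max_steps=1000):
    """Run training steps until the preset optimization objective is met.

    model_step() performs one parameter-adjusting update; evaluate()
    returns the current accuracy of the model's predictions. Stops once
    accuracy exceeds the threshold, or after max_steps updates.
    """
    for step in range(max_steps):
        model_step()
        if evaluate() > accuracy_threshold:
            return step + 1  # number of steps taken
    return max_steps

# Toy stand-in for a real model: accuracy improves with each update.
state = {"acc": 0.0}
steps = train_until_threshold(
    model_step=lambda: state.__setitem__("acc", state["acc"] + 0.1),
    evaluate=lambda: state["acc"],
)
print(steps)  # 10
```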
Optionally, if the above executing subject has obtained prediction results corresponding to every training data item in the training data set and still determines that the model has not reached the preset optimization objective, it may also determine that model training is complete.
In some optional implementations of this embodiment, if the configuration file includes the above correspondence information, the executing subject may use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information to train the classification model based on the training data set, the labels associated with the training data in it, and the above correspondence information. Here, the executing subject may take the training data in the training data set as input and the target labels corresponding to the labels associated with the training data as output to perform model training.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the method for training models according to this embodiment. In the application scenario of Fig. 3, the server 301 may locally store a data set uploaded by the user in advance. Moreover, the server 301 may locally store in advance a set of machine learning frameworks, a set of machine learning algorithms, and parameters corresponding to the machine learning algorithms. When the user wants to perform model training, the user may, according to practical business needs and the above information prestored by the server 301, determine the data set, machine learning framework, machine learning algorithm, parameters, and other information to be used during training, then create the corresponding configuration file through the terminal device 302 and upload the configuration file to the server 301. The configuration file may include the data set name, machine learning framework name, machine learning algorithm name, and parameter information. After obtaining the above configuration file, the server 301 may select data from the locally stored data set indicated by the data set name as training data to generate a training data set 303, where the data in the data set may be associated with labels in advance. Then, the server 301 may select from the above framework set the machine learning framework indicated by the above framework name as the target machine learning framework 304. Finally, under the target machine learning framework 304, the server 301 may use the machine learning algorithm 305 indicated by the above algorithm name in the locally prestored algorithm set and the parameters 306 indicated by the above parameter information, take the training data in the training data set 303 as input and the labels associated with the training data as output, and train to obtain a classification model 307.
In the method provided by the above embodiment of the present application, a configuration file is obtained; data are then selected as training data from the data set indicated by the data set identifier in the configuration file, generating a training data set; the machine learning framework indicated by the machine learning framework identifier in the configuration file is then selected from a preset set of machine learning frameworks as the target machine learning framework; and, under the target machine learning framework, the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information are used to train a classification model based on the training data set and the labels associated with the training data in the training data set. This enables a user to select different machine learning frameworks and the corresponding machine learning algorithms and parameters according to actual needs, extending the scope of application of model training.
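The configuration-driven flow summarized above can be sketched in a few lines of Python. Everything here is an illustrative assumption rather than the patent's implementation: the registry names (`FRAMEWORKS`, `ALGORITHMS`), the identifier strings, the toy `MajorityClassifier` algorithm, and the take-the-first-half selection strategy are all invented for the sketch.

```python
class MajorityClassifier:
    """Toy stand-in for a real classification algorithm."""
    def __init__(self, smoothing=0):
        self.smoothing = smoothing  # illustrative configured parameter
        self.majority = None

    def fit(self, samples, labels):
        # Learn the most frequent associated label.
        self.majority = max(set(labels), key=labels.count)
        return self

    def predict(self, sample):
        return self.majority

# Hypothetical registries standing in for the server's pre-stored
# framework set, algorithm set, and per-algorithm parameters.
FRAMEWORKS = {"fw_local": "a locally pre-stored machine learning framework"}
ALGORITHMS = {"algo_majority": lambda params: MajorityClassifier(**params)}

def train_from_config(config, datasets):
    # Select training data from the data set the data set identifier
    # indicates (first half here, as the simplest selection strategy).
    data = datasets[config["dataset_id"]]
    training_set = data[: len(data) // 2]
    # Pick the target machine learning framework (a plain lookup here).
    framework = FRAMEWORKS[config["framework_id"]]
    # Build the indicated algorithm with the configured parameters and
    # train on (data, associated label) pairs.
    model = ALGORITHMS[config["algorithm_id"]](config["params"])
    samples = [s for s, _ in training_set]
    labels = [l for _, l in training_set]
    return model.fit(samples, labels)

config = {"dataset_id": "ds1", "framework_id": "fw_local",
          "algorithm_id": "algo_majority", "params": {"smoothing": 1}}
datasets = {"ds1": [([0.1], "neg"), ([0.9], "pos"), ([0.8], "pos"),
                    ([0.2], "neg"), ([0.7], "pos"), ([0.6], "pos")]}
model = train_from_config(config, datasets)
```

The point of the sketch is the indirection: the data set, framework, algorithm, and parameters are all resolved from identifiers in the configuration file, so swapping any of them requires only a different configuration.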
With further reference to Fig. 4, a flow 400 of another embodiment of the method for training a model is illustrated. The flow 400 of the method for training a model includes the following steps:
Step 401: obtain a configuration file.
Step 402: select data as training data from the data set indicated by the data set identifier in the configuration file, generating a training data set.
Step 403: select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier in the configuration file as the target machine learning framework.
Step 404: under the target machine learning framework, train a classification model using the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
In this embodiment, for the description of steps 401-404, reference may be made to the related description of steps 201-204 in the embodiment shown in Fig. 2, and details are not repeated here.
Step 405: select data from the data set as test data, generating a test data set.
In this embodiment, the execution body of the method for training a model (for example, the server 105 shown in Fig. 1) may select data from the data set as test data, generating a test data set. Here, the test data selection method may include, for example, randomly selecting data from the data set as test data. The test data selection method may also include, for example, selecting positive-example data and negative-example data from the data set as test data according to a set number of test data items and a set ratio of positive examples to negative examples. It should be noted that the data in the test data set and the training data set may be the same or different; this is not specifically limited here.
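The two selection strategies mentioned above can be sketched as follows. The function names, the `"pos"`/`"neg"` labels, and the `is_positive` predicate are illustrative assumptions, not part of the disclosure.

```python
import random

def sample_random(data, k, seed=0):
    """Strategy 1: randomly select k items from the data set as test data."""
    return random.Random(seed).sample(data, k)

def sample_by_ratio(data, k, pos_ratio, is_positive):
    """Strategy 2: select k items honoring a set positive/negative ratio."""
    positives = [d for d in data if is_positive(d)]
    negatives = [d for d in data if not is_positive(d)]
    n_pos = round(k * pos_ratio)
    return positives[:n_pos] + negatives[:k - n_pos]

# Toy labeled data: (sample, associated label) pairs.
data = [(i, "pos" if i % 3 == 0 else "neg") for i in range(12)]
test_set = sample_by_ratio(data, k=4, pos_ratio=0.5,
                           is_positive=lambda d: d[1] == "pos")
```

Strategy 2 matches the ratio-based selection described in this step: the caller fixes both the number of test items and the share of positive examples among them.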
Step 406: perform class prediction on the test data in the test data set using the classification model, obtaining prediction results.
In this embodiment, the execution body may use the classification model to perform class prediction on the test data in the test data set, obtaining prediction results. For example, the execution body may input the test data in the test data set into the classification model, so that the classification model outputs a prediction result corresponding to the test data. The prediction result may include the label predicted by the classification model as corresponding to the test data.
Step 407: evaluate the prediction performance of the classification model based on the labels associated with the test data in the test data set and the prediction results, generate an evaluation result, and present the evaluation result.
In this embodiment, the execution body may evaluate the prediction performance of the classification model based on the labels associated with the test data in the test data set and the prediction results obtained in step 406, generate an evaluation result, and present the evaluation result.
As an example, for each item of test data in the test data set, the execution body may compare the label associated with that test data with the corresponding prediction result to determine whether the prediction result is correct. The execution body may then count the number of correct prediction results and calculate the ratio between that number and the total number of test data items in the test data set. The execution body may determine this ratio as the actual prediction accuracy of the classification model, generate an evaluation result including the actual prediction accuracy, and present the evaluation result.
Optionally, the execution body may also compare the actual prediction accuracy with an accuracy threshold to obtain a comparison result, and generate an evaluation result including both the actual prediction accuracy and the comparison result.
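The evaluation described in step 407, including the optional threshold comparison, can be sketched as follows; the dictionary keys and the threshold value are illustrative assumptions.

```python
def evaluate(model_predict, test_set, accuracy_threshold=None):
    """Compare predicted labels against the associated labels and compute
    the actual prediction accuracy (step 407)."""
    correct = sum(1 for sample, label in test_set
                  if model_predict(sample) == label)
    accuracy = correct / len(test_set)
    result = {"accuracy": accuracy}
    if accuracy_threshold is not None:
        # Optional comparison against a preset accuracy threshold.
        result["meets_threshold"] = accuracy >= accuracy_threshold
    return result

test_set = [([0.9], "pos"), ([0.8], "pos"), ([0.1], "neg"), ([0.2], "pos")]
report = evaluate(lambda s: "pos" if s[0] > 0.5 else "neg",
                  test_set, accuracy_threshold=0.8)
```

With the toy data above, three of four predictions match the associated labels, so the report carries an accuracy of 0.75 and a failed threshold comparison.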
Step 408: perform an increment operation on the number of training rounds.
In this embodiment, the execution body may perform an increment operation on the number of training rounds. The initial value of the number of training rounds may be 0, and the step size used by the execution body when performing the increment operation may be 1.
Since the execution body has already performed one round of model training in step 404, the first time the execution body performs step 408 it increments the number of training rounds by 1, at which point the number of training rounds is 1.
Step 409: determine whether the number of training rounds has reached a target number of training rounds.
In this embodiment, after performing the increment operation on the number of training rounds, the execution body may determine whether the number of training rounds has reached the target number of training rounds. The target number of training rounds may be preset by the execution body or specified by the user; for example, the configuration file may include the target number of training rounds.
Step 410: in response to determining that the number of training rounds has reached the target number of training rounds, perform an end operation.
In this embodiment, in response to determining that the number of training rounds has reached the target number of training rounds, the execution body may perform an end operation to end the cyclic execution of steps 405-411.
Step 411: in response to determining that the number of training rounds has not reached the target number of training rounds, take the classification model as an initial model, select data again from the data set as training data, generate a training data set, and, under the target machine learning framework, train the initial model using the machine learning algorithm based on the regenerated training data set and the labels associated with the training data in the training data set, obtaining a classification model.
In this embodiment, in response to determining that the number of training rounds has not reached the target number of training rounds, the execution body may take the classification model as an initial model, select data again from the data set as training data, and generate a training data set. Then, under the target machine learning framework, the execution body may use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file to train the initial model based on the regenerated training data set and the labels associated with the training data in the training data set, obtaining a classification model. The execution body may then proceed to perform step 405.
As an example, the execution body may take the training data in the training data set as input and the labels associated with that training data as output to train the initial model. The training process may include: selecting training data from the training data set, inputting the training data into the initial model, and obtaining a prediction result; comparing the label associated with the training data with the prediction result; determining, according to the comparison result, whether training of the initial model is complete; and, if it is determined that training of the initial model is complete, taking the trained initial model as the classification model. Here, for the method of determining whether training of the initial model is complete, reference may be made to the related description of step 204 in the embodiment shown in Fig. 2, and details are not repeated here.
If the execution body determines that training of the initial model is not complete, the execution body may adjust the parameters of the initial model, then select other training data from the training data set and continue to perform the above training process.
Optionally, if the execution body still determines that the initial model has not reached a preset optimization goal after obtaining the prediction results corresponding to every item of training data in the training data set, the execution body may also determine that model training is complete.
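The round-controlled loop of steps 404-411 can be sketched as follows. The callable parameters (`train_round`, `evaluate_round`) are illustrative stand-ins for the training and evaluation described in the embodiment, not names from the patent.

```python
def multi_round_training(train_round, evaluate_round, target_rounds):
    """Sketch of the step 404-411 loop: train, evaluate, and repeat until
    the round counter reaches the target number of training rounds."""
    model = train_round(initial_model=None)       # step 404: first round
    rounds = 0
    results = []
    while True:
        results.append(evaluate_round(model))     # steps 405-407: evaluate
        rounds += 1                               # step 408: increment
        if rounds >= target_rounds:               # steps 409-410: end
            break
        model = train_round(initial_model=model)  # step 411: retrain
    return model, results

# Toy stand-ins: "training" just increments a counter carried by the model.
model, results = multi_round_training(
    train_round=lambda initial_model: (initial_model or 0) + 1,
    evaluate_round=lambda m: {"round_model": m},
    target_rounds=3)
```

Note the shape of the loop: one round of training always precedes the first evaluation, and each subsequent round reuses the previous classification model as the initial model, which is exactly the ordering steps 404-411 describe.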
In some optional implementations of this embodiment, if the configuration file includes correspondence relationship information for characterizing the correspondence between the labels associated with the data set and target labels, the execution body may, under the target machine learning framework, use the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file to train the initial model based on the regenerated training data set, the labels associated with the training data in the training data set, and the correspondence relationship information, obtaining a classification model. Here, a target label may be a label used for prediction during training. The execution body may take the training data in the training data set as input and the target labels corresponding to the labels associated with that training data as output to train the initial model.
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the method for training a model in this embodiment highlights controlling the number of training rounds based on the target number of training rounds and, after each round of training is completed, evaluating the prediction performance of the classification model, generating an evaluation result, and presenting the evaluation result. The scheme described in this embodiment can thus implement multi-round model training and make it convenient for the user to view the evaluation result of the prediction performance of the classification model trained in each round.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for training a model. The apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus may be specifically applied to various electronic devices.
As shown in Fig. 5, the apparatus 500 for training a model of this embodiment includes: an acquiring unit 501 configured to obtain a configuration file, where the configuration file may include a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information; a generation unit 502 configured to select data as training data from the data set indicated by the data set identifier, generating a training data set, where the data in the data set may be associated with labels in advance; a determination unit 503 configured to select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier as the target machine learning framework; and a training unit 504 configured to, under the target machine learning framework, train a classification model using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
In this embodiment, for the specific processing of the acquiring unit 501, the generation unit 502, the determination unit 503, and the training unit 504 of the apparatus 500 for training a model, and the technical effects thereof, reference may be made to the related descriptions of step 201, step 202, step 203, and step 204 in the embodiment corresponding to Fig. 2, respectively, and details are not repeated here.
In some optional implementations of this embodiment, the configuration file may further include correspondence relationship information for characterizing the correspondence between the labels associated with the data set and target labels, where a target label may be a label used for prediction during training; and the training unit 504 may be further configured to: train the classification model using the machine learning algorithm and the parameters, based on the training data set, the labels associated with the training data in the training data set, and the correspondence relationship information.
In some optional implementations of this embodiment, the apparatus 500 may further include a display unit (not shown in the figure) configured to perform the following presentation operation: selecting data from the data set as test data, generating a test data set; performing class prediction on the test data in the test data set using the classification model, obtaining prediction results; and evaluating the prediction performance of the classification model based on the labels associated with the test data in the test data set and the prediction results, generating an evaluation result, and presenting the evaluation result.
In some optional implementations of this embodiment, the presentation operation may further include: performing an increment operation on the number of training rounds; determining whether the number of training rounds has reached a target number of training rounds and, if so, ending the presentation operation; and, if the number of training rounds has not reached the target number of training rounds, taking the classification model as an initial model, selecting data again from the data set as training data, generating a training data set, training the initial model under the target machine learning framework using the machine learning algorithm based on the regenerated training data set and the labels associated with the training data in the training data set, obtaining a classification model, and continuing to perform the presentation operation.
In some optional implementations of this embodiment, the acquiring unit 501 may be further configured to: obtain information configured by a user on a configuration interface and generate a configuration file based on the information, where the configuration interface may include a machine learning framework selection area, a machine learning algorithm selection area, and a parameter information selection area.
In some optional implementations of this embodiment, at least one machine learning algorithm name group may be displayed in the machine learning algorithm selection area, and the machine learning algorithm name groups may be divided according to the purposes of the machine learning algorithms.
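One way the purpose-based grouping of algorithm names might be represented is sketched below; every group name and algorithm name here is invented for illustration and does not come from the patent.

```python
# Hypothetical grouping of algorithm names by purpose, as the machine
# learning algorithm selection area of the configuration interface might
# display them.
ALGORITHM_NAME_GROUPS = {
    "classification": ["logistic_regression", "random_forest", "svm"],
    "regression": ["linear_regression", "gbdt_regressor"],
    "clustering": ["kmeans", "dbscan"],
}

def algorithms_for(purpose):
    """Return the algorithm names displayed for a given purpose."""
    return ALGORITHM_NAME_GROUPS.get(purpose, [])
```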
In the apparatus provided by the above embodiment of the present application, a configuration file is obtained; data are then selected as training data from the data set indicated by the data set identifier in the configuration file, generating a training data set; the machine learning framework indicated by the machine learning framework identifier in the configuration file is then selected from a preset set of machine learning frameworks as the target machine learning framework; and, under the target machine learning framework, the machine learning algorithm indicated by the machine learning algorithm identifier in the configuration file and the parameters indicated by the parameter information are used to train a classification model based on the training data set and the labels associated with the training data in the training data set. This enables a user to select different machine learning frameworks and the corresponding machine learning algorithms and parameters according to actual needs, extending the scope of application of model training.
Referring now to Fig. 6, a structural schematic diagram of a computer system 600 of an electronic device (for example, the server 105 shown in Fig. 1) suitable for implementing the embodiments of the present application is illustrated. The electronic device shown in Fig. 6 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 610 as needed, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above-described functions defined in the system of the present application are performed.
It should be noted that the computer-readable medium described in the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and the computer-readable medium can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical cable, RF, or any suitable combination of the above.
Computer program code for performing the operations of the present application may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should further be noted that each block in a block diagram or flowchart, and combinations of blocks in a block diagram or flowchart, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor; for example, a processor may be described as comprising an acquiring unit, a generation unit, a determination unit, and a training unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the acquiring unit may also be described as "a unit that obtains a configuration file".
In another aspect, the present application also provides a computer-readable medium. The computer-readable medium may be included in the electronic device described in the above embodiments, or may exist alone without being assembled into the electronic device. The computer-readable medium carries one or more programs that, when executed by the electronic device, cause the electronic device to: obtain a configuration file, where the configuration file may include a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information; select data as training data from the data set indicated by the data set identifier, generating a training data set, where the data in the data set may be associated with labels in advance; select, from a preset set of machine learning frameworks, the machine learning framework indicated by the machine learning framework identifier as the target machine learning framework; and, under the target machine learning framework, train a classification model using the machine learning algorithm indicated by the machine learning algorithm identifier and the parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
The above description is merely a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (14)

1. A method for training a model, comprising:
obtaining a configuration file, wherein the configuration file includes a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information;
selecting data as training data from a data set indicated by the data set identifier, generating a training data set, wherein data in the data set are associated with labels in advance;
selecting, from a preset set of machine learning frameworks, a machine learning framework indicated by the machine learning framework identifier as a target machine learning framework; and
under the target machine learning framework, training a classification model using a machine learning algorithm indicated by the machine learning algorithm identifier and parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
2. The method according to claim 1, wherein the configuration file further includes correspondence relationship information for characterizing a correspondence between the labels associated with the data set and target labels, the target labels being labels used for prediction during training; and
the training a classification model using a machine learning algorithm indicated by the machine learning algorithm identifier and parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set, comprises:
training the classification model using the machine learning algorithm and the parameters, based on the training data set, the labels associated with the training data in the training data set, and the correspondence relationship information.
3. The method according to claim 1, wherein the method further comprises:
performing the following presentation operation:
selecting data from the data set as test data, generating a test data set;
performing class prediction on the test data in the test data set using the classification model, obtaining prediction results; and
evaluating prediction performance of the classification model based on the labels associated with the test data in the test data set and the prediction results, generating an evaluation result, and presenting the evaluation result.
4. The method according to claim 3, wherein the presentation operation further comprises:
performing an increment operation on a number of training rounds;
determining whether the number of training rounds has reached a target number of training rounds, and if so, ending the presentation operation; and
if the number of training rounds has not reached the target number of training rounds, taking the classification model as an initial model, selecting data again from the data set as training data, generating a training data set, training the initial model under the target machine learning framework using the machine learning algorithm based on the regenerated training data set and the labels associated with the training data in the training data set, obtaining a classification model, and continuing to perform the presentation operation.
5. The method according to claim 1, wherein the obtaining a configuration file comprises:
obtaining information configured by a user on a configuration interface and generating the configuration file based on the information, wherein the configuration interface includes a machine learning framework selection area, a machine learning algorithm selection area, and a parameter information selection area.
6. The method according to claim 5, wherein at least one machine learning algorithm name group is displayed in the machine learning algorithm selection area, the machine learning algorithm name groups being divided according to purposes of machine learning algorithms.
7. An apparatus for training a model, comprising:
an acquiring unit configured to obtain a configuration file, wherein the configuration file includes a data set identifier, a machine learning framework identifier, a machine learning algorithm identifier, and parameter information;
a generation unit configured to select data as training data from a data set indicated by the data set identifier, generating a training data set, wherein data in the data set are associated with labels in advance;
a determination unit configured to select, from a preset set of machine learning frameworks, a machine learning framework indicated by the machine learning framework identifier as a target machine learning framework; and
a training unit configured to, under the target machine learning framework, train a classification model using a machine learning algorithm indicated by the machine learning algorithm identifier and parameters indicated by the parameter information, based on the training data set and the labels associated with the training data in the training data set.
8. The apparatus according to claim 7, wherein the configuration file further includes correspondence relationship information for characterizing a correspondence between the labels associated with the data set and target labels, the target labels being labels used for prediction during training; and
the training unit is further configured to:
train the classification model using the machine learning algorithm and the parameters, based on the training data set, the labels associated with the training data in the training data set, and the correspondence relationship information.
9. The apparatus according to claim 7, wherein the apparatus further includes:
a display unit, configured to perform the following presentation operations:
selecting data from the data set as test data to generate a test data set;
performing class prediction on the test data in the test data set using the classification model, to obtain prediction results;
performing a prediction-effect evaluation of the classification model based on the labels associated with the test data in the test data set and the prediction results, generating an evaluation result, and displaying the evaluation result.
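The evaluation step of claim 9 could be sketched as below; the claim leaves the evaluation metric open, so accuracy is used here purely as one possible choice, and the stub model is hypothetical:

```python
def evaluate(model_predict, test_X, test_y):
    """Predict classes for the test data set and score the predictions
    against the labels associated with the test data."""
    predictions = model_predict(test_X)
    correct = sum(p == t for p, t in zip(predictions, test_y))
    return {"accuracy": correct / len(test_y)}

# A stub model that always predicts class 1, purely for illustration.
result = evaluate(lambda xs: [1 for _ in xs],
                  [0.2, 0.8, 0.9, 1.1],   # test data
                  [0, 1, 1, 1])           # associated labels
print(result)  # → {'accuracy': 0.75}
```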
10. The apparatus according to claim 9, wherein the presentation operations further include:
incrementing the number of training rounds;
determining whether the number of training rounds has reached the target number of training rounds, and if so, ending the presentation operations;
if the number of training rounds has not reached the target number of training rounds, using the classification model as an initial model, selecting data from the data set again as training data to generate a new training data set, training the initial model under the target machine learning framework using the machine learning algorithm, based on the training data in the regenerated training data set and the labels associated with the training data, to obtain a classification model, and continuing to perform the presentation operations.
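The round-counting loop of claim 10 amounts to: train, evaluate, increment, and stop once the target number of rounds is reached, feeding each round's model into the next as the initial model. A minimal sketch with invented toy train/evaluate callables:

```python
def iterate_training(train_round, evaluate, target_rounds):
    """Repeat train-then-evaluate until the target number of training
    rounds is reached; each round's model becomes the next initial model."""
    model, round_no, history = None, 0, []
    while round_no < target_rounds:
        model = train_round(model)       # previous model is the initial model
        history.append(evaluate(model))  # assess (and display) each round
        round_no += 1                    # increment the number of rounds
    return model, history

# Toy "training": each round nudges a single weight halfway toward 1.0,
# and "evaluation" reports the remaining distance to 1.0.
model, history = iterate_training(
    train_round=lambda m: ((m or 0.0) + 1.0) / 2,
    evaluate=lambda m: round(abs(1.0 - m), 3),
    target_rounds=3,
)
print(model, history)  # → 0.875 [0.5, 0.25, 0.125]
```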
11. The apparatus according to claim 7, wherein the acquiring unit is further configured to:
acquire information configured by a user on a configuration interface, and generate the configuration file based on that information, wherein the configuration interface includes a machine learning framework selection area, a machine learning algorithm selection area, and a parameter information selection area.
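One way the configuration-file generation of claim 11 might look in practice: serialize the user's selections from the three selection areas into a configuration file. Every field name and value below is a hypothetical example, not part of the claim:

```python
import json

# Hypothetical selections gathered from the three claimed selection areas.
ui_selection = {
    "framework": "toy",               # framework selection area
    "algorithm": "nearest_centroid",  # algorithm selection area
    "params": {"max_rounds": 3},      # parameter information selection area
}

def build_config(selection, dataset_id):
    """Serialize the user's interface selections into a configuration file."""
    return json.dumps({
        "dataset_id": dataset_id,
        "framework_id": selection["framework"],
        "algorithm_id": selection["algorithm"],
        "params": selection["params"],
    })

config = build_config(ui_selection, dataset_id="demo")
print(json.loads(config)["framework_id"])  # → toy
```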
12. The apparatus according to claim 11, wherein at least one machine learning algorithm name group is displayed in the machine learning algorithm selection area, the machine learning algorithm name groups being divided according to the purposes of the machine learning algorithms.
13. An electronic device, comprising:
one or more processors; and
a storage device on which one or more programs are stored,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 6.
14. A computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN201811157474.1A 2018-09-30 2018-09-30 Method and apparatus for training pattern Pending CN109325541A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811157474.1A CN109325541A (en) 2018-09-30 2018-09-30 Method and apparatus for training pattern


Publications (1)

Publication Number Publication Date
CN109325541A true CN109325541A (en) 2019-02-12

Family

ID=65266345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811157474.1A Pending CN109325541A (en) 2018-09-30 2018-09-30 Method and apparatus for training pattern

Country Status (1)

Country Link
CN (1) CN109325541A (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782976A (en) * 2010-01-15 2010-07-21 南京邮电大学 Automatic selection method for machine learning in cloud computing environment
CN106648654A (en) * 2016-12-20 2017-05-10 深圳先进技术研究院 Data sensing-based Spark configuration parameter automatic optimization method
CN108279890A (en) * Graphical machine learning algorithm platform, and method for constructing and publishing components based on a graphical machine learning algorithm platform
CN107103050A (en) * Big data modeling platform and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Min: "Introduction to Cognitive Computing" (《认知计算导论》), 31 May 2017, Wuhan: Huazhong University of Science and Technology Press *

Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111626760B (en) * 2019-02-28 2023-09-08 富士通株式会社 Distribution method and apparatus, extraction method and apparatus, and computer-readable recording medium
CN111626760A (en) * 2019-02-28 2020-09-04 富士通株式会社 Distribution method and apparatus, extraction method and apparatus, and computer-readable recording medium
CN111626762A (en) * 2019-02-28 2020-09-04 富士通株式会社 Computer-readable recording medium, determination method, and information processing apparatus
CN109918205B (en) * 2019-03-25 2023-11-17 深圳市网心科技有限公司 Edge equipment scheduling method, system, device and computer storage medium
CN109918205A (en) * Edge equipment scheduling method, system, device and computer storage medium
CN111797126A (en) * 2019-04-08 2020-10-20 阿里巴巴集团控股有限公司 Data processing method, device and equipment
CN111797126B (en) * 2019-04-08 2024-04-02 阿里巴巴集团控股有限公司 Data processing method, device and equipment
CN110008028A (en) * 2019-04-10 2019-07-12 北京旷视科技有限公司 Computational resource allocation method, apparatus, computer equipment and storage medium
CN110322337A (en) * Method and device for automatic checking and replying of inquiry services
CN110232130A (en) * 2019-05-20 2019-09-13 平安科技(深圳)有限公司 Metadata management pedigree generation method, device, computer equipment and storage medium
CN110232130B (en) * 2019-05-20 2024-02-02 平安科技(深圳)有限公司 Metadata management pedigree generation method, apparatus, computer device and storage medium
CN110263938A (en) * 2019-06-19 2019-09-20 北京百度网讯科技有限公司 Method and apparatus for generating information
US11436540B2 (en) 2019-06-19 2022-09-06 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for generating information
CN110276456A (en) * Machine learning model aided construction method, system, device and medium
CN110211591B (en) * 2019-06-24 2021-12-21 卓尔智联(武汉)研究院有限公司 Interview data analysis method based on emotion classification, computer device and medium
CN110211591A (en) * Interview data analysis method based on emotion classification, computer device and medium
CN110321619A (en) * 2019-06-26 2019-10-11 深圳技术大学 Parametrization customizing model generation method based on voice data
CN110389834A (en) * Method and device for submitting a deep learning training task
CN110389834B (en) * 2019-06-28 2022-07-12 苏州浪潮智能科技有限公司 Method and device for submitting deep learning training task
CN112183758A (en) * 2019-07-04 2021-01-05 华为技术有限公司 Method and device for realizing model training and computer storage medium
CN112199189A (en) * 2019-07-08 2021-01-08 国际商业机器公司 Adaptation of deep learning models to resource constrained edge devices
CN110334816A (en) * Industrial equipment detection method, apparatus, device and readable storage medium
CN110597874A (en) * 2019-08-26 2019-12-20 数字广东网络建设有限公司 Data analysis model creation method and device, computer equipment and storage medium
CN110597874B (en) * 2019-08-26 2022-05-27 数字广东网络建设有限公司 Data analysis model creation method and device, computer equipment and storage medium
CN110928528A (en) * 2019-10-23 2020-03-27 深圳市华讯方舟太赫兹科技有限公司 Development method of algorithm model, terminal device and computer storage medium
CN110991649A (en) * 2019-10-28 2020-04-10 中国电子产品可靠性与环境试验研究所((工业和信息化部电子第五研究所)(中国赛宝实验室)) Deep learning model building method, device, equipment and storage medium
CN111079892A (en) * 2019-10-30 2020-04-28 华为技术有限公司 Deep learning model training method, device and system
CN111027707A (en) * 2019-11-22 2020-04-17 北京金山云网络技术有限公司 Model optimization method and device and electronic equipment
CN111027707B (en) * 2019-11-22 2023-08-22 北京金山云网络技术有限公司 Model optimization method and device and electronic equipment
CN111047049B (en) * 2019-12-05 2023-08-11 北京小米移动软件有限公司 Method, device and medium for processing multimedia data based on machine learning model
CN111047049A (en) * 2019-12-05 2020-04-21 北京小米移动软件有限公司 Method, apparatus and medium for processing multimedia data based on machine learning model
CN112949670A (en) * 2019-12-10 2021-06-11 京东数字科技控股有限公司 Data set switching method and device for federal learning model
CN111199287A (en) * 2019-12-16 2020-05-26 北京淇瑀信息科技有限公司 Feature engineering real-time recommendation method and device and electronic equipment
CN111222553A (en) * 2019-12-30 2020-06-02 广州华多网络科技有限公司 Training data processing method and device of machine learning model and computer equipment
CN111222553B (en) * 2019-12-30 2023-08-29 广州方硅信息技术有限公司 Training data processing method and device of machine learning model and computer equipment
CN111191797A (en) * 2020-01-03 2020-05-22 深圳追一科技有限公司 Information processing method, information processing device, electronic equipment and storage medium
CN111242309A (en) * 2020-01-03 2020-06-05 支付宝(杭州)信息技术有限公司 Method and device for forming machine learning application system and electronic equipment
CN111242320A (en) * 2020-01-16 2020-06-05 京东数字科技控股有限公司 Machine learning method and device, electronic equipment and storage medium
CN111291071A (en) * 2020-01-21 2020-06-16 北京字节跳动网络技术有限公司 Data processing method and device and electronic equipment
CN111291071B (en) * 2020-01-21 2023-10-17 北京字节跳动网络技术有限公司 Data processing method and device and electronic equipment
CN111310934A (en) * 2020-02-14 2020-06-19 北京百度网讯科技有限公司 Model generation method and device, electronic equipment and storage medium
CN111310934B (en) * 2020-02-14 2023-10-17 北京百度网讯科技有限公司 Model generation method and device, electronic equipment and storage medium
CN111400443B (en) * 2020-03-04 2023-10-20 北京小米松果电子有限公司 Information processing method, device and storage medium
CN113438663A (en) * 2020-03-04 2021-09-24 诺基亚通信公司 Machine learning based handover parameter optimization
CN111400443A (en) * 2020-03-04 2020-07-10 北京松果电子有限公司 Information processing method, device and storage medium
CN111444021A (en) * 2020-04-02 2020-07-24 电子科技大学 Synchronous training method, server and system based on distributed machine learning
CN111444021B (en) * 2020-04-02 2023-03-24 电子科技大学 Synchronous training method, server and system based on distributed machine learning
CN111488995B (en) * 2020-04-08 2021-12-24 北京字节跳动网络技术有限公司 Method, device and system for evaluating joint training model
CN111488995A (en) * 2020-04-08 2020-08-04 北京字节跳动网络技术有限公司 Method and apparatus for evaluating a joint training model
CN111369011A (en) * 2020-04-16 2020-07-03 光际科技(上海)有限公司 Method and device for applying machine learning model, computer equipment and storage medium
CN111708520A (en) * 2020-06-16 2020-09-25 北京百度网讯科技有限公司 Application construction method and device, electronic equipment and storage medium
CN111708520B (en) * 2020-06-16 2023-08-29 北京百度网讯科技有限公司 Application construction method and device, electronic equipment and storage medium
CN112000389A (en) * 2020-06-24 2020-11-27 苏州浪潮智能科技有限公司 Configuration recommendation method, system, device and computer storage medium
CN112000389B (en) * 2020-06-24 2022-07-08 苏州浪潮智能科技有限公司 Configuration recommendation method, system, device and computer storage medium
CN112101567A (en) * 2020-09-15 2020-12-18 厦门渊亭信息科技有限公司 Automatic modeling method and device based on artificial intelligence
CN112101297A (en) * 2020-10-14 2020-12-18 杭州海康威视数字技术股份有限公司 Training data set determination method, behavior analysis method, device, system and medium
CN112163677A (en) * 2020-10-14 2021-01-01 杭州海康威视数字技术股份有限公司 Method, device and equipment for applying machine learning model
CN112163677B (en) * 2020-10-14 2023-09-19 杭州海康威视数字技术股份有限公司 Method, device and equipment for applying machine learning model
CN112288006A (en) * 2020-10-29 2021-01-29 深圳开立生物医疗科技股份有限公司 Image processing model construction method, device, equipment and readable storage medium
CN112613580A (en) * 2020-12-30 2021-04-06 中国工商银行股份有限公司 Method, device, system and medium for defending machine learning model from attack
CN112613580B (en) * 2020-12-30 2024-02-02 中国工商银行股份有限公司 Method, device, system and medium for defending machine learning model from attack
CN113822322A (en) * 2021-07-15 2021-12-21 腾讯科技(深圳)有限公司 Image processing model training method and text processing model training method
CN113516251A (en) * 2021-08-05 2021-10-19 上海高德威智能交通系统有限公司 Machine learning system and model training method
CN113516251B (en) * 2021-08-05 2023-06-06 上海高德威智能交通系统有限公司 Machine learning system and model training method
CN113704618B (en) * 2021-08-31 2023-07-25 中国平安人寿保险股份有限公司 Data processing method, device, equipment and medium based on deep learning model
CN113704618A (en) * 2021-08-31 2021-11-26 中国平安人寿保险股份有限公司 Data processing method, device, equipment and medium based on deep learning model
WO2023124029A1 (en) * 2021-12-27 2023-07-06 北京百度网讯科技有限公司 Deep learning model training method and apparatus, and content recommendation method and apparatus
WO2023245522A1 (en) * 2022-06-22 2023-12-28 极纳人工智能有限公司 Method and apparatus for generating target deep learning model

Similar Documents

Publication Publication Date Title
CN109325541A (en) Method and apparatus for training pattern
CN109145781B (en) Method and apparatus for processing image
CN108416310B (en) Method and apparatus for generating information
CN108898185A (en) Method and apparatus for generating image recognition model
CN109922032A (en) Method and apparatus for determining the risk of logon account
CN107168952A (en) Information generating method and device based on artificial intelligence
CN108830235A (en) Method and apparatus for generating information
CN109325213A (en) Method and apparatus for labeled data
CN108989882A (en) Method and apparatus for exporting the snatch of music in video
CN108734293A (en) Task management system, method and apparatus
CN109976997A (en) Test method and device
CN109740018A (en) Method and apparatus for generating video tab model
CN109410253B (en) Method, apparatus, electronic device and computer-readable medium for generating information
CN108831505A (en) Method and apparatus for identifying the usage scenario of an application
CN109934242A (en) Image identification method and device
CN109815365A (en) Method and apparatus for handling video
CN110457476A (en) Method and apparatus for generating disaggregated model
CN107958247A (en) Method and apparatus for facial image identification
CN108255706A (en) Editing method, apparatus, terminal device and storage medium for automatic test scripts
CN109947989A (en) Method and apparatus for handling video
CN108121699A (en) Method and apparatus for outputting information
CN110443824A (en) Method and apparatus for generating information
CN108182472A (en) Method and apparatus for generating information
CN109299477A (en) Method and apparatus for generating text header
CN108509921A (en) Method and apparatus for generating information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190212)