CN108229686A - Model training and prediction method, apparatus, electronic device and machine learning platform - Google Patents
- Publication number
- CN108229686A CN108229686A CN201611153818.2A CN201611153818A CN108229686A CN 108229686 A CN108229686 A CN 108229686A CN 201611153818 A CN201611153818 A CN 201611153818A CN 108229686 A CN108229686 A CN 108229686A
- Authority
- CN
- China
- Prior art keywords
- algorithm
- prediction
- file
- processing step
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The application provides model training and prediction methods, apparatus, electronic devices, and a machine learning platform. The model training method includes: loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and loading execution files of the algorithm according to the description file and performing model training on training data, where the execution files of the algorithm include an execution file for each algorithm processing step. At least one embodiment of the application allows users to customize the model training process.
Description
Technical field
The present invention relates to the field of computing, and in particular to model training and prediction methods, apparatus, electronic devices, and a machine learning platform.
Background technology
Machine learning (ML) is a multidisciplinary field that studies how computers can simulate or implement human learning behavior in order to acquire new knowledge or skills and to reorganize existing knowledge structures so as to continuously improve their own performance.
Machine learning generally includes the following steps 101–104:
101. Obtain a training data set and preprocess the data, e.g., normalization and outlier removal;
102. Select a machine learning method, a strategy (how the loss function is defined), and an algorithm. In the machine learning field, an algorithm is the theoretical logic, grounded in mathematics, statistics, information science, and so on, for solving a particular class of problems, and it can be expressed in a computer language;
103. Train a model using the training data set and the algorithm, where the model is the result produced by training the algorithm offline;
104. Use the trained model to make predictions on new data.
Current machine learning platforms have the following drawbacks: they can usually train and predict only with models built into the platform, do not let users develop their own algorithms, cannot support customizable model training and prediction, and therefore cannot meet the needs of diverse business scenarios. Updating an algorithm requires stopping training, so dynamic algorithm updates are difficult. Moreover, an offline-trained model cannot be used directly for online prediction; the user must load the trained model into the prediction system before predicting.
Summary of the invention
One aspect of the application provides a model training method, apparatus, and electronic device that allow users to customize the model training process.
The application adopts the following technical solutions.
A model training method includes:
loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
loading execution files of the algorithm according to the description file and performing model training on training data, where the execution files of the algorithm include an execution file for each algorithm processing step.
The description information may be a directed acyclic graph (DAG) structure expressed in a structured description language, with each node of the DAG corresponding to one algorithm processing step.
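The patent does not fix a concrete file format for this DAG, but a description file of this kind might be sketched as follows. The JSON-like structure, the field names (`steps`, `exec_file`, `depends_on`), and the validation logic are all illustrative assumptions, not part of the disclosure:

```python
# Hypothetical algorithm description file: a DAG of processing steps.
# Field names ("name", "steps", "exec_file", "depends_on") are assumptions.
DESCRIPTION = {
    "name": "demo_algorithm",
    "steps": [
        {"step": "normalize", "exec_file": "normalize.py", "depends_on": []},
        {"step": "extract",   "exec_file": "extract.py",   "depends_on": ["normalize"]},
        {"step": "train_lr",  "exec_file": "train_lr.py",  "depends_on": ["extract"]},
    ],
}

def validate_dag(description):
    """Check that every dependency names a declared step and the graph is acyclic."""
    steps = {s["step"]: set(s["depends_on"]) for s in description["steps"]}
    for name, deps in steps.items():
        unknown = deps - steps.keys()
        if unknown:
            raise ValueError(f"{name} depends on undeclared steps: {unknown}")
    # Detect cycles by repeatedly removing steps whose dependencies are all removed.
    remaining = dict(steps)
    while remaining:
        ready = [n for n, d in remaining.items() if not (d & remaining.keys())]
        if not ready:
            raise ValueError("dependency cycle detected")
        for n in ready:
            del remaining[n]
    return True
```

A platform could run such a check when the description file is published, rejecting cyclic step graphs before any training is scheduled.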
Before loading the description file of the algorithm, the method may further include: receiving the description file and the execution files of the algorithm, storing the description file in a metadata system, and storing the execution files in a file system.
The description file of the algorithm may also contain the name of the algorithm. In that case, loading the description file may include: receiving a model training request that carries the name of the algorithm to be used for training, and loading the description file of the algorithm according to that name.
Loading the execution files of the algorithm according to its description file and performing model training on training data may include: a scheduling engine dispatching executors according to the description file, and each executor, once started, loading the execution file of the algorithm and performing model training on the training data.
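The patent leaves the loading mechanism open. In a Python-based executor it could be done with `importlib`, as in this sketch; the `run(data)` entry-point convention is an assumption, not something the patent specifies:

```python
import importlib.util
import pathlib

def load_execution_file(path):
    """Load one execution file as a Python module at runtime.

    A sketch of what an executor might do after the scheduling engine
    starts it; the patent does not prescribe a mechanism or language.
    """
    spec = importlib.util.spec_from_file_location(pathlib.Path(path).stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

def run_step(path, training_data):
    # Assumed convention (not from the patent): every execution file
    # exposes a run(data) entry point.
    return load_execution_file(path).run(training_data)
```

Because the file is resolved at call time, replacing it on disk between runs would take effect on the next load, which is one way to realize the dynamic algorithm updates discussed later.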
After loading the execution files according to the description file and performing model training on training data, the method may further include: after receiving a model deployment request, loading the execution file of each prediction processing step and the model it uses according to a prediction description file. The prediction description file describes the model used by each prediction processing step of the prediction process; when the prediction process contains multiple prediction processing steps, the prediction description file also describes their execution order.
After loading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method may further include: executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data may include performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data. The input data of the first prediction processing step is the prediction data; the input data of every other prediction processing step is the output data of the preceding step.
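The chaining rule above — the first step consumes the raw prediction data, and every later step consumes its predecessor's output — can be sketched as a simple pipeline. Representing each step as a callable that wraps its execution file and model is an illustrative assumption:

```python
def run_prediction_pipeline(steps, prediction_data):
    """Run ordered prediction processing steps, feeding each step's
    output into the next step, as the prediction description file dictates."""
    data = prediction_data   # input of the first step is the raw prediction data
    for step in steps:       # steps are assumed already ordered per the description file
        data = step(data)    # each step wraps its execution file + model
    return data
```

With `steps = [lambda d: [x + 1 for x in d], sum]`, calling `run_prediction_pipeline(steps, [1, 2])` maps the raw data to `[2, 3]` and then returns `5`.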
A model training method includes: after receiving a user's instruction to train a model, loading the execution files of an algorithm according to the description file of the algorithm uploaded by the user, where the description file describes dependencies among the execution files of the algorithm, and the execution files include one or more of execution files uploaded by the user and execution files saved in advance; and training the model corresponding to the algorithm on training data.
The model training method may further include: after receiving a user's instruction to update the algorithm, performing one or more of the following operations:
replacing the description file of the original algorithm with a description file uploaded by the user, and loading the execution files of the algorithm according to the replacement description file;
replacing the execution files of the original algorithm with execution files uploaded by the user.
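One minimal way to support such updates without stopping training is a registry whose entries can be swapped so that a replacement takes effect on the next lookup. This sketch is a toy illustration; the class and method names are assumptions, not the patent's design:

```python
class AlgorithmRegistry:
    """Toy registry: replacing an entry takes effect on the next lookup,
    so running jobs need not be stopped to update an algorithm."""
    def __init__(self):
        self._descriptions = {}  # algorithm name -> description file content
        self._exec_files = {}    # (algorithm, step) -> execution file content

    def publish(self, name, description, exec_files):
        self._descriptions[name] = description
        for step, content in exec_files.items():
            self._exec_files[(name, step)] = content

    def update_description(self, name, description):
        self._descriptions[name] = description    # replaces the original

    def update_exec_file(self, name, step, content):
        self._exec_files[(name, step)] = content  # replaces the original

    def lookup(self, name):
        desc = self._descriptions[name]
        files = {s: c for (n, s), c in self._exec_files.items() if n == name}
        return desc, files
```

A scheduler that calls `lookup` each time it dispatches an executor would automatically pick up whichever description and execution files are current.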
Loading the execution files of the algorithm according to the description file uploaded by the user may include: a scheduling engine dispatching executors according to the description file, and each executor loading an execution file after it starts.
A model training apparatus includes:
a loading module for loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
a training module for loading the execution files of the algorithm according to the description file and performing model training on training data, where the execution files include an execution file for each algorithm processing step.
The model training apparatus may further include a deployment module for, after a model deployment request is received, loading the execution file of each prediction processing step and the model it uses according to a prediction description file, where the prediction description file describes the model used by each prediction processing step of the prediction process and, when the prediction process contains multiple prediction processing steps, also describes their execution order.
The model training apparatus may further include a prediction module for, after the deployment module has loaded the execution file of each prediction processing step and the model it uses according to the prediction description file, executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data includes performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data, where the input data of the first prediction processing step is the prediction data and the input data of every other prediction processing step is the output data of the preceding step.
A model training apparatus includes:
a scheduling module for, after receiving a user's instruction to train a model, loading the execution files of an algorithm according to the description file of the algorithm uploaded by the user, where the description file describes dependencies among the execution files of the algorithm and the execution files include one or more of execution files uploaded by the user and execution files saved in advance; and
an algorithm running module for training the model corresponding to the algorithm on training data.
An electronic device for model training includes a memory and a processor. The memory stores a program for model training which, when read and executed by the processor, performs the following operations:
loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
loading the execution files of the algorithm according to the description file and performing model training on training data, where the execution files include an execution file for each algorithm processing step.
The program for model training, when read and executed by the processor, may also perform the following operation after loading the execution files according to the description file and performing model training on training data: after receiving a model deployment request, loading the execution file of each prediction processing step and the model it uses according to a prediction description file, where the prediction description file describes the model used by each prediction processing step of the prediction process and, when the prediction process contains multiple prediction processing steps, also describes their execution order.
The program for model training, when read and executed by the processor, may also perform the following operation after loading the execution file of each prediction processing step and the model it uses according to the prediction description file: executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data includes performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data, where the input data of the first prediction processing step is the prediction data and the input data of every other prediction processing step is the output data of the preceding step.
An electronic device for model training includes a memory and a processor. The memory stores a program for model training which, when read and executed by the processor, performs the following operations:
after receiving a user's instruction to train a model, loading the execution files of an algorithm according to the description file of the algorithm uploaded by the user, where the description file describes dependencies among the execution files of the algorithm and the execution files include one or more of execution files uploaded by the user and execution files saved in advance; and
training the model corresponding to the algorithm on training data.
Another aspect of the application provides a model prediction method, apparatus, and electronic device that allow users to customize the model prediction process.
A model prediction method includes:
loading the execution file of each prediction processing step and the model it uses according to a prediction description file, where the prediction description file describes the model used by each prediction processing step of the prediction process and, when the prediction process contains multiple prediction processing steps, also describes their execution order; and
executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data may include performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data, where the input data of the first prediction processing step is the prediction data and the input data of every other prediction processing step is the output data of the preceding step.
The prediction description file may be a directed acyclic graph (DAG) structure expressed in a structured description language, with each prediction processing step serving as one node of the DAG.
Loading the execution file of each prediction processing step and the model it uses according to the prediction description file may include: after a model deployment request is received, a master node scheduling a deployment task to a working node, and the working node loading the execution file of each prediction processing step and the model it uses according to the prediction description file corresponding to the deployment task.
Before loading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method may further include:
loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
loading the execution files of the algorithm according to the description file and performing model training on training data, where the execution files include an execution file for each algorithm processing step.
A model prediction device includes:
a deployment module for loading the execution file of each prediction processing step and the model it uses according to a prediction description file, where the prediction description file describes the model used by each prediction processing step of the prediction process and, when the prediction process contains multiple prediction processing steps, also describes their execution order; and
a prediction module for executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data may include performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data, where the input data of the first prediction processing step is the prediction data and the input data of every other prediction processing step is the output data of the preceding step.
The model prediction device may further include:
a loading module for loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
a training module for loading the execution files of the algorithm according to the description file and performing model training on training data, where the execution files include an execution file for each algorithm processing step.
An electronic device for model prediction includes a memory and a processor. The memory stores a program for model prediction which, when read and executed by the processor, performs the following operations:
loading the execution file of each prediction processing step and the model it uses according to a prediction description file, where the prediction description file describes the model used by each prediction processing step of the prediction process and, when the prediction process contains multiple prediction processing steps, also describes their execution order; and
executing each prediction processing step on the prediction data according to the prediction description file.
Executing each prediction processing step on the prediction data may include performing the following for each step: executing the step's execution file and feeding the step's input data into the model used by the step to obtain the step's output data, where the input data of the first prediction processing step is the prediction data and the input data of every other prediction processing step is the output data of the preceding step.
The program for model prediction, when read and executed by the processor, may also perform the following operations before loading the execution file of each prediction processing step and the model it uses according to the prediction description file:
loading a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm; and
loading the execution files of the algorithm according to the description file and performing model training on training data, where the execution files include an execution file for each algorithm processing step.
Yet another aspect of the application provides a machine learning platform, and a model training method for it, that allow users to publish algorithms.
A machine learning platform includes:
a release module for receiving and saving algorithm files uploaded by users; and
an algorithm processing module for loading the algorithm files uploaded by users and performing model training on training data.
A method for model training on a machine learning platform includes:
receiving and saving an algorithm file uploaded by a user; and
loading the algorithm file uploaded by the user and performing model training on training data.
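Taken together, the publish-then-train flow of such a platform might look like this toy sketch. The class and method names, the `"order"` field, and the split into a metadata store and a file store mirror the description above but are otherwise assumptions:

```python
class ToyMLPlatform:
    """Minimal sketch of the platform's publish/train flow (illustrative only)."""
    def __init__(self):
        self.metadata_store = {}  # algorithm name -> description file
        self.file_store = {}      # algorithm name -> {step: execution callable}

    def publish(self, name, description, exec_files):
        # Release module: receive the user's algorithm files and save them.
        self.metadata_store[name] = description
        self.file_store[name] = exec_files

    def train(self, name, training_data):
        # Algorithm processing module: load the uploaded files and train.
        order = self.metadata_store[name]["order"]      # assumed field: step order
        result = training_data
        for step in order:
            result = self.file_store[name][step](result)  # run each step's execution file
        return result
```

For example, publishing an algorithm whose steps drop missing values and then fit a mean lets `train` produce a trivial "model" (the mean) from raw data.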
The application includes the following advantages:
At least one embodiment of the application can run the execution files of an algorithm according to the algorithm's description file to train a model, so that a user can customize the model training process by constructing and submitting the description file and the execution files of an algorithm.
At least one embodiment of the application can load the execution files of the prediction processing steps and the models they use according to a prediction description file to perform model prediction, so that a user can customize the model prediction process by constructing and submitting the prediction description file and the execution files of the prediction processing steps.
At least one embodiment of the application allows users to publish algorithms, so that users can design the algorithms needed for model training and thereby define their own algorithms.
Of course, a product implementing the application need not achieve all of the above advantages simultaneously.
Description of the drawings
Fig. 1 is a flowchart of the model training method of embodiment one;
Fig. 2 is a schematic diagram of a system architecture using the method of embodiment one;
Fig. 3 is a flowchart of the model prediction method of embodiment two;
Fig. 4 is a flowchart of the model training method of embodiment three;
Fig. 5 is a schematic diagram of the model training apparatus of embodiment four;
Fig. 6 is a schematic diagram of the model prediction device of embodiment five;
Fig. 7 is a schematic diagram of the model training apparatus of embodiment six;
Fig. 8 is a schematic diagram of the model training and prediction system of embodiment ten;
Fig. 9 is a flowchart of the model training process of embodiment ten;
Fig. 10 is a flowchart of the model prediction process of embodiment ten;
Fig. 11 is a schematic diagram of the machine learning platform of embodiment eleven;
Fig. 12 is a flowchart of the method by which the machine learning platform of embodiment twelve performs model training.
Detailed description of the embodiments
The technical solutions of the application are described in detail below with reference to the drawings and embodiments.
It should be noted that, provided they do not conflict, the embodiments of the application and the features within them may be combined with one another, all within the protection scope of the application. In addition, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that given here.
In one configuration, a computing device for model training and prediction may include one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include forms of computer-readable media such as volatile memory, random access memory (RAM), and/or non-volatile memory, e.g., read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium. The memory may include module 1, module 2, ..., module N (N being an integer greater than 2).
Computer-readable media include permanent and non-permanent, removable and non-removable storage media, which can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, and any other non-transmission media that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
Embodiment one: a model training method, as shown in Fig. 1, includes steps S110–S120.
S110: load a description file of an algorithm, the description file including description information that describes dependencies among multiple algorithm processing steps of the algorithm;
S120: load the execution files of the algorithm according to the description file and perform model training on training data, where the execution files include an execution file for each algorithm processing step.
In this embodiment, the description file of an algorithm can be regarded as the algorithm's meta-information, used to describe attributes of the algorithm and of its execution files, such as the algorithm's name and the storage locations of its execution files. An execution file of an algorithm can be regarded as the code that implements the algorithm.
An execution file may be used by only one algorithm, i.e., the execution file and its dependencies on other execution files are described in the description file of only that algorithm. Alternatively, an execution file may be used by multiple algorithms, i.e., the execution file and its dependencies on other execution files are described in the description files of multiple algorithms.
An execution file can thus be viewed as a "part" of a certain type, and the description file of an algorithm as the assembly instructions for various kinds of "parts"; a "part" of one type may be used in one or more algorithms, i.e., an execution file may serve one algorithm or many. Within a single algorithm, a "part" of a given type may be used once or several times: for example, if an algorithm needs to perform normalization repeatedly, the execution file for the normalization operation may be executed multiple times when the algorithm runs.
In the present embodiment, polyalgorithm processing step can be included in an algorithm, each algorithm process step has respectively
Execution file, each algorithm process step can regard an independent task as.
Wherein, algorithm process step can be subalgorithm, i.e., the algorithm process step can also be used as an independence in itself
Algorithm use, according to the algorithm process step can stand-alone training go out model;Algorithm process step can also be only independent
Algorithm in part operation.
Wherein, it when including multiple subalgorithms in some algorithm X, can regard as with multiple independent algorithms or multiple
Independent algorithm adds other operations to constitute algorithm X.
Wherein, subalgorithm can be the algorithm of existing algorithm or user's designed, designed (including user to having
Algorithm be transformed after algorithm).
An algorithm processing step other than a sub-algorithm may be the portion of an existing algorithm corresponding to a certain operation, or may be designed by the user.
The above-mentioned existing algorithms may include one or more of the following: algorithms published by other users, and algorithms included in the system itself.
In the present embodiment, the dependences between algorithm processing steps can be regarded as the execution order that the algorithm processing steps must follow. For example, one dependence may be: algorithm processing step A depends on algorithm processing steps B and C, which indicates that algorithm processing step A can start executing only after algorithm processing steps B and C have both completed.
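A dependence of this kind induces a valid execution order, which can be computed by topological sorting. A minimal sketch using Python's standard library, with the step names taken from the example above (the dict layout is illustrative, not the platform's description-file format):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependences: step A depends on steps B and C, so A may
# start only after both B and C have completed.
dependences = {"A": {"B", "C"}}  # B and C have no prerequisites

order = list(TopologicalSorter(dependences).static_order())
print(order)  # B and C (in either order) come before A
assert order.index("A") > order.index("B")
assert order.index("A") > order.index("C")
```

Any ordering produced this way respects every described dependence, which is exactly what the description file must guarantee.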
In the present embodiment, the execution process of an algorithm is the process of training a model according to training data, and the model trained is the model corresponding to the algorithm. Executing different algorithms can train different models; executing the same algorithm with different training data may also train different models.
In the present embodiment, the execution files of an algorithm can be run according to the description file of the algorithm to train a model, so that the user can customize the model training process by constructing and inputting the description file and/or execution files of an algorithm.
A system architecture using the method of the present embodiment is shown in Fig. 2: the machine learning platform performs the above steps S110~S120. Through a client, the user can input the description file of an algorithm, the execution files of the algorithm, training data, and so on to the machine learning platform. The user can send a request to publish an algorithm through the client, instructing the machine learning platform to receive and save the description file and the execution files of the algorithm; the user can also send a request to train a model through the client, instructing the machine learning platform to train a certain algorithm.
The user may first request to publish an algorithm and then request to train a model; alternatively, the user may input the description file and the execution files of the algorithm to the machine learning platform when requesting to train a model.
The machine learning platform may be deployed on a single server, or it may be distributed and deployed on one or more server clusters.
The user can publish one or more algorithms on the machine learning platform, and can also update a published algorithm (i.e., update the description file and/or execution files of the algorithm).
An algorithm published by a user may include other algorithms on the machine learning platform. For example, the user can combine other algorithms according to his or her own needs into the algorithm to be published; as another example, the user can add or modify certain operations on the basis of another algorithm, or on the basis of a combination of other algorithms, to form the algorithm to be published. The above-mentioned other algorithms may include one or more of the following: algorithms published by the user on the machine learning platform, algorithms published by other users on the machine learning platform, and algorithms originally included in the machine learning platform.
An algorithm published by a user may be set so that only this user can use it, or it may be set to be shared with other users.
The user can specify the algorithm to be used when requesting to train a model; the algorithm used is not limited to algorithms published by the user himself or herself, and may also be an algorithm published by another user.
Training data may be input to the machine learning platform by the user when requesting to train a model, or it may be saved in the machine learning platform in advance with its correspondence to algorithms recorded, i.e., which algorithm's execution the training data is to be used for.
Wherein, it can be, but not limited between client and machine learning platform using remote procedure call (Remote
Procedure Call, RPC) it is interactive, user indicates that machine learning platform is accordingly grasped by way of sending RPC requests
Work and input data, such as description file of input algorithm, the execution file of algorithm, training data etc..
In one implementation, the description information may be a directed acyclic graph (Directed Acyclic Graph, DAG) structure described in a structure description language, and each node in the DAG structure may correspond to one algorithm processing step.
The execution file of an algorithm processing step may be, but is not limited to, a binary file.
In this implementation, the user can represent the execution process of an algorithm by a DAG structure; a DAG structure can represent the execution order of the various processes, and a description of the DAG structure is equivalent to a description of the dependences between the various processes.
In this implementation, the DAG structure may be a tree structure or a chain structure.
In this implementation, using a structure description language ensures that the description is machine-readable.
In this implementation, the user can implement each node in the DAG structure using a software development kit (Software Development Kit, SDK) provided by the machine learning platform, and compile each node into a binary file as the execution file of the corresponding algorithm processing step.
In other implementations, other ways of representing the execution process of an algorithm may also be adopted; for example, the description file of an algorithm and the execution files of the algorithm may be constructed in a form provided by the machine learning platform.
In one implementation, before loading the description file of an algorithm, the method may further include:
receiving the description file of the algorithm and the execution files of the algorithm, storing the description file of the algorithm in a metadata system, and storing the execution files of the algorithm in a file system.
In this implementation, the description file of the algorithm is loaded from the metadata system.
In this implementation, the description file of an algorithm may include the name of the algorithm and the store paths of the algorithm's execution files in the file system. When the user specifies an algorithm to be trained, the description file of the algorithm can be found by the name of the algorithm specified by the user, and the execution files of the algorithm can then be found by the store paths in the description file.
In this implementation, the trained model may also be stored in the file system.
In other implementations, the description file and the execution files of an algorithm may also not be saved in advance; instead, the user inputs them when a model needs to be trained, and the machine learning platform directly loads the description file of the algorithm input by the user and then loads the execution files of the algorithm input by the user according to that description file. In addition, it is not excluded that the description file and the execution files of an algorithm are saved in a single system.
In one implementation, the description file of an algorithm may also include the name of the algorithm;
step S110 can include:
receiving a request to train a model, the request including the name of the algorithm to be used for training the model, and loading the description file of the algorithm according to the name of the algorithm included in the request.
In one implementation, loading the execution files of the algorithm according to the description file of the algorithm and performing model training according to the training data can include:
a scheduling engine dispatching an executor according to the description file of the algorithm;
after the executor starts, the executor loading the execution files of the algorithm and performing model training according to the training data.
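The "scheduling engine - executor" split above can be sketched as follows. Everything here (the class names, the shape of the description file, the callables standing in for binary execution files) is illustrative, not the platform's real API:

```python
# Sketch: the scheduling engine reads the description file and dispatches
# one executor per algorithm processing step; each executor loads and runs
# its step's execution file, threading data through to train a toy "model".
class Executor:
    def __init__(self, step_name, exec_file):
        self.step_name, self.exec_file = step_name, exec_file

    def run(self, training_data):
        # A real executor would load and run a binary execution file;
        # here the "execution file" is a plain Python callable.
        return self.exec_file(training_data)

class SchedulingEngine:
    def train(self, description_file, training_data):
        data = training_data
        for step_name, exec_file in description_file["steps"]:
            data = Executor(step_name, exec_file).run(data)
        return data  # the trained "model"

description_file = {"steps": [
    ("normalize", lambda xs: [x / max(xs) for x in xs]),
    ("fit_mean",  lambda xs: sum(xs) / len(xs)),   # toy "training" step
]}
model = SchedulingEngine().train(description_file, [2.0, 4.0, 8.0])
print(model)  # mean of the normalized data [0.25, 0.5, 1.0]
```

Because each executor loads its execution file only when dispatched, publishing an updated execution file does not disturb executors that are already running, which is the property the next paragraph relies on.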
In this implementation, the executor loads the execution files during training; even if the user publishes an updated version of an algorithm while that algorithm is being executed, the ongoing execution of the algorithm is not affected. The user can therefore publish algorithms efficiently and dynamically.
In this implementation, the scheduling engine can also be used to perform step S110.
In this implementation, the description file of an algorithm may also include parameters such as the degree of parallelism of the execution files and the resources required, and the scheduling engine can dispatch executors according to these parameters. These parameters may also be input by the user when requesting to train a model.
In other implementations, an algorithm need not be executed using the "scheduling engine - executor" architecture; for example, the execution files of all processing steps of an algorithm may be handed to a single module for execution.
In one implementation, after loading the execution files of the algorithm according to the description file of the algorithm and performing model training according to the training data, the method can further include:
after receiving a request to deploy a model, loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process, and, when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
According to the prediction description file, this implementation can automatically deploy a model trained offline directly into an online system. In the system architecture shown in Fig. 2, after model training is completed, the user can click a "deploy model" button on the client to send a request to deploy the model to the machine learning platform; the machine learning platform performs preloading according to the prediction description file, thereby achieving one-click deployment.
In this implementation, the models to be used during prediction can be selected according to the prediction description file; when multiple models are used, the order in which the multiple models are used during prediction can also be specified, so that the user can customize the prediction process by constructing a prediction description file and make the prediction process better suited to the actual scenario. For example, suppose the user judges, according to the actual conditions of a specific business, that multiple models need to be used in combination during prediction, say a preprocessing model, a face recognition model, and a postprocessing model. The user can then describe, in the prediction description file, a prediction process including three processing steps executed in sequence: a preprocessing step, a recognition step, and a postprocessing step, which use the preprocessing model, the face recognition model, and the postprocessing model respectively. The multiple models used in combination during prediction may collectively be referred to as a prediction model or an online model.
In this implementation, the prediction description file and the execution files of the prediction processing steps may be input by the user and saved in the machine learning platform, or may be input to the machine learning platform by the user when requesting to deploy the model. The model used by a prediction processing step may include the model trained in step S120, and may also include other models saved in advance or models input by the user.
In this implementation, the description file of an algorithm can be used directly as the prediction description file, with each algorithm processing step in the algorithm serving as a prediction processing step; the difference is that the input is prediction data rather than training data.
In this implementation, the models used by different prediction processing steps may be the same or different; for example, when a certain model needs to be used repeatedly during one prediction, different prediction processing steps may use the same model.
In this implementation, some prediction processing steps may not use any model, for example steps that only perform mathematical operations or normalization on the data.
In this implementation, after preloading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method can further include:
executing each prediction processing step on the prediction data according to the prediction description file;
wherein executing each prediction processing step on the prediction data can include performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the prediction processing step preceding it.
Taking the above prediction process including three processing steps executed in sequence (a preprocessing step, a recognition step, and a postprocessing step) as an example: the prediction data is first input into the preprocessing model to obtain preprocessed data; the preprocessed data is input into the face recognition model to obtain recognition result data; and the recognition result data is input into the postprocessing model to obtain the prediction result.
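The chaining just described can be sketched as follows. The three "models" are toy stand-ins (a brightness threshold is used in place of an actual face recognizer), so the step bodies are illustrative only; what the sketch shows is the data flow: the first step consumes the prediction data and every later step consumes the previous step's output.

```python
# Sketch of the chained prediction process: preprocess -> recognize -> postprocess.
def preprocess_model(image):        # e.g. scale raw pixel values to [0, 1]
    return [p / 255.0 for p in image]

def face_recognition_model(image):  # toy "recognition": mean brightness test
    return {"face": sum(image) / len(image) > 0.5}

def postprocess_model(result):      # format the final prediction result
    return "face detected" if result["face"] else "no face"

prediction_steps = [preprocess_model, face_recognition_model, postprocess_model]

def predict(prediction_data):
    data = prediction_data
    for step in prediction_steps:   # each step's output feeds the next step
        data = step(data)
    return data

print(predict([200, 180, 220]))  # face detected
```

Reordering or replacing entries in `prediction_steps` corresponds to editing the prediction description file, which is how the user customizes the prediction process.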
In this implementation, prediction data refers to the data input to be predicted by the model. In other implementations, prediction data may also be stored in the machine learning platform in advance and read for use when the model performs prediction.
Embodiment two, a model prediction method, as shown in Fig. 3, including steps S210~S220:
S210, according to a prediction description file, loading the execution file of each prediction processing step and the model it uses; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process, and, when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
S220, executing each prediction processing step on the prediction data according to the prediction description file.
The present embodiment can load the execution file of each prediction processing step and the model it uses according to a prediction description file, so as to perform model prediction, and the user can customize the model prediction process by constructing and inputting the prediction description file and the execution files of the prediction processing steps.
In this implementation, the models to be used during prediction can be selected according to the prediction description file; when multiple models are used, the order in which the multiple models are used during prediction can also be specified, so that the user can customize the prediction process by constructing a prediction description file and make the prediction process better suited to the actual scenario.
In this implementation, prediction data refers to the data, input by the user or stored in advance, to be predicted by the model.
In this implementation, the prediction description file and the execution files of the prediction processing steps may be input by the user and saved in the machine learning platform, or may be input to the machine learning platform by the user when sending a prediction request. The model used by a prediction processing step may be a model trained according to the method of embodiment one, a model saved in advance, or a model input by the user.
In this implementation, the models used by different prediction processing steps may be the same or different; for example, when a certain model needs to be used repeatedly during one prediction, different prediction processing steps may use the same model.
In this implementation, some prediction processing steps may not use any model, for example steps that only perform mathematical operations or normalization on the data.
A system architecture using the method of the present embodiment may also be as shown in Fig. 2, with the machine learning platform performing the above steps S210~S220. Through a client, the user can input the prediction description file, prediction data, and so on to the machine learning platform, and can also instruct the machine learning platform to receive and save the prediction description file, or instruct the machine learning platform to start model prediction.
The model used by a prediction processing step may be a model trained according to the method of embodiment one, a model input by the user, or a model saved in the machine learning platform.
In one implementation, executing each prediction processing step on the prediction data can include performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the prediction processing step preceding it.
In this implementation, the prediction description file may be a DAG structure described in a structure description language, with each prediction processing step corresponding to a node in the DAG structure.
The execution file of a prediction processing step may be, but is not limited to, a binary file.
In this implementation, using a structure description language ensures that the description is machine-readable.
In this implementation, the user can implement each prediction processing step using a software development kit (Software Development Kit, SDK) provided by the machine learning platform, and compile each step into a binary file as the execution file of that prediction processing step.
In one implementation, loading the execution file of each prediction processing step and the model it uses according to the prediction description file can include:
after a request to deploy a model is received, a master node scheduling a deployment task to a working node;
the working node loading, according to the prediction description file corresponding to the deployment task, the execution file of each prediction processing step and the model it uses.
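A minimal sketch of this master/worker deployment follows. The node classes, the fields of the deployment task, and the round-robin scheduling choice are all hypothetical; the point is only the division of labor: the master picks a worker, and the worker does the loading.

```python
# Sketch: master node schedules a deployment task; the chosen working node
# loads each prediction step's execution file and model per the task's
# prediction description file.
class WorkingNode:
    def __init__(self):
        self.loaded = {}

    def deploy(self, task):
        for step in task["prediction_description"]["steps"]:
            self.loaded[step["name"]] = (step["exec_file"], step["model"])

class MasterNode:
    def __init__(self, workers):
        self.workers, self.next = workers, 0

    def schedule(self, task):
        worker = self.workers[self.next % len(self.workers)]  # round-robin pick
        self.next += 1
        worker.deploy(task)
        return worker

task = {"prediction_description": {"steps": [
    {"name": "preprocess", "exec_file": "pre.bin", "model": "pre_model"},
    {"name": "recognize",  "exec_file": "rec.bin", "model": "face_model"},
]}}
worker = MasterNode([WorkingNode(), WorkingNode()]).schedule(task)
print(sorted(worker.loaded))  # ['preprocess', 'recognize']
```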
In one implementation, before loading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method further includes:
loading the description file of an algorithm, the description file of the algorithm including description information, the description information being used to describe the dependences between multiple algorithm processing steps;
loading the execution files of the algorithm according to the description file of the algorithm and performing model training according to training data; wherein the execution files of the algorithm include the execution file of each algorithm processing step in the algorithm.
In this implementation, a model is first trained before prediction; the process of training the model corresponds to steps S110~S120 of embodiment one, and other details of training the model can be found in embodiment one.
Embodiment three, a model training method, as shown in Fig. 4, including steps S310~S320:
S310, after receiving a user's instruction to train a model, loading the execution files of an algorithm according to the description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependences between the execution files of the algorithm, and the execution files of the algorithm include one or more of the following: execution files uploaded by the user, and execution files saved in advance;
S320, training the model corresponding to the algorithm according to training data.
In the present embodiment, the user is allowed to customize algorithms: the user can customize the execution process of an algorithm by uploading the description file of the algorithm, thereby customizing the algorithm. In addition, the user can also customize all or part of an algorithm's execution code by uploading all or part of the algorithm's execution files.
In the present embodiment, the user may upload the description file of an algorithm (or the description file and the execution files) in advance and then send the instruction to train a model; the user may also first send the instruction to train a model and then upload the files. The description file of the algorithm (or the description file and the execution files) may also be carried in the instruction to train a model, or the action of uploading the description file (or the description file and the execution files) of a new algorithm may itself serve as the instruction to train a model.
The method of the present embodiment can be performed by the machine learning platform in the architecture shown in Fig. 2.
User-defined algorithms can include the following cases:
(1) The user uploads the description file of the algorithm and all of the algorithm's execution files; this case is equivalent to the entire algorithm being user-defined.
(2) The user uploads the description file of the algorithm and part of the algorithm's execution files; this case is equivalent to the user combining code constructed by himself or herself with existing algorithms on the machine learning platform into a new algorithm in a customized manner.
(3) The user uploads only the description file of the algorithm, and the execution files of the algorithm are the execution files of existing algorithms on the machine learning platform; this case is equivalent to the user recombining existing algorithms on the machine learning platform into a new algorithm in a customized manner.
In one implementation, the method can also include: after receiving a user's instruction to update an algorithm, performing one or more of the following operations:
replacing the description file of the original algorithm with the description file of the algorithm uploaded by the user, and loading the execution files of the algorithm according to the updated description file;
replacing the execution files of the original algorithm with the execution files of the algorithm uploaded by the user.
In this implementation, the user may upload the updated description file and/or updated execution files of an algorithm in advance and then send the instruction to update the algorithm; the user may also first send the instruction to update the algorithm and then upload the files. The updated description file and/or updated execution files may also be carried in the instruction to update the algorithm, or the action of uploading the updated description file and/or updated execution files of an algorithm may itself serve as the instruction to update the algorithm.
In this implementation, when the description file (or an execution file) uploaded by the user belongs to an existing algorithm, that file is judged to be the updated description file (or execution file) of the algorithm.
In this implementation, when a user updates an algorithm, three update modes are possible:
(1) Uploading the updated description file of the algorithm; this is equivalent to updating only the dependences between the execution files of the algorithm without updating the algorithm's code.
(2) Uploading the updated execution files of the algorithm; this is equivalent to updating only one or some of the execution files themselves without updating the dependences between the execution files.
(3) Uploading the updated description file and the updated execution files of the algorithm; this is equivalent to updating both the dependences between the execution files and one or some of the execution files themselves. This mode can be applied to the case of adding one or more steps to an algorithm, and is also applicable to cases where the dependences between execution files need to be adjusted and the code in one or some execution files needs to be modified.
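The three update modes above can be sketched in a few lines. The storage layout and function signature are illustrative only; the logic is simply that whichever pieces the user uploads replace the originals, and untouched pieces are kept:

```python
# Sketch of the three update modes: replace the description file (mode 1),
# the execution files (mode 2), or both (mode 3).
def update_algorithm(store, name, description=None, exec_files=None):
    algo = store[name]
    if description is not None:        # mode 1, or part of mode 3
        algo["description"] = description
    if exec_files is not None:         # mode 2, or part of mode 3
        algo["exec_files"].update(exec_files)
    return algo

store = {"algo_x": {"description": "v1", "exec_files": {"step1": b"old"}}}
update_algorithm(store, "algo_x", description="v2")                 # mode 1
update_algorithm(store, "algo_x", exec_files={"step1": b"new"})     # mode 2
print(store["algo_x"]["description"], store["algo_x"]["exec_files"]["step1"])
# v2 b'new'
```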
In one implementation, loading the execution files of the algorithm according to the description file of the algorithm uploaded by the user can include:
a scheduling engine dispatching an executor according to the description file of the algorithm;
after the executor starts, the executor loading the execution files of the algorithm.
In the present embodiment, implementation details of the description file of an algorithm, the execution files of an algorithm, the detailed training process, and the deployment process after training can be found in embodiment one.
Embodiment four, a model training apparatus, as shown in Fig. 5, including:
a loading module 41, used for loading the description file of an algorithm, the description file of the algorithm including description information, the description information being used to describe the dependences between multiple algorithm processing steps;
a training module 42, used for loading the execution files of the algorithm according to the description file of the algorithm and performing model training according to training data; wherein the execution files of the algorithm include the execution file of each algorithm processing step in the algorithm.
In the present embodiment, the loading module 41 is the part of the above model training apparatus responsible for loading the description file of an algorithm, and can be software, hardware, or a combination of both.
In the present embodiment, the training module 42 is the part of the above model training apparatus responsible for performing model training, and can be software, hardware, or a combination of both.
In the present embodiment, the loading module 41 may be, but is not limited to, the scheduling engine in one implementation of embodiment one; the training module 42 may be, but is not limited to, the scheduling engine and the executor in that implementation, wherein the scheduling engine dispatches the executor according to the description file of the algorithm, and the dispatched executor loads the execution files of the algorithm and performs model training.
In one implementation, the description information may be a directed acyclic graph (DAG) structure described in a structure description language, and each node in the DAG structure may correspond to one algorithm processing step.
In one implementation, the model training apparatus can also include:
a receiving module, used for receiving the description file of the algorithm and the execution files of the algorithm, storing the description file of the algorithm in a metadata system, and storing the execution files of the algorithm in a file system.
In one implementation, the model training apparatus can also include:
a deployment module, used for, after a request to deploy a model is received, loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process, and, when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
In this implementation, the model training apparatus can also include:
a prediction module, used for, after the deployment module loads the execution file of each prediction processing step and the model it uses according to the prediction description file, executing each prediction processing step on the prediction data according to the prediction description file;
wherein executing each prediction processing step on the prediction data can include performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the prediction processing step preceding it.
The operations of the modules of the model training apparatus of the present embodiment correspond to steps S110~S120 in embodiment one, and other implementation details of the module operations can be found in embodiment one.
Embodiment five, a model prediction apparatus, as shown in Fig. 6, including:
a deployment module 51, used for loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process, and, when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
a prediction module 52, used for executing each prediction processing step on the prediction data according to the prediction description file.
In the present embodiment, the deployment module 51 is the part of the above model prediction apparatus responsible for loading, and can be software, hardware, or a combination of both.
In the present embodiment, the prediction module 52 is the part of the above model prediction apparatus responsible for performing prediction, and can be software, hardware, or a combination of both.
The deployment module 51 and the prediction module 52 of the present embodiment can respectively serve as the deployment module and the prediction module in one implementation of embodiment four, and other details of the present embodiment are equally applicable to that implementation of embodiment four.
In one implementation, executing each prediction processing step on the prediction data can include:
performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the prediction processing step preceding it.
In one implementation, the prediction description file is a directed acyclic graph (DAG) structure described in a structure description language, with each prediction processing step corresponding to a node in the DAG structure.
In one implementation, the model prediction apparatus may further include:
a loading module, configured to load the description file of an algorithm; the description file of the algorithm includes description information used to describe the dependencies among multiple algorithm processing steps;
a training module, configured to load the execution file of the algorithm according to the description file of the algorithm and perform model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step of the algorithm.
The loading module and the training module in this implementation may be realized by the loading module and the training module of embodiment four, respectively; other details can be found in embodiment four.
The operations of the modules of the model prediction apparatus of the present embodiment correspond to steps S210~S220 in embodiment two; other implementation details of the module operations can be found in embodiment two.
Embodiment six: a model training apparatus, as shown in FIG. 7, including:
a scheduler module 61, configured to, after receiving a user's instruction to train a model, load the execution file of the algorithm according to the description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependencies among the execution files of the algorithm, and the execution file of the algorithm includes one or more of: an execution file uploaded by the user and a pre-saved execution file;
an algorithm running module 62, configured to train the model corresponding to the algorithm according to training data.
In the present embodiment, the scheduler module 61 is the part of the above model training apparatus responsible for loading execution files, and may be software, hardware, or a combination of both.
In the present embodiment, the algorithm running module 62 is the part of the above model training apparatus responsible for training the model, and may be software, hardware, or a combination of both.
In one implementation, the apparatus may further include:
an update module, configured to, after receiving a user's instruction to update the algorithm, perform one or more of the following operations:
replacing the description file of the original algorithm with the description file of the algorithm uploaded by the user, and loading the execution file of the algorithm according to the replaced description file;
replacing the execution file of the original algorithm with the execution file of the algorithm uploaded by the user.
The operations of the modules of the model training apparatus of the present embodiment correspond to steps S310~S320 in embodiment three; other implementation details of the module operations can be found in embodiment three.
Embodiment seven: an electronic device for model training, including a memory and a processor;
the memory is used to store a program for model training; when the program for model training is read and executed by the processor, the following operations are performed:
loading the description file of an algorithm; the description file of the algorithm includes description information used to describe the dependencies among multiple algorithm processing steps;
loading the execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step of the algorithm.
In one implementation, the description information may be a directed acyclic graph (DAG) structure described in a structure description language, with each node in the DAG structure corresponding to one algorithm processing step.
In one implementation, when the program for model training is read and executed by the processor, the following operation may also be performed before the description file of the algorithm is loaded:
receiving the description file of the algorithm and the execution file of the algorithm, storing the description file of the algorithm in a metadata system, and storing the execution file of the algorithm in a file system.
In one implementation, when the program for model training is read and executed by the processor, the following operation may also be performed after the execution file of the algorithm is loaded according to its description file and model training is carried out according to training data:
after receiving a request to deploy a model, loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
In one implementation, when the program for model training is read and executed by the processor, the following operation may also be performed after the execution file of each prediction processing step and the model used are loaded according to the prediction description file:
performing each prediction processing step on the prediction data according to the prediction description file;
wherein performing each prediction processing step on the prediction data may include: performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, inputting the input data of the prediction processing step into the model used by the prediction processing step, and obtaining the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of every other prediction processing step is the output data of the preceding prediction processing step.
In the present embodiment, the operations performed when the program for model training is read and executed by the processor correspond to steps S110~S120 in embodiment one; other details of the operations performed by the program can be found in embodiment one.
Embodiment eight: an electronic device for model prediction, including a memory and a processor;
the memory is used to store a program for model prediction; when the program for model prediction is read and executed by the processor, the following operations are performed:
loading the execution file of each prediction processing step and the model used according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
performing each prediction processing step on prediction data according to the prediction description file.
In one implementation, performing each prediction processing step on the prediction data may include:
performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, inputting the input data of the prediction processing step into the model used by the prediction processing step, and obtaining the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of every other prediction processing step is the output data of the preceding prediction processing step.
In one implementation, the prediction description file may be a directed acyclic graph (DAG) structure described in a structure description language, with each prediction processing step serving as a node in the DAG structure.
In one implementation, when the program for model prediction is read and executed by the processor, the following operations may also be performed before the execution file of each prediction processing step and the model used are loaded according to the prediction description file:
loading the description file of an algorithm; the description file of the algorithm includes description information used to describe the dependencies among multiple algorithm processing steps;
loading the execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step of the algorithm.
In the present embodiment, the operations performed when the program for model prediction is read and executed by the processor correspond to steps S210~S220 in embodiment two; other details of the operations performed by the program can be found in embodiment two.
Embodiment nine: an electronic device for model training, including a memory and a processor;
the memory is used to store a program for model training; when the program for model training is read and executed by the processor, the following operations are performed:
after receiving a user's instruction to train a model, loading the execution file of the algorithm according to the description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependencies among the execution files of the algorithm, and the execution file of the algorithm includes one or more of: an execution file uploaded by the user and a pre-saved execution file;
training the model corresponding to the algorithm according to training data.
In one implementation, when the program for model training is read and executed by the processor, the following operations are also performed:
after receiving a user's instruction to update the algorithm, performing one or more of the following operations:
replacing the description file of the original algorithm with the description file of the algorithm uploaded by the user, and loading the execution file of the algorithm according to the replaced description file;
replacing the execution file of the original algorithm with the execution file of the algorithm uploaded by the user.
In the present embodiment, the operations performed when the program for model training is read and executed by the processor correspond to steps S310~S320 in embodiment three; other details of the operations performed by the program can be found in embodiment three.
Embodiment ten: a model training and prediction system, as shown in FIG. 8, divided into an off-line training part and an on-line prediction (Prediction) part. The off-line training part can implement steps S110~S120 in embodiment one; the on-line prediction part can implement steps S210~S220 in embodiment two.
In the present embodiment, the server (Server), as the system entry, receives three types of RPC requests: releasing an algorithm (Release), training a model (Train), and deploying a model (Deploy). Releasing an algorithm and training a model belong to the off-line training part, while deploying a model connects the off-line training part with the on-line prediction part, i.e., the model obtained by the off-line training part is deployed to the on-line prediction part for prediction. The front-end server (Frontend), as the on-line prediction entry, receives the user's prediction requests and prediction data and completes the prediction.
In the present embodiment, in the off-line training part, the server receives the user's RPC request and determines how to process it according to the type of the RPC request.
In the present embodiment, releasing an algorithm can be regarded as the process of uploading and saving an algorithm; the main control logic is placed at the server, i.e., an RPC request of the release type is handled by the server itself. The server stores the meta-information of the algorithm (its description file) in the metadata system (Meta System) and the execution file of the algorithm in the file system (File System), which may be, but is not limited to, a distributed file system.
When releasing an algorithm, both the description file and the execution file of the algorithm may be uploaded, or only the description file may be uploaded, in which case the algorithm uses the execution file of an algorithm already in the system.
If an execution file is uploaded when releasing an algorithm, the uploaded file may be the whole execution file of the algorithm or only part of it; the remaining execution files of the algorithm use those of algorithms already in the system.
In the present embodiment, training a model can be regarded as the process of running an algorithm with the training data as its input so as to train a model. The execution units include the scheduling engine (Scheduler Engine) of the algorithm and the executors (Executor) of the algorithm, which train the model by accessing the metadata system and the distributed file system to load the description file and the execution file of the algorithm to be executed. An RPC request of the training type is handed by the server to the scheduling engine. The scheduling engine receives the training request dispatched by the server, checks the parameters, plans the resources in a unified way, and then schedules the task onto the executors. After an executor receives a task assigned by the scheduling engine, it first loads the execution file of the algorithm needed for training from the distributed file system and then starts executing the main logic of the algorithm.
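A hedged sketch of the executor's load-then-run behavior described above: an execution file is fetched (here from a local path standing in for the distributed file system) and loaded dynamically before its main logic runs. The use of `importlib` and the function names are assumptions for illustration; real execution files may be compiled binaries rather than Python modules.

```python
# Sketch: dynamically load an algorithm's execution file, then invoke its
# main logic. Path handling and module naming are illustrative assumptions.
import importlib.util

def load_execution_file(path, module_name="algorithm_main"):
    """Load a Python execution file from `path` and return the module."""
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```

An executor would then call something like `load_execution_file(path).train(training_data)` and store the resulting model to the file system.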
In the present embodiment, deploying a model can be regarded as the process of applying a model from the off-line training part to the on-line prediction part; the main control logic is placed at the deployment engine (Deploy Engine). An RPC request of the deployment type is handed by the server to the deployment engine, which may deploy an off-line trained model or a custom model to the online service. Multiple copies of the online service can be deployed on different clusters, each cluster including one master node (Master) and multiple worker nodes (Worker).
In the present embodiment, the metadata system stores the meta-information of all algorithms released on the model training and prediction system; each algorithm is identified by a unique name, specified by the algorithm developer when releasing the algorithm. The file system stores the execution files of the algorithms, which may be uploaded by the developers when releasing the algorithms.
In the present embodiment, an algorithm includes the description file of the algorithm and the execution file of the algorithm. The description file amounts to the meta-information of the algorithm, defining the algorithm name, the algorithm parameters, the degree of parallelism, the required resources, and so on. The execution file may be the main logic of the algorithm developed by the developer against the interface provided by the platform, and is loaded by the execution apparatus at the execution stage.
The processing of the different types of RPC requests in the present embodiment is described below.
After the server receives an RPC request of the release type, it first parses the description file of the algorithm and checks the legality of the algorithm name and the parameter definitions, then stores the execution file of the algorithm to the distributed file system and saves the storage path of the execution file in the distributed file system to the metadata system.
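The release-type handling just described can be sketched as follows; the dict-backed stand-ins for the metadata system and file system, the field names, and the storage-path layout are all assumptions for illustration, not the disclosed implementation.

```python
# Sketch of handling a release-type request: check the legality of the
# algorithm name and parameter definitions, store the execution file, and
# record its storage path in the metadata system. All names are assumed.
import re

META_SYSTEM = {}   # algorithm name -> meta-information (incl. storage path)
FILE_SYSTEM = {}   # storage path -> execution-file bytes

def release_algorithm(description, execution_file):
    name = description.get("name", "")
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name):
        raise ValueError("illegal algorithm name")
    for param in description.get("params", []):
        if "key" not in param:
            raise ValueError("illegal parameter definition")
    path = "/algorithms/%s/exec.bin" % name  # assumed path layout
    FILE_SYSTEM[path] = execution_file
    META_SYSTEM[name] = dict(description, exec_path=path)
    return path
```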
After the server receives an RPC request of the training type, it forwards the request to the scheduling engine. The scheduling engine generates a unique identifier (ID) for each training process, queries the metadata system by the name of the algorithm to be executed carried in the request, and loads the description file of the algorithm. Together with the parameters carried by the user in the request (e.g., on how many machines to execute), it performs legality checks (e.g., whether the user's permissions meet the requirements and whether the execution file of the algorithm exists), calculates the resources and degree of parallelism required by the executors according to the parameters carried in the request, and schedules the algorithm onto suitable executors, i.e., distributes the task to the executors. After an executor receives the task, it loads the corresponding execution file of the algorithm and stores the output model to the file system. The user can query the training status through the unique ID of the training process.
Releasing an algorithm can be regarded as a synchronous process, i.e., one that executes quickly, while training a model can be regarded as an asynchronous process, i.e., one that needs a longer execution time and whose progress is generally queried by polling.
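Polling the progress of an asynchronous training process by its unique ID might look like the following sketch; the status values and the in-memory status table are assumptions, not the system's actual interface.

```python
# Sketch of polling an asynchronous training process by its unique ID.
# The status sequence below simulates a training run progressing over time.
import time

TRAIN_STATUS = {"train-001": iter(["PENDING", "RUNNING", "SUCCEEDED"])}

def query_status(train_id):
    """Stand-in for asking the scheduling engine about a training process."""
    return next(TRAIN_STATUS[train_id])

def poll_until_done(train_id, interval=0.0):
    while True:
        status = query_status(train_id)
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(interval)  # wait before polling again
```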
In the model training and prediction system of the present embodiment, an algorithm exists as a plug-in rather than as a native part of the system, and the model trained by executing the algorithm can be regarded as an object. When a new algorithm is released or an old one is updated, no change to the model training and prediction system is needed: a single RPC request of the release type completes it. When updating an algorithm, if a user is currently executing it, the description file and execution file of the algorithm have already been loaded into the memory of the executor, so even if the original description file and/or execution file is deleted during the update, the normal execution of the algorithm is not affected; the next time the user executes the algorithm, the updated version is used. Taking an algorithm offline can likewise be achieved by an RPC request. This achieves the purpose of dynamic algorithm release.
In the present embodiment, in the on-line prediction part, the prediction process of an online model can be described as a DAG structure. Users can build a custom DAG structure according to their own business scenario and determine the dependencies within it. Each node of the DAG structure can correspond to one prediction processing step of the prediction process; each prediction processing step may be called a prediction processing unit (processor unit), including the general processing units built into the model training and prediction system (builtin processor unit) and user-defined processing units (user defined processor unit).
After a user initiates an RPC request to deploy a model, the server forwards the request to the deployment engine, which in turn forwards it to the master node of the on-line prediction part. The master node schedules the task to a suitable worker node according to the information in the prediction description file about the models used by the prediction processing steps and according to the load of the model training and prediction system. The worker node loads, one by one, each prediction processing unit in the DAG structure according to the prediction description file submitted by the user describing the prediction process flow (i.e., loads the execution file of each prediction processing step). In this example, the flexibility with which the model training and prediction system deploys models to the on-line prediction part lies in that a model obtained by off-line training, or a user-defined model (which may be an existing model), can be released as an online model (i.e., the model is deployed) so as to perform prediction with the model; models obtained by off-line training and user-defined models can also be freely combined and released as an online model, achieving a high degree of customization.
After a user initiates a prediction request, the front-end server looks up the worker node corresponding to the model requested for prediction in the prediction request, then forwards the request directly to that worker node, which processes the user's input request according to the DAG prediction processing logic describing the prediction process of the online model and finally returns the result to the user.
In this example, after receiving an RPC request of the training type, the model training and prediction system supports user-customized training processes and dynamic release according to the following flow, as shown in FIG. 9, including steps 401~403:
401. The server receives the structured DAG description file provided by the user and the execution file of each node of the DAG structure; the structured DAG description file is the user's DAG flow of algorithm execution described in a structure description language (e.g., XML); each node of the DAG flow can be regarded as one distributed execution task; the execution file of each node of the DAG flow is implemented by the user based on the SDK provided by the model training and prediction system and may be compiled into a binary file.
402. The scheduling engine schedules the executors to perform the distributed task corresponding to each DAG node according to the structured DAG description file.
403. After an executor starts, it dynamically loads the execution file of each node of the DAG flow received in step 401 and performs model training.
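A structured DAG description file of the kind received in step 401 might be parsed as in the sketch below; the XML element and attribute names are assumptions, since the disclosure only states that a structure description language such as XML is used.

```python
# Sketch: parse an assumed XML form of a structured DAG description file
# into {node id: (execution file, [dependency ids])}.
import xml.etree.ElementTree as ET

DAG_XML = """
<dag>
  <node id="read_data" exec="read_data.bin"/>
  <node id="train_model" exec="train_model.bin"><dep>read_data</dep></node>
</dag>
"""

def parse_dag(xml_text):
    root = ET.fromstring(xml_text)
    return {
        node.get("id"): (node.get("exec"),
                         [dep.text for dep in node.findall("dep")])
        for node in root.findall("node")
    }
```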
In this example, the model training and prediction system supports user-customized prediction processes according to the following flow. Suppose m off-line models have been obtained through off-line training; as shown in FIG. 10, the model training and prediction system performs steps 501~503:
501. The model training and prediction system receives the prediction description file provided by the user and n custom prediction processing units; the prediction description file includes a DAG structure, which describes, in a structure description language, the connection relations among the m off-line models and the n prediction processing units; the n custom prediction processing units are implemented by the user based on the SDK provided by the model training and prediction system and may be compiled into binary files.
502. The model training and prediction system pre-loads the m off-line models and the n prediction processing units according to the DAG structure.
503. After receiving the user's prediction request, the model training and prediction system passes the data flow, in the order given by the DAG structure, through the m models and the n prediction processing units for processing.
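Steps 501~503 amount to executing the DAG in topological order, passing each node its predecessors' outputs. The sketch below (Python 3.9+, for `graphlib`) uses toy callables in place of the m off-line models and n prediction processing units; the graph contents are illustrative assumptions.

```python
# Sketch: run a prediction DAG in topological order. `deps` maps each node
# to its predecessor nodes; `units` maps each node to a callable standing in
# for an off-line model or a prediction processing unit.
from graphlib import TopologicalSorter

def run_dag(deps, units, prediction_data):
    outputs = {}
    for node in TopologicalSorter(deps).static_order():
        # Source nodes receive the raw prediction data.
        inputs = [outputs[p] for p in deps.get(node, ())] or [prediction_data]
        outputs[node] = units[node](*inputs)
    return outputs

# Toy DAG: two "models" feeding one "processing unit" that combines them.
deps = {"model_a": set(), "model_b": set(), "combine": {"model_a", "model_b"}}
units = {
    "model_a": lambda d: d + 1,
    "model_b": lambda d: d * 2,
    "combine": lambda *outs: sum(outs),
}
```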
Embodiment eleven: a machine learning platform, as shown in FIG. 11, including:
a release module 71, configured to receive and save the algorithm files uploaded by users;
an algorithm processing module 72, configured to load the algorithm files uploaded by users and perform model training according to training data.
In the present embodiment, the release module 71 is the part of the platform responsible for receiving and saving algorithm files, and may be software, hardware, or a combination of both.
In the present embodiment, the algorithm processing module 72 is the part of the platform responsible for performing model training, and may be software, hardware, or a combination of both.
In the present embodiment, an algorithm file may include the description file of an algorithm and the execution file of an algorithm. The description file of an algorithm can be regarded as the meta-information of the algorithm and may describe the attributes of the algorithm and of its execution file, such as the name of the algorithm and the storage location of its execution file. The execution file of an algorithm can be regarded as the code used to implement the algorithm.
In one implementation, the algorithm file uploaded by the user may include only the description file of the algorithm; the algorithm processing module may first load the description file of the algorithm uploaded by the user, then load the execution file of an algorithm already existing on the machine learning platform according to that description file, and perform model training according to training data.
In one implementation, the algorithm file uploaded by the user may include only one or more execution files of the algorithm; the corresponding description file of the algorithm includes the dependencies among the execution files uploaded by the user, and/or the dependencies between the execution files uploaded by the user and the execution files of algorithms already existing on the machine learning platform; the algorithm processing module may, according to the corresponding description file of the algorithm, load the execution files of the algorithm uploaded by the user (or load the execution files uploaded by the user together with the execution files of algorithms already existing on the machine learning platform) and perform model training according to training data.
In one implementation, the algorithm file uploaded by the user may include both the description file of the algorithm and one or more execution files of the algorithm; the algorithm processing module may first load the description file of the algorithm uploaded by the user and then, according to it, load the execution files of the algorithm uploaded by the user (or load the execution files uploaded by the user together with the execution files of algorithms already existing on the machine learning platform), and perform model training according to training data.
The present embodiment allows a user to release algorithms, so that the user can design the algorithm needed for model training as required, thereby realizing user customization of algorithms.
The machine learning platform in the present embodiment may be used as the machine learning platform in FIG. 2; the specific operation of the algorithm processing module may use steps S110~S120 of embodiment one, and other operation details can be found in embodiment one.
Embodiment twelve: a method for model training by a machine learning platform, as shown in FIG. 12, including steps S710~S720:
S710. Receiving the algorithm file uploaded by a user and saving it;
S720. Loading the algorithm file uploaded by the user, and performing model training according to training data.
The present embodiment allows a user to release algorithms, so that the user can design the algorithm needed for model training as required, thereby realizing user customization of algorithms.
Step S710 of the present embodiment may be implemented by the release module of embodiment eleven, and step S720 by the algorithm processing module of embodiment eleven.
The specific implementation of step S720 of the present embodiment may use steps S110~S120 of embodiment one; other operation details can be found in embodiment one.
One of ordinary skill in the art will appreciate that all or part of the steps of the above methods may be completed by a program instructing the related hardware, and the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc. Optionally, all or part of the steps of the above embodiments may also be realized using one or more integrated circuits. Correspondingly, each module/unit in the above embodiments may be realized in the form of hardware, or may be realized in the form of a software functional module. The application is not limited to any particular combination of hardware and software.
Certainly, the application may also have various other embodiments; without departing from the spirit and essence of the application, those skilled in the art can make various corresponding changes and deformations according to the application, and all such corresponding changes and deformations shall belong to the protection scope of the claims of the application.
Claims (31)
1. A model training method, comprising:
loading a description file of an algorithm, the description file of the algorithm comprising description information, the description information being used to describe the dependencies among a plurality of algorithm processing steps;
loading an execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm comprises an execution file of each algorithm processing step of the algorithm.
2. The model training method according to claim 1, wherein:
the description information is a directed acyclic graph (DAG) structure described in a structure description language, and each node in the DAG structure corresponds to one algorithm processing step.
3. The model training method according to claim 1, wherein before the loading of the description file of the algorithm, the method further comprises:
receiving the description file of the algorithm and the execution file of the algorithm, storing the description file of the algorithm in a metadata system, and storing the execution file of the algorithm in a file system.
4. The model training method according to claim 1, wherein the description file of the algorithm further comprises a name of the algorithm;
the loading of the description file of the algorithm comprises:
receiving a request for training a model, the request for training a model comprising the name of the algorithm to be used for training the model, and loading the description file of the algorithm according to the name of the algorithm comprised in the request for training a model.
5. The model training method according to claim 1, wherein the loading of the execution file of the algorithm according to the description file of the algorithm and the performing of model training according to training data comprise:
a scheduling engine scheduling executors according to the description file of the algorithm;
after an executor starts, loading the execution file of the algorithm and performing model training according to the training data.
6. The model training method according to claim 1, wherein after the loading of the execution file of the algorithm according to the description file of the algorithm and the performing of model training according to training data, the method further comprises:
after receiving a request for deploying a model, loading an execution file of each prediction processing step and a model used according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step comprised in a prediction process; when the prediction process comprises multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
7. The model training method according to claim 6, characterized in that after the loading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method further comprises:
performing each prediction processing step on prediction data according to the prediction description file;
wherein performing each prediction processing step on the prediction data comprises performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
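The chaining in claims 7 and 9 (and their apparatus and device counterparts) amounts to folding the prediction data through the ordered steps: the first step consumes the raw prediction data, and every later step consumes its predecessor's output. A minimal sketch, with illustrative step functions standing in for the loaded execution files and models:

```python
def run_prediction(steps, prediction_data):
    """Run each prediction processing step in order: the first step's
    input is the raw prediction data; every later step's input is the
    previous step's output."""
    data = prediction_data
    for step in steps:
        data = step(data)  # execute the step's file + model on its input
    return data

# Illustrative steps: feature extraction, then a scoring "model".
extract = lambda text: [len(w) for w in text.split()]
score = lambda feats: sum(feats) / len(feats)

result = run_prediction([extract, score], "model training and prediction")
print(result)  # 6.5
```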
8. A model prediction method, comprising:
loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
performing each prediction processing step on prediction data according to the prediction description file.
9. The model prediction method according to claim 8, characterized in that the performing each prediction processing step on prediction data comprises:
performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
10. The model prediction method according to claim 8, characterized in that the prediction description file is a directed acyclic graph (DAG) structure described in a structured description language, each prediction processing step being a node in the DAG structure.
11. The model prediction method according to claim 8, characterized in that the loading the execution file of each prediction processing step and the model it uses according to the prediction description file comprises:
after receiving a request for deploying a model, a master node scheduling a deployment task to a working node;
the working node loading the execution file of each prediction processing step and the model it uses according to the prediction description file corresponding to the deployment task.
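The master/worker split of claim 11 can be sketched as below. The classes, the least-loaded placement policy, and the task layout are illustrative assumptions; the patent only requires that the master schedules the deployment task and the worker does the loading.

```python
class WorkerNode:
    """Hypothetical worker: loads each step's execution file and model
    from the prediction description file attached to the deployment task."""
    def __init__(self):
        self.loaded = []

    def deploy(self, task):
        for step in task["prediction_desc"]["steps"]:
            self.loaded.append((step["exec"], step["model"]))
        return len(self.loaded)

class MasterNode:
    """Hypothetical master: on a deploy request, schedules the deployment
    task to a worker node (here, trivially to the least-loaded one)."""
    def __init__(self, workers):
        self.workers = workers

    def handle_deploy_request(self, task):
        worker = min(self.workers, key=lambda w: len(w.loaded))
        return worker.deploy(task)

workers = [WorkerNode(), WorkerNode()]
master = MasterNode(workers)
task = {"prediction_desc": {"steps": [
    {"exec": "feat.py", "model": None},
    {"exec": "score.py", "model": "lr.model"},
]}}
n = master.handle_deploy_request(task)
print(n)  # number of steps loaded on the chosen worker
```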
12. The model prediction method according to claim 8, characterized in that before the loading the execution file of each prediction processing step and the model it uses according to the prediction description file, the method further comprises:
loading a description file of an algorithm; the description file of the algorithm containing description information used to describe the dependencies among the multiple algorithm processing steps;
loading the execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step in the algorithm.
13. A model training method, comprising:
after receiving a user's instruction to train a model, loading the execution file of an algorithm according to a description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependencies among the execution files of the algorithm; the execution file of the algorithm includes one or more of: an execution file uploaded by the user, and a pre-saved execution file;
training the model corresponding to the algorithm according to training data.
14. The model training method according to claim 13, characterized by further comprising:
after receiving a user's instruction to update the algorithm, performing one or more of the following operations:
replacing the description file of the original algorithm with a description file of the algorithm uploaded by the user, and loading the execution file of the algorithm according to the replaced description file;
replacing the execution file of the original algorithm with an execution file of the algorithm uploaded by the user.
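The update operations of claim 14, combined with the storage split of claim 3 (description files in a metadata system, execution files in a file system), suggest a store of roughly the following shape. `AlgorithmStore` and its dict-backed "metadata system" and "file system" are illustrative stand-ins:

```python
class AlgorithmStore:
    """Hypothetical store keeping each algorithm's description file in a
    metadata system and its execution files in a file system."""
    def __init__(self):
        self.metadata = {}   # algorithm name -> description file contents
        self.files = {}      # path -> execution file contents

    def update(self, name, new_desc=None, new_exec=None):
        # One or both operations from claim 14 may apply per update.
        if new_desc is not None:
            self.metadata[name] = new_desc       # replace description file
        if new_exec is not None:
            path, contents = new_exec
            self.files[path] = contents          # replace execution file

store = AlgorithmStore()
store.update("lr", new_desc='{"execution_file": "lr_v1.py"}')
store.update("lr", new_desc='{"execution_file": "lr_v2.py"}',
             new_exec=("lr_v2.py", "print('train v2')"))
print(store.metadata["lr"])
```

Keeping the two stores independent is what makes the claim's "one or more of" wording natural: a user can swap only the description file, only an execution file, or both in one update.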
15. The model training method according to claim 13, characterized in that the loading the execution file of the algorithm according to the description file of the algorithm uploaded by the user comprises:
a scheduling engine dispatching an executor according to the description file of the algorithm;
after the executor starts, loading the execution file of the algorithm.
16. A model training apparatus, characterized by comprising:
a loading module for loading a description file of an algorithm; the description file of the algorithm containing description information used to describe the dependencies among the multiple algorithm processing steps;
a training module for loading the execution file of the algorithm according to the description file of the algorithm and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step in the algorithm.
17. The model training apparatus according to claim 16, characterized by further comprising:
a deployment module for, after receiving a request for deploying a model, loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
18. The model training apparatus according to claim 17, characterized by further comprising:
a prediction module for, after the deployment module loads the execution file of each prediction processing step and the model it uses according to the prediction description file, performing each prediction processing step on prediction data according to the prediction description file;
wherein performing each prediction processing step on the prediction data comprises performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
19. A model prediction device, characterized by comprising:
a deployment module for loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
a prediction module for performing each prediction processing step on prediction data according to the prediction description file.
20. The model prediction device according to claim 19, characterized in that the performing each prediction processing step on prediction data comprises:
performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
21. The model prediction device according to claim 19, characterized by further comprising:
a loading module for loading a description file of an algorithm; the description file of the algorithm containing description information used to describe the dependencies among the multiple algorithm processing steps;
a training module for loading the execution file of the algorithm according to the description file of the algorithm and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step in the algorithm.
22. A model training apparatus, characterized by comprising:
a scheduling module for, after receiving a user's instruction to train a model, loading the execution file of an algorithm according to a description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependencies among the execution files of the algorithm; the execution file of the algorithm includes one or more of: an execution file uploaded by the user, and a pre-saved execution file;
an algorithm running module for training the model corresponding to the algorithm according to training data.
23. An electronic device for model training, comprising a memory and a processor;
characterized in that:
the memory is used to store a program for model training; when the program for model training is read and executed by the processor, the following operations are performed:
loading a description file of an algorithm; the description file of the algorithm containing description information used to describe the dependencies among the multiple algorithm processing steps;
loading the execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step in the algorithm.
24. The electronic device according to claim 23, characterized in that when the program for model training is read and executed by the processor, the following operations are further performed after loading the execution file of the algorithm according to the description file of the algorithm and performing model training according to training data:
after receiving a request for deploying a model, loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps.
25. The electronic device according to claim 24, characterized in that when the program for model training is read and executed by the processor, the following operations are further performed after loading the execution file of each prediction processing step and the model it uses according to the prediction description file:
performing each prediction processing step on prediction data according to the prediction description file;
wherein performing each prediction processing step on the prediction data comprises performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
26. An electronic device for model prediction, comprising a memory and a processor;
characterized in that:
the memory is used to store a program for model prediction; when the program for model prediction is read and executed by the processor, the following operations are performed:
loading the execution file of each prediction processing step and the model it uses according to a prediction description file; wherein the prediction description file is used to describe the model used by each prediction processing step included in the prediction process; when the prediction process includes multiple prediction processing steps, the prediction description file is further used to describe the execution order of the prediction processing steps;
performing each prediction processing step on prediction data according to the prediction description file.
27. The electronic device according to claim 26, characterized in that the performing each prediction processing step on prediction data comprises:
performing the following operations for each prediction processing step: executing the execution file of the prediction processing step, and inputting the input data of the prediction processing step into the model used by the prediction processing step to obtain the output data of the prediction processing step;
wherein the input data of the first prediction processing step is the prediction data, and the input data of each other prediction processing step is the output data of the preceding prediction processing step.
28. The electronic device according to claim 26, characterized in that when the program for model prediction is read and executed by the processor, the following operations are further performed before loading the execution file of each prediction processing step and the model it uses according to the prediction description file:
loading a description file of an algorithm; the description file of the algorithm containing description information used to describe the dependencies among the multiple algorithm processing steps;
loading the execution file of the algorithm according to the description file of the algorithm, and performing model training according to training data; wherein the execution file of the algorithm includes the execution file of each algorithm processing step in the algorithm.
29. An electronic device for model training, comprising a memory and a processor;
characterized in that:
the memory is used to store a program for model training; when the program for model training is read and executed by the processor, the following operations are performed:
after receiving a user's instruction to train a model, loading the execution file of an algorithm according to a description file of the algorithm uploaded by the user; wherein the description file of the algorithm is used to describe the dependencies among the execution files of the algorithm; the execution file of the algorithm includes one or more of: an execution file uploaded by the user, and a pre-saved execution file;
training the model corresponding to the algorithm according to training data.
30. A machine learning platform, characterized by comprising:
a publishing module for receiving and saving an algorithm file uploaded by a user;
an algorithm processing module for loading the algorithm file uploaded by the user and performing model training according to training data.
31. A method for model training by a machine learning platform, characterized by comprising:
receiving and saving an algorithm file uploaded by a user;
loading the algorithm file uploaded by the user, and performing model training according to training data.
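End to end, independent claims 30 and 31 reduce to: receive and save an uploaded algorithm file, then load it and train on training data. A toy sketch of that flow, in which the platform class, the callable "algorithm file", and all names are illustrative assumptions:

```python
class MachineLearningPlatform:
    """Hypothetical platform with a publishing module (receive + save the
    uploaded algorithm file) and an algorithm processing module (load it
    and train on training data)."""
    def __init__(self):
        self.saved = {}

    def publish(self, name, algorithm_file):
        self.saved[name] = algorithm_file      # receive and save

    def train(self, name, training_data):
        algorithm = self.saved[name]           # load the saved file
        # Stand-in for real training: the "algorithm" is a callable here.
        return algorithm(training_data)

platform = MachineLearningPlatform()
platform.publish("mean", lambda data: sum(data) / len(data))
model = platform.train("mean", [1, 2, 3, 4])
print(model)  # 2.5
```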
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611153818.2A CN108229686B (en) | 2016-12-14 | 2016-12-14 | Model training and predicting method and device, electronic equipment and machine learning platform |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108229686A true CN108229686A (en) | 2018-06-29 |
CN108229686B CN108229686B (en) | 2022-07-05 |
Family
ID=62639023
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611153818.2A Active CN108229686B (en) | 2016-12-14 | 2016-12-14 | Model training and predicting method and device, electronic equipment and machine learning platform |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108229686B (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107766940A (en) * | 2017-11-20 | 2018-03-06 | 北京百度网讯科技有限公司 | Method and apparatus for generation model |
CN109255234A (en) * | 2018-08-15 | 2019-01-22 | 腾讯科技(深圳)有限公司 | Processing method, device, medium and the electronic equipment of machine learning model |
CN109376012A (en) * | 2018-10-10 | 2019-02-22 | 电子科技大学 | A kind of self-adapting task scheduling method based on Spark for isomerous environment |
CN110188884A (en) * | 2019-05-14 | 2019-08-30 | 深圳极视角科技有限公司 | A kind of data processing method and Inference Platform |
CN110210624A (en) * | 2018-07-05 | 2019-09-06 | 第四范式(北京)技术有限公司 | Execute method, apparatus, equipment and the storage medium of machine-learning process |
CN110209574A (en) * | 2019-05-14 | 2019-09-06 | 深圳极视角科技有限公司 | A kind of data mining system based on artificial intelligence |
CN110795077A (en) * | 2019-09-26 | 2020-02-14 | 北京你财富计算机科技有限公司 | Software development method and device based on artificial intelligence and electronic equipment |
CN111310934A (en) * | 2020-02-14 | 2020-06-19 | 北京百度网讯科技有限公司 | Model generation method and device, electronic equipment and storage medium |
CN111338630A (en) * | 2018-11-30 | 2020-06-26 | 上海寒武纪信息科技有限公司 | Method and device for generating universal machine learning model file and storage medium |
CN111523676A (en) * | 2020-04-17 | 2020-08-11 | 第四范式(北京)技术有限公司 | Method and device for assisting machine learning model to be online |
CN111522570A (en) * | 2020-06-19 | 2020-08-11 | 杭州海康威视数字技术股份有限公司 | Target library updating method and device, electronic equipment and machine-readable storage medium |
CN111625202A (en) * | 2020-07-28 | 2020-09-04 | 上海聪链信息科技有限公司 | Algorithm extension customizing method and system of block chain chip |
CN111917579A (en) * | 2020-07-30 | 2020-11-10 | 云知声智能科技股份有限公司 | Distributed training method, device, equipment and storage medium |
CN112288133A (en) * | 2020-09-28 | 2021-01-29 | 珠海大横琴科技发展有限公司 | Algorithm service processing method and device |
CN112529167A (en) * | 2020-12-25 | 2021-03-19 | 东云睿连(武汉)计算技术有限公司 | Interactive automatic training system and method for neural network |
CN112541836A (en) * | 2020-12-10 | 2021-03-23 | 贵州电网有限责任公司 | Multi-energy system digital twin application process modeling and deployment method and system |
CN112667303A (en) * | 2019-09-27 | 2021-04-16 | 杭州海康威视数字技术股份有限公司 | Method and device for processing artificial intelligence task |
CN112799658A (en) * | 2021-04-12 | 2021-05-14 | 北京百度网讯科技有限公司 | Model training method, model training platform, electronic device, and storage medium |
CN113127182A (en) * | 2019-12-30 | 2021-07-16 | 中国移动通信集团上海有限公司 | Deep learning scheduling configuration system and method |
CN113129049A (en) * | 2019-12-31 | 2021-07-16 | 上海哔哩哔哩科技有限公司 | File configuration method and system for model training and application |
US11307836B2 (en) | 2018-06-08 | 2022-04-19 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
WO2022120200A3 (en) * | 2020-12-04 | 2022-07-21 | Google Llc | Example-based voice bot development techniques |
US11804211B2 (en) | 2020-12-04 | 2023-10-31 | Google Llc | Example-based voice bot development techniques |
US11902222B2 (en) | 2021-02-08 | 2024-02-13 | Google Llc | Updating trained voice bot(s) utilizing example-based voice bot development techniques |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102624865A (en) * | 2012-01-09 | 2012-08-01 | 浙江大学 | Cluster load prediction method and distributed cluster management system |
CN103886203A (en) * | 2014-03-24 | 2014-06-25 | 美商天睿信息系统(北京)有限公司 | Automatic modeling system and method based on index prediction |
CN105022670A (en) * | 2015-07-17 | 2015-11-04 | 中国海洋大学 | Heterogeneous distributed task processing system and processing method in cloud computing platform |
CN105378699A (en) * | 2013-11-27 | 2016-03-02 | Ntt都科摩公司 | Automatic task classification based upon machine learning |
CN105575389A (en) * | 2015-12-07 | 2016-05-11 | 百度在线网络技术(北京)有限公司 | Model training method, system and device |
CN105956021A (en) * | 2016-04-22 | 2016-09-21 | 华中科技大学 | Automated task parallel method suitable for distributed machine learning and system thereof |
CN106095391A (en) * | 2016-05-31 | 2016-11-09 | 携程计算机技术(上海)有限公司 | Based on big data platform and the computational methods of algorithm model and system |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102624865A (en) * | 2012-01-09 | 2012-08-01 | 浙江大学 | Cluster load prediction method and distributed cluster management system |
CN105378699A (en) * | 2013-11-27 | 2016-03-02 | Ntt都科摩公司 | Automatic task classification based upon machine learning |
CN103886203A (en) * | 2014-03-24 | 2014-06-25 | 美商天睿信息系统(北京)有限公司 | Automatic modeling system and method based on index prediction |
CN105022670A (en) * | 2015-07-17 | 2015-11-04 | 中国海洋大学 | Heterogeneous distributed task processing system and processing method in cloud computing platform |
CN105575389A (en) * | 2015-12-07 | 2016-05-11 | 百度在线网络技术(北京)有限公司 | Model training method, system and device |
CN105956021A (en) * | 2016-04-22 | 2016-09-21 | 华中科技大学 | Automated task parallel method suitable for distributed machine learning and system thereof |
CN106095391A (en) * | 2016-05-31 | 2016-11-09 | 携程计算机技术(上海)有限公司 | Based on big data platform and the computational methods of algorithm model and system |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107766940B (en) * | 2017-11-20 | 2021-07-23 | 北京百度网讯科技有限公司 | Method and apparatus for generating a model |
CN107766940A (en) * | 2017-11-20 | 2018-03-06 | 北京百度网讯科技有限公司 | Method and apparatus for generation model |
US11726754B2 (en) | 2018-06-08 | 2023-08-15 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
US11403080B2 (en) | 2018-06-08 | 2022-08-02 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
US11379199B2 (en) | 2018-06-08 | 2022-07-05 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
US11334329B2 (en) | 2018-06-08 | 2022-05-17 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
US11334330B2 (en) | 2018-06-08 | 2022-05-17 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
US11307836B2 (en) | 2018-06-08 | 2022-04-19 | Shanghai Cambricon Information Technology Co., Ltd. | General machine learning model, and model file generation and parsing method |
CN110210624A (en) * | 2018-07-05 | 2019-09-06 | 第四范式(北京)技术有限公司 | Execute method, apparatus, equipment and the storage medium of machine-learning process |
CN109255234A (en) * | 2018-08-15 | 2019-01-22 | 腾讯科技(深圳)有限公司 | Processing method, device, medium and the electronic equipment of machine learning model |
CN109376012A (en) * | 2018-10-10 | 2019-02-22 | 电子科技大学 | A kind of self-adapting task scheduling method based on Spark for isomerous environment |
CN111338630A (en) * | 2018-11-30 | 2020-06-26 | 上海寒武纪信息科技有限公司 | Method and device for generating universal machine learning model file and storage medium |
CN111338630B (en) * | 2018-11-30 | 2022-02-08 | 上海寒武纪信息科技有限公司 | Method and device for generating universal machine learning model file and storage medium |
CN110188884A (en) * | 2019-05-14 | 2019-08-30 | 深圳极视角科技有限公司 | A kind of data processing method and Inference Platform |
CN110209574A (en) * | 2019-05-14 | 2019-09-06 | 深圳极视角科技有限公司 | A kind of data mining system based on artificial intelligence |
CN110795077A (en) * | 2019-09-26 | 2020-02-14 | 北京你财富计算机科技有限公司 | Software development method and device based on artificial intelligence and electronic equipment |
CN112667303A (en) * | 2019-09-27 | 2021-04-16 | 杭州海康威视数字技术股份有限公司 | Method and device for processing artificial intelligence task |
CN113127182A (en) * | 2019-12-30 | 2021-07-16 | 中国移动通信集团上海有限公司 | Deep learning scheduling configuration system and method |
CN113129049A (en) * | 2019-12-31 | 2021-07-16 | 上海哔哩哔哩科技有限公司 | File configuration method and system for model training and application |
CN113129049B (en) * | 2019-12-31 | 2023-07-28 | 上海哔哩哔哩科技有限公司 | File configuration method and system for model training and application |
CN111310934A (en) * | 2020-02-14 | 2020-06-19 | 北京百度网讯科技有限公司 | Model generation method and device, electronic equipment and storage medium |
CN111310934B (en) * | 2020-02-14 | 2023-10-17 | 北京百度网讯科技有限公司 | Model generation method and device, electronic equipment and storage medium |
CN111523676A (en) * | 2020-04-17 | 2020-08-11 | 第四范式(北京)技术有限公司 | Method and device for assisting machine learning model to be online |
WO2021208774A1 (en) * | 2020-04-17 | 2021-10-21 | 第四范式(北京)技术有限公司 | Method and apparatus for assisting machine learning model to go online |
CN111523676B (en) * | 2020-04-17 | 2024-04-12 | 第四范式(北京)技术有限公司 | Method and device for assisting machine learning model to be online |
CN111522570A (en) * | 2020-06-19 | 2020-08-11 | 杭州海康威视数字技术股份有限公司 | Target library updating method and device, electronic equipment and machine-readable storage medium |
CN111522570B (en) * | 2020-06-19 | 2023-09-05 | 杭州海康威视数字技术股份有限公司 | Target library updating method and device, electronic equipment and machine-readable storage medium |
CN111625202A (en) * | 2020-07-28 | 2020-09-04 | 上海聪链信息科技有限公司 | Algorithm extension customizing method and system of block chain chip |
CN111625202B (en) * | 2020-07-28 | 2021-03-09 | 上海聪链信息科技有限公司 | Algorithm extension customizing method and system of block chain chip |
CN111917579A (en) * | 2020-07-30 | 2020-11-10 | 云知声智能科技股份有限公司 | Distributed training method, device, equipment and storage medium |
CN112288133A (en) * | 2020-09-28 | 2021-01-29 | 珠海大横琴科技发展有限公司 | Algorithm service processing method and device |
WO2022120200A3 (en) * | 2020-12-04 | 2022-07-21 | Google Llc | Example-based voice bot development techniques |
US11804211B2 (en) | 2020-12-04 | 2023-10-31 | Google Llc | Example-based voice bot development techniques |
CN112541836A (en) * | 2020-12-10 | 2021-03-23 | 贵州电网有限责任公司 | Multi-energy system digital twin application process modeling and deployment method and system |
WO2022134600A1 (en) * | 2020-12-25 | 2022-06-30 | 东云睿连(武汉)计算技术有限公司 | Interactive automatic training system and method for neural network |
CN112529167A (en) * | 2020-12-25 | 2021-03-19 | 东云睿连(武汉)计算技术有限公司 | Interactive automatic training system and method for neural network |
CN112529167B (en) * | 2020-12-25 | 2024-05-14 | 东云睿连(武汉)计算技术有限公司 | Neural network interactive automatic training system and method |
US11902222B2 (en) | 2021-02-08 | 2024-02-13 | Google Llc | Updating trained voice bot(s) utilizing example-based voice bot development techniques |
CN112799658A (en) * | 2021-04-12 | 2021-05-14 | 北京百度网讯科技有限公司 | Model training method, model training platform, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108229686B (en) | 2022-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108229686A (en) | Model training and prediction method, device, electronic equipment and machine learning platform | |
CN113261003B (en) | Generating integrated circuit plan using neural networks | |
Barbieri et al. | A virtual commissioning based methodology to integrate digital twins into manufacturing systems | |
US10372859B2 (en) | System and method for designing system on chip (SoC) circuits using single instruction multiple agent (SIMA) instructions | |
CN107077385B (en) | For reducing system, method and the storage medium of calculated examples starting time | |
Efthymiou et al. | On knowledge reuse for manufacturing systems design and planning: A semantic technology approach | |
CN114036704A (en) | Controller system for supervisory control of independent transport technology tracks and lines | |
CN110622139A (en) | Building management system development and control platform | |
Bisong et al. | Tensorflow 2.0 and keras | |
CN114154641A (en) | AI model training method and device, computing equipment and storage medium | |
KR101886317B1 (en) | Apparatus and method for selecting placement of virtual machine | |
CN109754090A (en) | It supports to execute distributed system and method that more machine learning model predictions service | |
JP7246447B2 (en) | Model training method, apparatus, electronic device, storage medium, development system and program | |
Al-Azawi et al. | Towards agent-based agile approach for game development methodology | |
CN101373432A (en) | Method and system for predicting component system performance based on intermediate part | |
CN111985631B (en) | Information processing apparatus, information processing method, and computer-readable recording medium | |
US10402399B2 (en) | Computer implemented system and method for dynamically optimizing business processes | |
CN116127899A (en) | Chip design system, method, electronic device, and storage medium | |
CN116868202A (en) | Data processing method, device, equipment and medium | |
Blondet et al. | An ontology for numerical design of experiments processes | |
Krenczyk et al. | ERP, APS and simulation systems integration to support production planning and scheduling | |
US20190005169A1 (en) | Dynamic Design of Complex System-of-Systems for Planning and Adaptation to Unplanned Scenarios | |
Borsodi et al. | Generative Design: An overview and its relationship to artificial intelligence | |
CN117311778A (en) | Application management method and device | |
US20230222385A1 (en) | Evaluation method, evaluation apparatus, and non-transitory computer-readable recording medium storing evaluation program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
GR01 | Patent grant | | |