CN107844837A - Method and system for performing algorithm parameter tuning for a machine learning algorithm - Google Patents
- Publication number: CN107844837A
- Application number: CN201711048805.3A
- Authority
- CN
- China
- Prior art keywords
- algorithm
- machine learning
- algorithm parameter
- parameter value
- candidate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/445—Program loading or initiating
- G06F9/44505—Configuring for program initiating, e.g. using registry, configuration files
- G06F9/4451—User profiles; Roaming
Abstract
A method and system for performing algorithm parameter tuning for a machine learning algorithm are provided. The method includes: (A) determining a machine learning algorithm to be used for training a machine learning model; (B) providing a user with a graphical interface for setting parameter-tuning configuration items of the machine learning algorithm, wherein the parameter-tuning configuration items define how to generate multiple groups of candidate algorithm parameter values; (C) receiving input operations performed by the user on the graphical interface to set the parameter-tuning configuration items, and obtaining the configuration items set by the user according to the input operations; (D) generating the multiple groups of candidate algorithm parameter values based on the obtained configuration items; (E) training, under each group of candidate algorithm parameter values, a machine learning model corresponding to that group according to the machine learning algorithm; (F) evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
Description
Technical field
The present invention relates generally to the field of artificial intelligence and, more particularly, to a method and system for performing algorithm parameter tuning for a machine learning algorithm.
Background technology
At present, the basic process of training a machine learning model mainly includes:
1. Importing a data set (for example, a data table) containing historical data records;
2. Performing feature engineering, in which various kinds of processing are applied to the attribute information of the data records in the data set to obtain features (which may include, for example, combined features); the feature vectors formed by these features can serve as machine learning samples;
3. Training the model, in which a model is learned from the machine learning samples obtained through feature engineering, according to a configured machine learning algorithm (for example, a logistic regression algorithm, a decision tree algorithm, a neural network algorithm, and so on). Here, the algorithm parameters of the machine learning algorithm have a significant influence on the quality of the learned model.
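The three-step process above can be sketched end to end as follows. This is a pure-Python illustration; the inline records, the feature-engineering function, and the trivial stand-in "learner" are hypothetical placeholders, not part of the patent's method:

```python
# 1. Import a data set of historical data records (here, inline rows).
records = [{"age": 20, "clicks": 3, "label": 1},
           {"age": 35, "clicks": 0, "label": 0}]

# 2. Feature engineering: turn each record's attributes into a feature vector.
def to_sample(rec):
    return [rec["age"] / 100.0, rec["clicks"] / 10.0], rec["label"]

samples = [to_sample(r) for r in records]

# 3. Train a model from the samples (a trivial rule stands in for a real learner).
def train(samples):
    # Placeholder "model": predicts 1 when the clicks feature is non-zero.
    return lambda features: 1 if features[1] > 0 else 0

model = train(samples)
print(model([0.2, 0.3]))  # 1
```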
On existing machine learning platforms, the process of training a machine learning model can be completed through an interactive, graphics-based interface, without requiring the user to write program code in person. In the model-training step, however, manually chosen algorithm parameter values must often be typed into the platform system by hand. That is, the user needs to perform algorithm parameter tuning in advance, and automated algorithm parameter tuning cannot be effectively achieved through the platform.
Algorithm parameter tuning, however, is often rather complicated and usually requires manual, one-by-one adjustment, so it is a time-consuming task. Moreover, to obtain algorithm parameter values that yield a good model, the user needs to understand the principles underlying machine learning: the meaning and scope of influence of each algorithm parameter, the interactions between algorithm parameters, and so on. The technical threshold is therefore high, and the user must experiment continually, which greatly impairs both the efficiency and the experience of model training.
The content of the invention
Exemplary embodiments of the present invention provide a method and system for performing algorithm parameter tuning for a machine learning algorithm, so as to solve the problem in the prior art that automated algorithm parameter tuning cannot easily be performed, within a machine learning system, for the machine learning algorithm used to train a machine learning model.
According to an exemplary embodiment of the present invention, there is provided a method for performing algorithm parameter tuning for a machine learning algorithm, including: (A) determining a machine learning algorithm to be used for training a machine learning model; (B) providing a user with a graphical interface for setting parameter-tuning configuration items of the machine learning algorithm, wherein the parameter-tuning configuration items define how to generate multiple groups of candidate algorithm parameter values, and each group of candidate algorithm parameter values includes one candidate value for each to-be-tuned algorithm parameter of the machine learning algorithm; (C) receiving input operations performed by the user on the graphical interface to set the parameter-tuning configuration items, and obtaining the configuration items set by the user according to the input operations; (D) generating the multiple groups of candidate algorithm parameter values based on the obtained configuration items; (E) training, under each group of candidate algorithm parameter values, a machine learning model corresponding to that group according to the machine learning algorithm; (F) evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
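Steps (D) through (F) above amount to a generate-train-evaluate loop over candidate parameter groups. A minimal sketch in Python; the `train` and `evaluate` callables here are hypothetical placeholders, not interfaces defined by the patent:

```python
def tune(candidate_groups, train, evaluate):
    """Steps (D)-(F): train one model per candidate group, then score each."""
    results = []
    for params in candidate_groups:                # (E) one model per group
        model = train(params)
        results.append((params, evaluate(model)))  # (F) assess the effect
    return results

# Toy usage: "training" just records the parameter; the score is a known function.
candidates = [{"lr": 0.1}, {"lr": 0.5}, {"lr": 1.0}]
scores = tune(candidates,
              train=lambda p: p["lr"],
              evaluate=lambda m: 1.0 - abs(m - 0.5))  # best effect near lr=0.5
best_params, best_score = max(scores, key=lambda r: r[1])
print(best_params)  # {'lr': 0.5}
```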
Optionally, the method further includes: (G) presenting to the user the generated groups of candidate algorithm parameter values and the effect of the trained machine learning model corresponding to each group.
Optionally, the method further includes: (H) directly setting the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect, and applying the set algorithm parameter values in subsequent model-training steps.
Optionally, the parameter-tuning configuration items include at least one of the following: an initial value configuration item, for specifying the initial value of a to-be-tuned algorithm parameter, so that in step (D) at least one candidate value for the parameter is generated based on the specified initial value; a value range configuration item, for specifying the value range of a to-be-tuned algorithm parameter, so that in step (D) at least one candidate value for the parameter is generated based on the specified value range; a tuning method configuration item, for specifying the method of generating the multiple groups of candidate algorithm parameter values, so that in step (D) the multiple groups are generated from the at least one candidate value of each to-be-tuned algorithm parameter according to the specified method.
Optionally, in step (E), the machine learning models corresponding to the groups of candidate algorithm parameter values are trained in parallel, wherein, while these models are being trained in parallel, the parameters of the model corresponding to each group are maintained by a parameter server, the parameters have the form of key-value pairs, and the parameter server stores multiple key-value pairs that share the same key in the form of a single key mapped to multiple values.
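The single-key, multiple-values layout described above can be sketched as follows. This is a simplified in-memory stand-in for the parameter server; the class and method names are illustrative assumptions, not from the patent:

```python
class ParamStore:
    """Holds parameters for several concurrently trained models.

    Instead of storing one (key, value) pair per model, each key maps to a
    list of values -- one slot per candidate parameter group -- so the key
    itself is stored only once no matter how many models are trained.
    """
    def __init__(self, num_models):
        self.num_models = num_models
        self.table = {}  # key -> [value for model 0, value for model 1, ...]

    def set(self, key, model_idx, value):
        slots = self.table.setdefault(key, [0.0] * self.num_models)
        slots[model_idx] = value

    def get(self, key, model_idx):
        return self.table[key][model_idx]

store = ParamStore(num_models=3)
store.set("w1", 0, 0.25)
store.set("w1", 2, -0.75)
print(store.get("w1", 2))   # -0.75
print(len(store.table))     # 1  (key "w1" stored once for all three models)
```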
Optionally, in step (E), the machine learning models corresponding to the groups of candidate algorithm parameter values are trained in parallel by multiple computing devices, wherein, while these models are being trained in parallel, the parameters of the model corresponding to each group are maintained by a parameter server. The parameter server includes at least one server end and multiple clients, the clients correspond one-to-one with the computing devices, and each client is integrated with its corresponding computing device. The at least one server end stores the parameters of the machine learning models corresponding to the groups of candidate algorithm parameter values; each client transmits, to or from one or more server ends, parameter operation instructions concerning the parameters involved in the machine learning algorithm under at least one group of candidate algorithm parameter values, and the computing device corresponding to each client is configured to train machine learning models according to the machine learning algorithm under that at least one group of candidate algorithm parameter values. Within a parameter operation instruction, identical keys are compressed and/or merged.
Optionally, in step (E), identical data-streaming computations concerning machine learning model training are performed under the groups of candidate algorithm parameter values according to the machine learning algorithm, wherein the data-streaming computations are executed after merging the processing steps that the individual computations have in common.
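The idea of merging the common upstream steps of otherwise identical streams can be sketched as follows. The share-then-branch scheme here is an illustrative assumption; the patent does not prescribe a particular implementation:

```python
call_count = {"preprocess": 0}

def preprocess(data):
    """Shared upstream step (e.g. feature extraction) common to every stream."""
    call_count["preprocess"] += 1
    return [x * 2 for x in data]

def run_streams(data, param_groups):
    shared = preprocess(data)  # common step executed once, not once per group
    return [sum(shared) * g["k"] for g in param_groups]  # per-group tail steps

out = run_streams([1, 2, 3], [{"k": 1}, {"k": 3}])
print(out)                       # [12, 36]
print(call_count["preprocess"])  # 1
```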
Optionally, the method further includes: (I) setting the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to a group of candidate algorithm parameter values that the user selects from the displayed groups, and applying the set algorithm parameter values in subsequent model-training steps.
Optionally, the method further includes: (J) saving, in the form of a configuration file, the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect.
Optionally, the method further includes: (K) saving, in the form of a configuration file, a group of candidate algorithm parameter values that the user selects from the displayed groups.
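Steps (J) and (K) both come down to serializing one parameter group. A minimal sketch using JSON as the configuration-file format — the format and file name are assumptions, since the patent only says "configuration file":

```python
import json
import os
import tempfile

def save_param_group(params, path):
    """Persist one group of candidate algorithm parameter values."""
    with open(path, "w") as f:
        json.dump(params, f, indent=2)

def load_param_group(path):
    with open(path) as f:
        return json.load(f)

best = {"learning_rate": 0.05, "l1": 0.0, "l2": 1.0, "max_epochs": 10}
path = os.path.join(tempfile.gettempdir(), "best_params.json")
save_param_group(best, path)
print(load_param_group(path) == best)  # True
```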
According to another exemplary embodiment of the present invention, there is provided a system for performing algorithm parameter tuning for a machine learning algorithm, including: an algorithm determining device, for determining a machine learning algorithm to be used for training a machine learning model; a display device, for providing a user with a graphical interface for setting parameter-tuning configuration items of the machine learning algorithm, wherein the parameter-tuning configuration items define how to generate multiple groups of candidate algorithm parameter values, and each group includes one candidate value for each to-be-tuned algorithm parameter of the machine learning algorithm; a configuration item acquisition device, for receiving input operations performed by the user on the graphical interface to set the parameter-tuning configuration items, and obtaining the configuration items set by the user according to the input operations; an algorithm parameter value generation device, for generating the multiple groups of candidate algorithm parameter values based on the obtained configuration items; at least one computing device, for training, under each group of candidate algorithm parameter values, a machine learning model corresponding to that group according to the machine learning algorithm; and an evaluating device, for evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
Optionally, the display device also presents to the user the generated groups of candidate algorithm parameter values and the effect of the trained machine learning model corresponding to each group.
Optionally, the system further includes: an application device, for directly setting the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect, and applying the set algorithm parameter values in subsequent model-training steps.
Optionally, the parameter-tuning configuration items include at least one of the following: an initial value configuration item, for specifying the initial value of a to-be-tuned algorithm parameter, so that the algorithm parameter value generation device generates at least one candidate value for the parameter based on the specified initial value; a value range configuration item, for specifying the value range of a to-be-tuned algorithm parameter, so that the algorithm parameter value generation device generates at least one candidate value for the parameter based on the specified value range; a tuning method configuration item, for specifying the method of generating the multiple groups of candidate algorithm parameter values, so that the algorithm parameter value generation device generates the multiple groups from the at least one candidate value of each to-be-tuned algorithm parameter according to the specified method.
Optionally, the at least one computing device trains the machine learning models corresponding to the groups of candidate algorithm parameter values in parallel, wherein, while the at least one computing device trains these models in parallel, the parameters of the model corresponding to each group are maintained by a parameter server, the parameters have the form of key-value pairs, and the parameter server stores multiple key-value pairs that share the same key in the form of a single key mapped to multiple values.
Optionally, the system includes multiple computing devices, which train the machine learning models corresponding to the groups of candidate algorithm parameter values in parallel, wherein, while the multiple computing devices train these models in parallel, the parameters of the model corresponding to each group are maintained by a parameter server. The parameter server includes at least one server end and multiple clients, the clients correspond one-to-one with the computing devices, and each client is integrated with its corresponding computing device. The at least one server end stores the parameters of the machine learning models corresponding to the groups of candidate algorithm parameter values; each client transmits, to or from one or more server ends, parameter operation instructions concerning the parameters involved in the machine learning algorithm under at least one group of candidate algorithm parameter values, and the computing device corresponding to each client is configured to train machine learning models according to the machine learning algorithm under that at least one group. Within a parameter operation instruction, identical keys are compressed and/or merged.
Optionally, the at least one computing device performs, under each group of candidate algorithm parameter values and according to the machine learning algorithm, identical data-streaming computations concerning machine learning model training, wherein the data-streaming computations are executed after merging the processing steps that the individual computations have in common.
Optionally, the system further includes: an application device, for setting the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to a group of candidate algorithm parameter values that the user selects from the displayed groups, and applying the set algorithm parameter values in subsequent model-training steps.
Optionally, the system further includes: a saving device, for saving, in the form of a configuration file, the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect.
Optionally, the system further includes: a saving device, for saving, in the form of a configuration file, a group of candidate algorithm parameter values that the user selects from the displayed groups.
According to another exemplary embodiment of the present invention, there is provided a computer-readable medium for performing algorithm parameter tuning for a machine learning algorithm, wherein a computer program for executing the method for performing algorithm parameter tuning for a machine learning algorithm as described above is recorded on the computer-readable medium.
According to another exemplary embodiment of the present invention, there is provided a computer for performing algorithm parameter tuning for a machine learning algorithm, including a storage component and a processor, wherein a set of computer-executable instructions is stored in the storage component, and when the set of computer-executable instructions is executed by the processor, the method for performing algorithm parameter tuning for a machine learning algorithm as described above is performed.
The method and system for performing algorithm parameter tuning for a machine learning algorithm according to exemplary embodiments of the present invention provide a convenient, efficient, and interaction-friendly algorithm parameter tuning process: through the interactive interface, the user only needs to supply the configuration items that define how to generate the multiple groups of candidate algorithm parameter values, and automated algorithm parameter tuning is then achieved, which both improves the user experience and improves the effect of the machine learning model.
Further aspects and/or advantages of the present general inventive concept will be set forth in part in the description that follows, will in part be apparent from the description, or may be learned by practice of the present general inventive concept.
Brief description of the drawings
The above and other objects and features of exemplary embodiments of the present invention will become more apparent from the following description made with reference to the accompanying drawings, which exemplarily illustrate the embodiments, wherein:
Fig. 1 is a flowchart of a method for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention;
Fig. 2 shows an example of storing the parameters of machine learning models according to an exemplary embodiment of the present invention;
Fig. 3 is a flowchart of a method for performing algorithm parameter tuning for a machine learning algorithm according to another exemplary embodiment of the present invention;
Fig. 4 and Fig. 5 show examples of graphical interfaces for setting the parameter-tuning configuration items of a machine learning algorithm according to exemplary embodiments of the present invention;
Fig. 6 shows an example of an algorithm parameter tuning analysis report according to an exemplary embodiment of the present invention;
Fig. 7 shows an example of a DAG diagram for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention;
Fig. 8 is a block diagram of a system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention.
Embodiment
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like parts throughout. The embodiments are described below with reference to the drawings in order to explain the present invention.
Here, machine learning is an inevitable product of the development of artificial intelligence research to a certain stage; it is devoted to improving the performance of a system itself by computational means, using experience. In a computer system, "experience" usually exists in the form of "data", and a "model" can be produced from data by a machine learning algorithm. That is to say, when empirical data are supplied to a machine learning algorithm, a model can be produced based on those data, and when faced with a new situation the model provides a corresponding judgment, i.e., a prediction result. Whether a machine learning model is being trained or a trained model is being used for prediction, the data need to be converted into machine learning samples that include various features. Machine learning may be implemented in the form of "supervised learning", "unsupervised learning", or "semi-supervised learning"; it should be noted that the exemplary embodiments of the present invention place no particular limitation on the specific machine learning algorithm. It should further be noted that other means, such as statistical algorithms, may also be combined during the training and application of a model.
Fig. 1 is a flowchart of a method for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention. Here, as an example, the method may be executed by a computer program, or may be executed by a dedicated system or computer for performing algorithm parameter tuning for a machine learning algorithm.
In step S10, a machine learning algorithm to be used for training a machine learning model is determined.
As an example, the machine learning algorithm may be an FTRL (Follow the Regularized Leader) optimization algorithm, a logistic regression algorithm, or another machine learning algorithm; the present invention is not limited in this regard.
As an example, the machine learning algorithm to be used for training the machine learning model may be determined according to input operations performed by the user on a graphical interface provided for designating that algorithm.
In step S20, the user is provided with a graphical interface for setting the parameter-tuning configuration items of the machine learning algorithm, wherein the parameter-tuning configuration items define how to generate multiple groups of candidate algorithm parameter values, and each group includes one candidate value for each to-be-tuned algorithm parameter of the machine learning algorithm. According to an exemplary embodiment of the present invention, the multiple groups of candidate algorithm parameter values used for algorithm parameter tuning can be generated based on the parameter-tuning configuration items set by the user.
It should be understood that the to-be-tuned algorithm parameters of different machine learning algorithms may be different, or may be the same. As an example, when the machine learning algorithm is the FTRL optimization algorithm, its to-be-tuned algorithm parameters may include: the maximum number of training epochs, the learning rate, the L1 regularization coefficient, and the L2 regularization coefficient.
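As a concrete illustration, the to-be-tuned FTRL parameters listed above could be declared as a search space like the following. The dictionary layout, names, and ranges are illustrative assumptions, not an interface defined by the patent:

```python
# Each to-be-tuned parameter maps to what the user supplies on the graphical
# interface: an initial value and a value range (illustrative numbers).
ftrl_search_space = {
    "max_epochs":    {"initial": 1,   "range": (1, 50)},
    "learning_rate": {"initial": 0.1, "range": (1e-4, 1.0)},
    "l1_coef":       {"initial": 0.0, "range": (0.0, 10.0)},
    "l2_coef":       {"initial": 1.0, "range": (0.0, 10.0)},
}

for name, cfg in ftrl_search_space.items():
    lo, hi = cfg["range"]
    assert lo <= cfg["initial"] <= hi  # the initial value must lie in the range
print(sorted(ftrl_search_space))
# ['l1_coef', 'l2_coef', 'learning_rate', 'max_epochs']
```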
In step S30, input operations performed by the user on the graphical interface to set the parameter-tuning configuration items are received, and the configuration items set by the user are obtained according to the input operations.
As an example, the graphical interface provided to the user may include, for each parameter-tuning configuration item, an input control for selecting and/or editing content, so that the configuration items set by the user are obtained by receiving the user's selection and/or editing operations.
In step S40, the multiple groups of candidate algorithm parameter values are generated based on the obtained parameter-tuning configuration items.
As an example, the parameter-tuning configuration items may include at least one of the following: an initial value configuration item, a value range configuration item, and a tuning method configuration item. It should be understood that the parameter-tuning configuration items may also include other configuration items that define how to generate the multiple groups of candidate algorithm parameter values.
In particular, the initial value configuration item is used to specify the initial value of a to-be-tuned algorithm parameter, so that in step S40 at least one candidate value for the parameter is generated based on the specified initial value.
The value range configuration item is used to specify the value range of a to-be-tuned algorithm parameter, so that in step S40 at least one candidate value for the parameter is generated based on the specified value range.
As an example, the value range configuration item may further comprise a sampling range configuration item and a sampling count configuration item. Specifically, the sampling range configuration item specifies the numerical range to be sampled, and the sampling count configuration item specifies the number of samples to be taken, so that in step S40 the specified number of values is sampled within the specified numerical range, and the sampled values are used as candidate values for the to-be-tuned algorithm parameter.
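Sampling a given count of values from a given range can be sketched as follows. Uniform sampling is an assumption here; the patent does not fix the sampling distribution:

```python
import random

def sample_candidates(lo, hi, count, seed=0):
    """Draw `count` candidate values uniformly from the range [lo, hi]."""
    rng = random.Random(seed)  # seeded so the same draw is reproducible
    return [rng.uniform(lo, hi) for _ in range(count)]

candidates = sample_candidates(0.0, 1.0, count=5)
print(len(candidates))                           # 5
print(all(0.0 <= c <= 1.0 for c in candidates))  # True
```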
As another example, the value range configuration item may further comprise a specific value configuration item. Specifically, the specific value configuration item directly specifies particular values of a to-be-tuned algorithm parameter, so that in step S40 the specified values are used as candidate values for the parameter.
As an example, the value range configuration item may include both the sampling range and sampling count configuration items and the specific value configuration item; according to the input operations performed by the user on the graphical interface, either the sampling range and sampling count configuration items set by the user, or the specific value configuration item set by the user, may be obtained.
Moreover, it should be understood that if the parameter-tuning configuration items include both the initial value configuration item and the value range configuration item, then in step S40 at least one candidate value of the to-be-tuned algorithm parameter may be generated based on both the initial value specified by the initial value configuration item and the value range specified by the value range configuration item.
The tuning method configuration item is used to specify the method of generating the multiple groups of candidate algorithm parameter values, so that in step S40 the multiple groups are generated from the at least one candidate value of each to-be-tuned algorithm parameter according to the specified method.
As an example, the method of generating the multiple groups of candidate algorithm parameter values may be one that randomly generates N groups of candidate algorithm parameter values from all the candidate values of the algorithm parameters to be tuned, where each group includes one candidate value for each to-be-tuned algorithm parameter, N is an integer greater than 0, and the value of N may be specified by the tuning method configuration item or set in advance. Here, for example, the method may be the Random Search method.
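The random generation of N groups can be sketched as follows; the per-parameter candidate lists are illustrative, and the draw is seeded so it is reproducible:

```python
import random

def random_search(candidates_per_param, n, seed=0):
    """Randomly pick one candidate value per parameter, N times (one group each)."""
    rng = random.Random(seed)
    return [{name: rng.choice(values)
             for name, values in candidates_per_param.items()}
            for _ in range(n)]

space = {"learning_rate": [0.01, 0.1, 1.0], "l2_coef": [0.0, 0.5, 1.0]}
groups = random_search(space, n=4)
print(len(groups))  # 4
print(all(g["learning_rate"] in space["learning_rate"] for g in groups))  # True
```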
As another example, the method of generating the multiple groups of candidate algorithm parameter values may be one that, from all the candidate values of the algorithm parameters to be tuned, generates all distinct combinations of the candidate values of the to-be-tuned algorithm parameters. Here, for example, the method may be the Grid Search method.
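Grid Search enumerates the Cartesian product of the per-parameter candidate lists. A sketch, with illustrative candidate lists:

```python
from itertools import product

def grid_search(candidates_per_param):
    """Generate every distinct combination of per-parameter candidate values."""
    names = sorted(candidates_per_param)  # fix the parameter order
    return [dict(zip(names, combo))
            for combo in product(*(candidates_per_param[n] for n in names))]

space = {"learning_rate": [0.01, 0.1], "l2_coef": [0.0, 1.0]}
groups = grid_search(space)
print(len(groups))  # 4  (2 x 2 combinations)
print({"learning_rate": 0.01, "l2_coef": 0.0} in groups)  # True
```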
It should be noted, however, that the above examples are only used to illustrate and explain the exemplary embodiments of the present invention, which do not necessarily require the user to configure the above items. For example, at least one candidate value of a to-be-tuned algorithm parameter may be generated based on a preset initial value of the parameter; or at least one candidate value may be generated based on a preset value range of the parameter; or the multiple groups of candidate algorithm parameter values may be generated from the at least one candidate value of each to-be-tuned algorithm parameter according to a preset method of generating the multiple groups.
In step S50, under each group of candidate algorithm parameter values, a machine learning model corresponding to that group is trained according to the machine learning algorithm. In particular, the machine learning algorithm is executed with the algorithm parameter values of the to-be-tuned algorithm parameters set to each group of candidate values in turn, so as to obtain the machine learning model corresponding to each group.
As an example, in step S50 the machine learning models corresponding to the groups of candidate algorithm parameter values may be trained in parallel, which improves the efficiency of algorithm parameter tuning and makes full use of computing resources.
As an example, when the machine learning models corresponding to the groups of candidate algorithm parameter values are trained in parallel, their parameters may be maintained by a parameter server, where each parameter has key-value form and the parameter server saves multiple key-value pairs that share the same key in a form in which a single key corresponds to multiple values. This avoids the linear growth of storage overhead that would result from storing the parameters of the model corresponding to every group of candidate algorithm parameter values separately.
Specifically, the machine learning model corresponding to each group of candidate algorithm parameter values may correspond to one set of key-value pairs; within that set, each key may relate to an aspect of the model and corresponds to its own value. Moreover, the sets of key-value pairs corresponding to the models of different groups of candidate algorithm parameter values have identical keys. As shown in Fig. 2, the model corresponding to the 1st group of candidate algorithm parameter values corresponds to one set of key-value pairs with keys k1, k2, k3, …, km and respective values v11, v12, v13, …, v1m; the model corresponding to the 2nd group corresponds to a set with the same keys k1, k2, k3, …, km and respective values v21, v22, v23, …, v2m; and the model corresponding to the n-th group corresponds to a set with keys k1, k2, k3, …, km and respective values vn1, vn2, vn3, …, vnm, where m is an integer greater than 1 and n is an integer greater than 1. As can be seen, the n sets of key-value pairs have exactly the same keys. Therefore, according to an exemplary embodiment of the present invention, the parameter server may save key-value pairs with the same key in a form in which a single key corresponds to multiple values; that is, the key-value pairs of different machine learning models that share a key are merged and saved as a single key corresponding to multiple values, for example key k1 corresponding to values v11, v21, …, vn1.
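The merging scheme of Fig. 2 can be sketched as follows; the plain dictionary is a simplified stand-in for the parameter server's store and assumes nothing about its real interface.

```python
def merge_model_params(models):
    # Save the n sets of key-value pairs as single keys that each map to
    # a list of n values, one per model, instead of n separate stores.
    merged = {}
    for i, params in enumerate(models):
        for key, value in params.items():
            merged.setdefault(key, [None] * len(models))[i] = value
    return merged

model_1 = {"k1": "v11", "k2": "v12"}   # model of the 1st candidate group
model_2 = {"k1": "v21", "k2": "v22"}   # model of the 2nd candidate group
merged = merge_model_params([model_1, model_2])
assert merged["k1"] == ["v11", "v21"]  # key k1 now maps to all its values
```

Because every model shares the same key set, the keys k1, …, km are stored once rather than n times, which is the storage saving the passage describes.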
As another example, the machine learning models corresponding to the groups of candidate algorithm parameter values may be trained in parallel by multiple computing devices, and during such parallel training the parameters of the models may be maintained by a parameter server. The parameter server may have a distributed architecture and may include at least one server end and multiple clients, where the clients correspond one-to-one with the computing devices and each corresponding client and computing device are integrated into one unit. The at least one server end is used to save the parameters of the machine learning model corresponding to each group of candidate algorithm parameter values; each client is used to exchange, with one or more server ends, parameter-operation instructions concerning the parameters involved in the machine learning algorithm under at least one group of candidate algorithm parameter values (that is, the parameters of the machine learning model corresponding to that at least one group). The computing device corresponding to each client is configured to train machine learning models according to the machine learning algorithm under the at least one group of candidate algorithm parameter values. In the parameter-operation instructions, identical keys are compressed and/or merged. According to an exemplary embodiment of the present invention, this effectively reduces the network overhead of passing parameter-operation instructions between clients and servers.
As an example, each client may receive, from its corresponding computing device, parameter-operation requests concerning the parameters involved in the machine learning algorithm under the at least one group of candidate algorithm parameter values, generate, for the one or more server ends that save those parameters, parameter-operation instructions corresponding to the requests, and transmit the generated instructions to the one or more server ends respectively. Further, as an example, each client may receive from the one or more server ends the parameter-operation instructions corresponding to the results of the parameter operations, generate from the received instructions the parameter-operation result corresponding to each of the requests, and send the generated results to the corresponding computing device. For example, a parameter-operation request may include a pull operation request and/or a push operation request.
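A minimal sketch of how one server end might handle the pull and push operations named above, under the assumption that parameters are plain key-value entries; the class and method names are invented for illustration.

```python
class ServerEnd:
    # One server end of the parameter server, holding a shard of the
    # model parameters as key-value entries.
    def __init__(self):
        self.params = {}

    def handle(self, op, key, value=None):
        if op == "push":   # client pushes an updated parameter value
            self.params[key] = value
            return None
        if op == "pull":   # client pulls the current parameter value
            return self.params.get(key)
        raise ValueError("unknown operation: " + op)

shard = ServerEnd()
shard.handle("push", "k1", 0.5)
assert shard.handle("pull", "k1") == 0.5
```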
According to an exemplary embodiment of the present invention, in the course of training at least one machine learning model, each computing device may request its corresponding client to obtain and/or update the parameters of the at least one machine learning model, where the parameters are stored in a distributed manner across one or more server ends. Therefore, after receiving any parameter-operation request, a client may split it into the request parts corresponding to each server end and buffer the parts in corresponding queues. For example, a corresponding queue may be set up for each server end. As an example, the parameter-operation request parts on which a client bases each generated parameter-operation instruction may be the parts buffered in the queue, i.e., the at least one request part directed at the corresponding server end that was received from the corresponding computing device after the client last generated a parameter-operation instruction and before it generates the present one. Since the parameter-operation instruction for each server end is generated from its queue, and each queue may buffer request parts related to at least one machine learning model, the generated instruction may be based on request parts of the same or different types, and those requests may be directed at the same or different machine learning models.
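The per-server-end queuing described above can be sketched as follows; the CRC-based key-to-server-end mapping is an assumption made for the example, since the patent does not fix how keys are partitioned.

```python
import zlib
from collections import defaultdict

def route_requests(requests, num_server_ends):
    # Split a batch of parameter-operation requests into one queue per
    # server end, according to which server end owns each key.
    queues = defaultdict(list)
    for op, key in requests:
        owner = zlib.crc32(key.encode()) % num_server_ends  # assumed mapping
        queues[owner].append((op, key))
    return queues

reqs = [("pull", "k1"), ("push", "k2"), ("pull", "k3")]
queues = route_requests(reqs, num_server_ends=2)
assert sum(len(q) for q in queues.values()) == len(reqs)  # nothing lost
```

Each queue can then be flushed as one batched instruction per server end, which is where identical keys get the compression and merging mentioned earlier.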
As another example, the same data-streaming computation concerning machine learning model training may be performed under each group of candidate algorithm parameter values according to the machine learning algorithm, where the data-streaming computations are performed by merging the identical processing steps among them, thereby reducing the actual amount of computation and of reads and writes and improving performance.
As an example, the identical processing steps among the data-streaming computations are merged starting from upstream; that is, the common upstream processing steps of the computations are merged.
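One way to realize this merging, sketched under the assumption that each streaming computation is a chain of processing steps, is to cache every distinct upstream prefix so that a step shared by several chains executes only once:

```python
def run_pipelines(pipelines, data):
    # Run several streaming pipelines over the same input, executing each
    # distinct upstream prefix of processing steps only once.
    cache = {}
    executed = []

    def apply(prefix, value):
        if not prefix:
            return value
        if prefix not in cache:
            upstream = apply(prefix[:-1], value)
            executed.append(prefix[-1])
            cache[prefix] = prefix[-1](upstream)
        return cache[prefix]

    return [apply(tuple(p), data) for p in pipelines], len(executed)

double = lambda x: x * 2
inc = lambda x: x + 1
square = lambda x: x * x
results, steps_run = run_pipelines([[double, inc], [double, square]], 3)
assert results == [7, 36]
assert steps_run == 3   # the shared `double` step ran once, not twice
```

Here the two pipelines share the upstream step `double`, so four logical steps cost only three executions, which is the reduction in computation and read-write volume the passage claims.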
Returning to Fig. 1, in step S60 the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values is evaluated. It should be understood that the quality of the effect of the model corresponding to each group of candidate algorithm parameter values can reflect the quality of that group.
As an example, the effect of the machine learning model corresponding to each group of candidate algorithm parameter values may be evaluated according to the model's evaluation value on an evaluation index. Here, the evaluation index may be one specified by an evaluation-index configuration item that the user sets through the graphical interface, or a preset evaluation index.
As an example, the evaluation index may be any model evaluation index that measures an effect of a machine learning model. For example, it may be AUC (Area Under the ROC (Receiver Operating Characteristic) Curve), MAE (Mean Absolute Error), the logarithmic loss function (logloss), or the like.
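Plain-Python sketches of the three evaluation indices named above; the AUC implementation uses the rank-statistic form and, for brevity, does not specially handle tied scores.

```python
import math

def auc(labels, scores):
    # Area under the ROC curve via the Mann-Whitney rank statistic
    # (tied scores are not specially handled in this sketch).
    pairs = sorted(zip(scores, labels))
    pos = sum(labels)
    neg = len(labels) - pos
    rank_sum = sum(i + 1 for i, (_, y) in enumerate(pairs) if y == 1)
    return (rank_sum - pos * (pos + 1) / 2) / (pos * neg)

def mae(y_true, y_pred):
    # Mean Absolute Error.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def logloss(labels, probs, eps=1e-15):
    # Logarithmic loss, with probabilities clipped away from 0 and 1.
    return -sum(y * math.log(max(p, eps)) +
                (1 - y) * math.log(max(1 - p, eps))
                for y, p in zip(labels, probs)) / len(labels)

assert auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]) == 1.0  # perfect ranking
assert mae([1.0, 2.0], [1.0, 3.0]) == 0.5
assert logloss([1, 0], [1.0, 0.0]) < 1e-12
```

Higher is better for AUC, while lower is better for MAE and logloss, so the tuning system must know the direction of each index when ranking candidate groups.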
As an example, after step S60, the method for algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may further include: directly setting the values of the algorithm parameters to be tuned of the machine learning algorithm to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect, and applying the set values in a subsequent machine-learning-model training step. Here, that group of candidate algorithm parameter values is the optimal group of algorithm parameter values obtained through automatic algorithm parameter tuning.
As an example, after step S60, the method according to an exemplary embodiment of the present invention may further include: saving the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect in the form of a configuration file, so that it can be called directly, per the user's needs, in a subsequently executed machine-learning-model training step or in another machine learning process.
Fig. 3 shows a flowchart of a method for algorithm parameter tuning for a machine learning algorithm according to another exemplary embodiment of the present invention. As shown in Fig. 3, in addition to steps S10, S20, S30, S40, S50, and S60 shown in Fig. 1, the method may further include step S70. Steps S10 to S60 can be implemented with reference to the embodiment described in connection with Fig. 1 and are not repeated here.
In step S70, the generated groups of candidate algorithm parameter values and the effects of the trained machine learning models corresponding to them are displayed to the user. Here, they may be displayed in any effective form.
As an example, the method according to the other exemplary embodiment may further include: setting the values of the algorithm parameters to be tuned of the machine learning algorithm to a group of candidate algorithm parameter values that the user selects from the displayed groups, and applying the set values in a subsequent machine-learning-model training step.
As another example, the method may further include: saving, in the form of a configuration file, the group of candidate algorithm parameter values that the user selects from the displayed groups, so that it can be called directly, per the user's needs, in a subsequently executed machine-learning-model training step or in another machine learning process.
As an example, the method may further include: setting the values of the algorithm parameters to be tuned of the machine learning algorithm to a group of candidate algorithm parameter values that the user selects from the displayed groups, applying the set values in a subsequent machine-learning-model training step, and saving the selected group in the form of a configuration file.
Examples of tuning configuration items set by the user through a graphical interface according to exemplary embodiments of the present invention are described below with reference to Fig. 4 and Fig. 5, which show examples of the graphical interface for setting the tuning configuration items of a machine learning algorithm. It should be understood that the specific interaction details of setting each tuning configuration item are not limited to the examples shown in Fig. 4 and Fig. 5.
As shown in Fig. 4 and Fig. 5, the graphical interface for setting the tuning configuration items may display content options and/or content input boxes corresponding, respectively, to the initial-value configuration item, the value-range configuration item, and the tuning-method configuration item. Specifically, the tuning-method configuration item may be set according to the user's selection in a drop-down menu, so that the content selected by the user is designated as the tuning method. For example, as shown in Fig. 4, the user selects the tuning-method option "Random Search", so that "Random Search" is designated as the tuning method; a content input box for setting the number of tuning trials may also be displayed to the user, and the number of groups of candidate algorithm parameter values generated with "Random Search" may be specified according to the user's edit operation on that input box (for example, entering the number "6" as shown in Fig. 4 to specify that 6 groups of candidate algorithm parameter values be generated).
The graphical interface may also display the initial-value configuration item and/or value-range configuration item corresponding to each algorithm parameter to be tuned of the machine learning algorithm. As shown in Fig. 4, here the machine learning algorithm used to train the machine learning model is the FTRL optimization algorithm, whose parameters to be tuned may include: the maximum number of training epochs, the learning rate, the L1 regularization coefficient, and the L2 regularization coefficient. Correspondingly, the graphical interface may display a content input box for setting the initial value of each of these parameters, so that the initial-value configuration item is set according to the user's edit operations on the input boxes (for example, entering the numbers "4", "0.5", "0", "0" in the corresponding boxes as shown in Fig. 4).
According to the user's input operation of selecting the "parameter range setting" option, a graphical interface for setting the value-range configuration item may pop up. As shown in Fig. 5, the pop-up interface may display the value-range configuration item corresponding to each algorithm parameter to be tuned. For each such parameter, the user may select the "specified range" option or the "enumerated values" option. If the user selects "specified range", content input boxes for setting a sampling-range configuration item and a sampling-count configuration item may be displayed, and the value range of the corresponding parameter is set according to the user's input in those boxes. If the user selects "enumerated values", a content input box for setting the specific values may be displayed, and the value range of the corresponding parameter is set according to the user's input in that box. For example, as shown in Fig. 5, for the parameter "maximum number of training epochs", the user's edit operations on the "sampling range" input box (entering "1" and "10") and on the "sampling count" input box (entering "1") specify the value range of this parameter as 1 value sampled from the numeric range 1-10. For the parameter "learning rate", the user's edit operation on the "enumerated values" input box (entering "2", "4", "8") specifies the value range of the learning rate as the values "2", "4", "8".
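The two value-range modes just described can be sketched as follows; the `spec` dictionary layout is an illustrative assumption, not the actual interface of Fig. 5.

```python
import random

def candidate_values(spec, seed=0):
    # "specified range": sample `count` values from [low, high];
    # "enumerated values": use the listed values as-is.
    if spec["mode"] == "range":
        rng = random.Random(seed)
        return [rng.uniform(spec["low"], spec["high"])
                for _ in range(spec["count"])]
    return list(spec["values"])

# Maximum number of training epochs: 1 value sampled from the range 1-10.
epochs = candidate_values({"mode": "range", "low": 1, "high": 10, "count": 1})
assert len(epochs) == 1 and 1 <= epochs[0] <= 10
# Learning rate: the enumerated values 2, 4, 8.
assert candidate_values({"mode": "enum", "values": [2, 4, 8]}) == [2, 4, 8]
```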
An example of displaying to the user the generated groups of candidate algorithm parameter values and the effects of the trained machine learning models corresponding to them, according to an exemplary embodiment of the present invention, is described below with reference to Fig. 6. In the example of Fig. 6, they are displayed in the form of an algorithm parameter tuning analysis report.
As shown in Fig. 6, the analysis report displays the 6 generated groups of candidate algorithm parameter values and the effect of the trained machine learning model corresponding to each group (namely, the evaluation value on the evaluation index "AUC"); the 6 groups may be ranked according to the quality of the effect of the corresponding models. In addition, the analysis report may also show that the tuning method used for automatic algorithm parameter tuning is "Random Search", that the number of tuning trials is "6", and that the initial values of the parameters to be tuned (maximum number of training epochs, learning rate, L1 regularization coefficient, L2 regularization coefficient) are "4", "0.5", "0", "0", respectively.
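A sketch of ranking the candidate groups by their evaluation value for such a report; the record layout and AUC figures below are assumed for illustration only.

```python
def tuning_report(results):
    # Rank the candidate parameter groups by the evaluation value (here
    # AUC) of the model each group produced, best first.
    return sorted(results, key=lambda r: r["auc"], reverse=True)

results = [{"params": {"learning_rate": 2}, "auc": 0.71},
           {"params": {"learning_rate": 4}, "auc": 0.78},
           {"params": {"learning_rate": 8}, "auc": 0.65}]
report = tuning_report(results)
assert report[0]["params"] == {"learning_rate": 4}  # best group first
```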
Further, as an example, the user may select a group of candidate algorithm parameter values from the algorithm parameter tuning analysis report shown in Fig. 6, to be applied in a subsequent machine learning step and/or saved in the form of a configuration file.
According to an exemplary embodiment of the present invention, a machine learning flow may be executed in the form of a directed acyclic graph (DAG), and the flow may cover all or some of the steps of machine learning model training, testing, or prediction. For example, for automatic algorithm parameter tuning, a DAG may be established that includes a historical-data import step, a data splitting step, a feature extraction step, and an automatic tuning step. That is, each of the above steps may be executed as a node in the DAG.
Fig. 7 shows an example of a DAG for algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention.
Referring to Fig. 7, the first step is to establish a data import node. For example, as shown in Fig. 7, the data import node may be configured in response to a user operation so that the banking business data table named "bank_jin", which may contain multiple historical data records, is imported into the machine learning platform.
The second step is to establish a data splitting node and connect the data import node to it, so that the imported data table is split into a training set and a validation set, where the data records in the training set are converted into machine learning samples to learn the model, and the data records in the validation set are converted into test samples to verify the effect of the learned model. The data splitting node may be configured, in response to a user operation, to split the imported data table into the training set and validation set in a set manner.
The third step is to establish two feature extraction nodes and connect the data splitting node to both, so that feature extraction is performed separately on the training set and the validation set output by the data splitting node; for example, by default the left-side output of the data splitting node is the training set and the right-side output is the validation set. Feature extraction may be performed on the training set and validation set based on the feature configuration set by the user in the feature extraction nodes or on code written by the user. It should be understood that the feature extraction performed for the machine learning samples and for the test samples must correspond. The user may directly apply the feature extraction configured in the left feature extraction node to the right feature extraction node, or the platform may set the two to synchronize automatically.
The fourth step is to establish an automatic tuning node and connect the two feature extraction nodes to it. The automatic tuning node may be configured in response to a user operation; for example, when the user's input operation of clicking the "automatic tuning" node is received, the graphical interface for setting the tuning configuration items shown in Fig. 4 and Fig. 5 may be provided to the user, so that the user can set the tuning configuration items through it.
After the DAG including the above steps is established, the entire DAG can be run according to the user's instruction. While it runs, the machine learning platform may automatically generate multiple groups of candidate algorithm parameter values according to the configuration items set by the user; train, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm; and evaluate the effect of each trained model.
In addition, as an example, a model training node may also be established after the automatic tuning node and connected to it, so that the values of the algorithm parameters to be tuned of the machine learning algorithm used by the model training node are directly set to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect. Correspondingly, the model training node may be configured, in response to a user operation, to train the model in a set manner.
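The node-based flow of Fig. 7 can be sketched as a minimal DAG runner; the node functions below are placeholders for the platform's actual operators, and the auto-tune result is a dummy value.

```python
class DAG:
    # Minimal DAG runner: each node is a function of its parents' outputs,
    # and each node runs at most once per execution of the graph.
    def __init__(self):
        self.nodes = {}   # name -> (fn, parent names)

    def add(self, name, fn, parents=()):
        self.nodes[name] = (fn, list(parents))

    def run(self, name, done=None):
        done = {} if done is None else done
        if name not in done:
            fn, parents = self.nodes[name]
            done[name] = fn(*[self.run(p, done) for p in parents])
        return done[name]

dag = DAG()
dag.add("import", lambda: list(range(10)))                    # data import node
dag.add("split", lambda d: (d[:7], d[7:]), ["import"])        # data splitting node
dag.add("train_features", lambda s: s[0], ["split"])          # left feature node
dag.add("valid_features", lambda s: s[1], ["split"])          # right feature node
dag.add("auto_tune", lambda tr, va: {"best_lr": 4},           # automatic tuning node
        ["train_features", "valid_features"])
assert dag.run("auto_tune") == {"best_lr": 4}
```

A model training node would simply be one more node whose parent is "auto_tune", consuming the best group of candidate algorithm parameter values it returns.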
Fig. 8 shows a block diagram of a system for algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention. As shown in Fig. 8, the system includes: an algorithm determining device 10, a display device 20, a configuration item acquisition device 30, an algorithm parameter value generation device 40, at least one computing device 50, and an evaluating device 60.
Specifically, the algorithm determining device 10 is used to determine the machine learning algorithm used to train a machine learning model.
The display device 20 is used to provide the user with a graphical interface for setting the tuning configuration items of the machine learning algorithm, where the tuning configuration items are used to define how to generate multiple groups of candidate algorithm parameter values, and each group includes one candidate value of each algorithm parameter to be tuned of the machine learning algorithm.
The configuration item acquisition device 30 is used to receive the input operations that the user performs on the graphical interface in order to set the tuning configuration items, and to obtain the tuning configuration items set by the user according to those input operations.
The algorithm parameter value generation device 40 is used to generate the multiple groups of candidate algorithm parameter values based on the obtained tuning configuration items.
As an example, the tuning configuration items may include at least one of the following: an initial-value configuration item, for specifying the initial value of an algorithm parameter to be tuned, so that the algorithm parameter value generation device 40 generates at least one candidate value of the parameter based on the specified initial value; a value-range configuration item, for specifying the value range of an algorithm parameter to be tuned, so that the algorithm parameter value generation device 40 generates at least one candidate value of the parameter based on the specified value range; and a tuning-method configuration item, for specifying the method of generating the multiple groups of candidate algorithm parameter values, so that the algorithm parameter value generation device 40 generates the multiple groups according to the specified method, based on at least one candidate value of each algorithm parameter to be tuned.
The at least one computing device 50 is used to train, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm.
As an example, the at least one computing device 50 may train the machine learning models corresponding to the groups of candidate algorithm parameter values in parallel, in which case the parameters of those models may be maintained by a parameter server, where the parameters have key-value form and the parameter server saves multiple key-value pairs with the same key in a form in which a single key corresponds to multiple values.
As another example, the system for algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may include multiple computing devices 50, which may train the machine learning models corresponding to the groups of candidate algorithm parameter values in parallel. During such parallel training, the parameters of the models may be maintained by a parameter server that includes at least one server end and multiple clients, where the clients correspond one-to-one with the computing devices 50 and each corresponding client and computing device 50 are integrated into one unit. The at least one server end is used to save the parameters of the machine learning model corresponding to each group of candidate algorithm parameter values; each client is used to exchange, with one or more server ends, parameter-operation instructions concerning the parameters involved in the machine learning algorithm under at least one group of candidate algorithm parameter values. The computing device 50 corresponding to each client is configured to train machine learning models according to the machine learning algorithm under the at least one group of candidate algorithm parameter values. In the parameter-operation instructions, identical keys are compressed and/or merged.
As another example, the at least one computing device 50 may, under each group of candidate algorithm parameter values, perform the same data-streaming computation concerning machine learning model training according to the machine learning algorithm, where the computations are performed by merging the identical processing steps among them.
The evaluating device 60 is used to evaluate the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
As an example, the display device 20 may also display to the user the generated groups of candidate algorithm parameter values and the effects of the trained machine learning models corresponding to them.
As an example, the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may further include an application apparatus (not shown). The application apparatus is used to directly set the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect, and to apply the set algorithm parameter values in subsequent steps of training the machine learning model; or it is used to set the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to a group of candidate algorithm parameter values selected by the user from the displayed groups of candidate algorithm parameter values, and to apply the set algorithm parameter values in subsequent steps of training the machine learning model.
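Selecting and applying the best-performing group of candidate algorithm parameter values can be illustrated by a short sketch. The function names and the use of a score as the "effect" are assumptions, not details given by the patent:

```python
def apply_best_parameters(candidate_groups, effects, train_fn):
    """Pick the candidate group whose trained model had the best effect
    (here: the highest evaluation score) and use it for subsequent
    training by passing its values as the algorithm parameters."""
    best_group = max(zip(candidate_groups, effects), key=lambda ge: ge[1])[0]
    return train_fn(**best_group)

candidates = [{"learning_rate": 0.1, "depth": 3},
              {"learning_rate": 0.01, "depth": 5}]
effects = [0.81, 0.87]

# A trivial stand-in for the real training routine: it just echoes the
# parameters it was configured with.
model = apply_best_parameters(candidates, effects, train_fn=lambda **p: p)
print(model)  # {'learning_rate': 0.01, 'depth': 5}
```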
As an example, the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may further include a saving apparatus (not shown). The saving apparatus is used to save, in the form of a configuration file, the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect; or it is used to save, in the form of a configuration file, a group of candidate algorithm parameter values selected by the user from the displayed groups of candidate algorithm parameter values.
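Saving a group of candidate algorithm parameter values as a configuration file can be sketched minimally. The patent does not prescribe a file format; JSON and the file name below are assumptions for illustration:

```python
import json
import os
import tempfile

def save_candidate_group(group, path):
    """Persist one group of candidate algorithm parameter values as a
    configuration file so it can be reloaded for later training runs."""
    with open(path, "w") as f:
        json.dump(group, f, indent=2, sort_keys=True)

def load_candidate_group(path):
    """Reload a previously saved group of parameter values."""
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.mkdtemp(), "best_params.json")
save_candidate_group({"learning_rate": 0.01, "depth": 5}, path)
print(load_candidate_group(path))
```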
It should be understood that specific implementations of the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may be realized with reference to the related implementations described in conjunction with Fig. 1 to Fig. 7, and are not repeated here.
The apparatuses included in the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may be individually configured as software, hardware, firmware, or any combination thereof for performing specific functions. For example, these apparatuses may correspond to dedicated integrated circuits, to pure software code, or to modules combining software and hardware. In addition, one or more of the functions realized by these apparatuses may be uniformly performed by components in a physical entity device (for example, a processor, a client, a server, or the like).
It should be understood that the method for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may be realized by a program recorded on a computer-readable medium. For example, according to an exemplary embodiment of the present invention, there may be provided a computer-readable medium for performing algorithm parameter tuning for a machine learning algorithm, wherein a computer program for executing the following method steps is recorded on the computer-readable medium: (A) determining a machine learning algorithm for training a machine learning model; (B) providing a user with a graphical interface for setting tuning configuration items of the machine learning algorithm, wherein the tuning configuration items are used to define how to generate a plurality of groups of candidate algorithm parameter values, and wherein each group of candidate algorithm parameter values includes one candidate algorithm parameter value for each to-be-tuned algorithm parameter of the machine learning algorithm; (C) receiving input operations performed by the user on the graphical interface to set the tuning configuration items, and obtaining the tuning configuration items set by the user according to the input operations; (D) generating the plurality of groups of candidate algorithm parameter values based on the obtained tuning configuration items; (E) training, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm; and (F) evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
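Steps (D) through (F) above can be sketched end to end. This is an illustrative reduction, not the patent's implementation: the tuning configuration is simplified to per-parameter candidate value lists, expanded by a grid-search-style product, and "effect" is an arbitrary score; `tune` and the toy `train`/`evaluate` callables are assumptions:

```python
from itertools import product

def generate_candidate_groups(tuning_config):
    """Step (D): expand per-parameter candidate value lists into groups
    of candidate algorithm parameter values via a Cartesian product."""
    names = sorted(tuning_config)
    return [dict(zip(names, values))
            for values in product(*(tuning_config[n] for n in names))]

def tune(tuning_config, train, evaluate):
    """Steps (D)-(F): generate candidate groups, train one model per
    group, evaluate each model, and return (group, effect) pairs."""
    results = []
    for group in generate_candidate_groups(tuning_config):
        model = train(group)                       # step (E)
        results.append((group, evaluate(model)))   # step (F)
    return results

# Toy stand-ins for real training and evaluation under each group.
config = {"learning_rate": [0.1, 0.01], "depth": [3, 5]}
results = tune(config,
               train=lambda g: g,
               evaluate=lambda m: m["depth"] - m["learning_rate"])
best_group, best_effect = max(results, key=lambda r: r[1])
print(best_group)    # {'depth': 5, 'learning_rate': 0.01}
print(len(results))  # 4
```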
The computer program on the above computer-readable medium may run in an environment deployed on computer devices such as clients, hosts, agent apparatuses, or servers. It should be noted that the computer program may also be used to perform additional steps beyond the above steps, or to perform more specific processing when the above steps are performed; the content of such additional steps and further processing has been described with reference to Fig. 1 to Fig. 7 and is not repeated here in order to avoid redundancy.
It should be noted that the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may rely entirely on the running of a computer program to realize the corresponding functions; that is, each apparatus corresponds to a step in the functional structure of the computer program, so that the whole system is invoked through a dedicated software package (for example, a lib library) to realize the corresponding functions.
On the other hand, each apparatus included in the system for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may also be realized by hardware, software, firmware, middleware, microcode, or any combination thereof. When realized in software, firmware, middleware, or microcode, the program code or code segments for performing the corresponding operations may be stored in a computer-readable medium such as a storage medium, so that a processor can perform the corresponding operations by reading and running the corresponding program code or code segments.
For example, an exemplary embodiment of the present invention may also be implemented as a computer including a storage component and a processor, wherein a set of computer-executable instructions is stored in the storage component, and when the set of computer-executable instructions is executed by the processor, the method for performing algorithm parameter tuning for a machine learning algorithm is performed.
In particular, the computer may be deployed in a server or a client, or on a node apparatus in a distributed network environment. In addition, the computer may be a PC computer, a tablet device, a personal digital assistant, a smartphone, a web application, or any other device capable of executing the above set of instructions.
Here, the computer need not be a single device, and may be any aggregate of devices or circuits capable of executing the above instructions (or instruction set), individually or in combination. The computer may also be part of an integrated control system or system manager, or may be configured as a portable electronic device interconnected, locally or remotely (for example, via wireless transmission), through an interface.
In the computer, the processor may include a central processing unit (CPU), a graphics processing unit (GPU), an FPGA device, a dedicated processor system, a microcontroller, or a microprocessor. By way of example and not limitation, the processor may also include an analog processor, a digital processor, a microprocessor, a multi-core processor, a processor array, a network processor, and the like.
Some of the operations described in the method for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may be realized in software, some may be realized in hardware, and these operations may also be realized by a combination of software and hardware.
The processor may run instructions or code stored in one of the storage components, wherein the storage component may also store data. Instructions and data may also be sent and received over a network via a network interface device, which may employ any known transmission protocol.
The storage component may be integrated with the processor, for example, with RAM or flash memory arranged within an integrated circuit microprocessor or the like. In addition, the storage component may include an independent device, such as an external disk drive, a storage array, or any other storage device usable by a database system. The storage component and the processor may be operatively coupled, or may communicate with each other, for example, through an I/O port or a network connection, so that the processor can read files stored in the storage component.
In addition, the computer may further include a video display (such as a liquid crystal display) and a user interaction interface (such as a keyboard, a mouse, a touch input device, or the like). All components of the computer may be connected to each other via a bus and/or a network.
The operations involved in the method for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may be described as various interconnected or coupled functional blocks or functional diagrams. However, these functional blocks or functional diagrams may equally be integrated into a single logic device or operated according to imprecise boundaries.
For example, as described above, the computer for performing algorithm parameter tuning for a machine learning algorithm according to an exemplary embodiment of the present invention may include a storage component and a processor, wherein a set of computer-executable instructions is stored in the storage component, and when the set of computer-executable instructions is executed by the processor, the following steps are performed: (A) determining a machine learning algorithm for training a machine learning model; (B) providing a user with a graphical interface for setting tuning configuration items of the machine learning algorithm, wherein the tuning configuration items are used to define how to generate a plurality of groups of candidate algorithm parameter values, and wherein each group of candidate algorithm parameter values includes one candidate algorithm parameter value for each to-be-tuned algorithm parameter of the machine learning algorithm; (C) receiving input operations performed by the user on the graphical interface to set the tuning configuration items, and obtaining the tuning configuration items set by the user according to the input operations; (D) generating the plurality of groups of candidate algorithm parameter values based on the obtained tuning configuration items; (E) training, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm; and (F) evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
Each exemplary embodiment of the present invention has been described above. It should be understood that the foregoing description is merely exemplary and not exhaustive, and the present invention is not limited to the disclosed exemplary embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. Therefore, the protection scope of the present invention should be defined by the scope of the claims.
Claims (10)
1. A method for performing algorithm parameter tuning for a machine learning algorithm, comprising:
(A) determining a machine learning algorithm for training a machine learning model;
(B) providing a user with a graphical interface for setting tuning configuration items of the machine learning algorithm, wherein the tuning configuration items are used to define how to generate a plurality of groups of candidate algorithm parameter values, and wherein each group of candidate algorithm parameter values includes one candidate algorithm parameter value for each to-be-tuned algorithm parameter of the machine learning algorithm;
(C) receiving input operations performed by the user on the graphical interface to set the tuning configuration items, and obtaining the tuning configuration items set by the user according to the input operations;
(D) generating the plurality of groups of candidate algorithm parameter values based on the obtained tuning configuration items;
(E) training, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm;
(F) evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
2. The method according to claim 1, further comprising:
(G) displaying, to the user, the generated groups of candidate algorithm parameter values and the effects of the trained machine learning models corresponding to the respective groups.
3. The method according to claim 1, further comprising:
(H) directly setting the algorithm parameter values of the to-be-tuned algorithm parameters of the machine learning algorithm to the group of candidate algorithm parameter values corresponding to the machine learning model with the best effect, and applying the set algorithm parameter values in subsequent steps of training the machine learning model.
4. The method according to claim 1, wherein the tuning configuration items include at least one of the following: an initial value configuration item, for specifying initial values of the to-be-tuned algorithm parameters, so that in step (D) at least one candidate algorithm parameter value of each to-be-tuned algorithm parameter is generated based on the specified initial value; a value range configuration item, for specifying value ranges of the to-be-tuned algorithm parameters, so that in step (D) at least one candidate algorithm parameter value of each to-be-tuned algorithm parameter is generated based on the specified value range; and a tuning method configuration item, for specifying a method of generating the plurality of groups of candidate algorithm parameter values, so that in step (D) the plurality of groups of candidate algorithm parameter values are generated according to the specified method based on the at least one candidate algorithm parameter value of each to-be-tuned algorithm parameter.
5. The method according to claim 1, wherein, in step (E), the machine learning models corresponding to the respective groups of candidate algorithm parameter values are trained in parallel,
wherein, when the machine learning models corresponding to the respective groups of candidate algorithm parameter values are trained in parallel, the parameters of the machine learning model corresponding to each group of candidate algorithm parameter values are maintained by a parameter server, wherein the parameters have the form of key-value pairs, and the parameter server stores a plurality of key-value pairs sharing the same key in the form of a single key corresponding to a plurality of values.
6. The method according to claim 1, wherein, in step (E), the machine learning models corresponding to the respective groups of candidate algorithm parameter values are trained in parallel by a plurality of computing devices,
wherein, when the machine learning models corresponding to the respective groups of candidate algorithm parameter values are trained in parallel, the parameters of the machine learning model corresponding to each group of candidate algorithm parameter values are maintained by a parameter server, wherein the parameter server includes at least one server end and a plurality of clients, the clients correspond one-to-one with the computing devices, and each client is integrated with its corresponding computing device, wherein the at least one server end is used to store the parameters of the machine learning models corresponding to the respective groups of candidate algorithm parameter values; each client is used to transmit, between itself and one or more server ends, parameter operation instructions concerning the parameters involved in the machine learning algorithm under at least one group of candidate algorithm parameter values, wherein the computing device corresponding to each client is configured to train the machine learning model according to the machine learning algorithm under the at least one group of candidate algorithm parameter values, and wherein identical keys in the parameter operation instructions are compressed and/or merged.
7. The method according to claim 1, wherein, in step (E), under each group of candidate algorithm parameter values, the same data-flow computation concerning the training of the machine learning model is performed according to the machine learning algorithm,
wherein the data-flow computations are performed by merging identical processing steps among the individual data-flow computations.
8. A system for performing algorithm parameter tuning for a machine learning algorithm, comprising:
an algorithm determining apparatus, for determining a machine learning algorithm for training a machine learning model;
a display apparatus, for providing a user with a graphical interface for setting tuning configuration items of the machine learning algorithm, wherein the tuning configuration items are used to define how to generate a plurality of groups of candidate algorithm parameter values, and wherein each group of candidate algorithm parameter values includes one candidate algorithm parameter value for each to-be-tuned algorithm parameter of the machine learning algorithm;
a configuration item obtaining apparatus, for receiving input operations performed by the user on the graphical interface to set the tuning configuration items, and obtaining the tuning configuration items set by the user according to the input operations;
an algorithm parameter value generating apparatus, for generating the plurality of groups of candidate algorithm parameter values based on the obtained tuning configuration items;
at least one computing device, for training, under each group of candidate algorithm parameter values respectively, a machine learning model corresponding to that group according to the machine learning algorithm;
an evaluating apparatus, for evaluating the effect of the trained machine learning model corresponding to each group of candidate algorithm parameter values.
9. A computer-readable medium for performing algorithm parameter tuning for a machine learning algorithm, wherein a computer program for executing the method for performing algorithm parameter tuning for a machine learning algorithm according to any one of claims 1 to 7 is recorded on the computer-readable medium.
10. A computer for performing algorithm parameter tuning for a machine learning algorithm, including a storage component and a processor, wherein a set of computer-executable instructions is stored in the storage component, and when the set of computer-executable instructions is executed by the processor, the method for performing algorithm parameter tuning for a machine learning algorithm according to any one of claims 1 to 7 is performed.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711048805.3A CN107844837B (en) | 2017-10-31 | 2017-10-31 | Method and system for adjusting and optimizing algorithm parameters aiming at machine learning algorithm |
CN202010496368.7A CN111652380B (en) | 2017-10-31 | 2017-10-31 | Method and system for optimizing algorithm parameters aiming at machine learning algorithm |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711048805.3A CN107844837B (en) | 2017-10-31 | 2017-10-31 | Method and system for adjusting and optimizing algorithm parameters aiming at machine learning algorithm |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010496368.7A Division CN111652380B (en) | 2017-10-31 | 2017-10-31 | Method and system for optimizing algorithm parameters aiming at machine learning algorithm |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107844837A true CN107844837A (en) | 2018-03-27 |
CN107844837B CN107844837B (en) | 2020-04-28 |
Family
ID=61681212
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711048805.3A Active CN107844837B (en) | 2017-10-31 | 2017-10-31 | Method and system for adjusting and optimizing algorithm parameters aiming at machine learning algorithm |
CN202010496368.7A Active CN111652380B (en) | 2017-10-31 | 2017-10-31 | Method and system for optimizing algorithm parameters aiming at machine learning algorithm |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010496368.7A Active CN111652380B (en) | 2017-10-31 | 2017-10-31 | Method and system for optimizing algorithm parameters aiming at machine learning algorithm |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN107844837B (en) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108509727A (en) * | 2018-03-30 | 2018-09-07 | 深圳市智物联网络有限公司 | Model in data modeling selects processing method and processing device |
CN108681487A (en) * | 2018-05-21 | 2018-10-19 | 千寻位置网络有限公司 | The distributed system and tuning method of sensing algorithm arameter optimization |
CN108710949A (en) * | 2018-04-26 | 2018-10-26 | 第四范式(北京)技术有限公司 | The method and system of template are modeled for creating machine learning |
CN109144648A (en) * | 2018-08-21 | 2019-01-04 | 第四范式(北京)技术有限公司 | Uniformly execute the method and system of feature extraction |
CN109284828A (en) * | 2018-09-06 | 2019-01-29 | 沈文策 | A kind of hyper parameter tuning method, device and equipment |
CN109447277A (en) * | 2018-10-19 | 2019-03-08 | 厦门渊亭信息科技有限公司 | A kind of general machine learning is super to join black box optimization method and system |
CN109828836A (en) * | 2019-01-20 | 2019-05-31 | 北京工业大学 | A kind of batch streaming computing system dynamic state of parameters configuration method |
CN110414689A (en) * | 2019-08-06 | 2019-11-05 | 中国工商银行股份有限公司 | Update method and device on a kind of machine learning model line |
CN110647998A (en) * | 2019-08-12 | 2020-01-03 | 北京百度网讯科技有限公司 | Method, system, device and storage medium for implementing automatic machine learning |
CN110728371A (en) * | 2019-09-17 | 2020-01-24 | 第四范式(北京)技术有限公司 | System, method and electronic device for executing automatic machine learning scheme |
CN110766164A (en) * | 2018-07-10 | 2020-02-07 | 第四范式(北京)技术有限公司 | Method and system for performing a machine learning process |
WO2020035076A1 (en) * | 2018-08-17 | 2020-02-20 | 第四范式(北京)技术有限公司 | Method and system for visualizing data processing step of machine learning process |
CN110838069A (en) * | 2019-10-15 | 2020-02-25 | 支付宝(杭州)信息技术有限公司 | Data processing method, device and system |
CN111047048A (en) * | 2019-11-22 | 2020-04-21 | 支付宝(杭州)信息技术有限公司 | Energized model training and merchant energizing method and device, and electronic equipment |
CN111178535A (en) * | 2018-11-12 | 2020-05-19 | 第四范式(北京)技术有限公司 | Method and device for realizing automatic machine learning |
CN111191795A (en) * | 2019-12-31 | 2020-05-22 | 第四范式(北京)技术有限公司 | Method, device and system for training machine learning model |
CN111242320A (en) * | 2020-01-16 | 2020-06-05 | 京东数字科技控股有限公司 | Machine learning method and device, electronic equipment and storage medium |
CN111694844A (en) * | 2020-05-28 | 2020-09-22 | 平安科技(深圳)有限公司 | Enterprise operation data analysis method and device based on configuration algorithm and electronic equipment |
CN111723939A (en) * | 2020-05-15 | 2020-09-29 | 第四范式(北京)技术有限公司 | Parameter adjusting method, device, equipment and system of machine learning model |
WO2020207268A1 (en) * | 2019-04-11 | 2020-10-15 | 腾讯科技(深圳)有限公司 | Database performance adjustment method and apparatus, device, system, and storage medium |
CN111797990A (en) * | 2019-04-08 | 2020-10-20 | 北京百度网讯科技有限公司 | Training method, training device and training system of machine learning model |
CN111831322A (en) * | 2020-04-15 | 2020-10-27 | 中国人民解放军军事科学院战争研究院 | Machine learning parameter configuration method for multi-level user |
CN112101562A (en) * | 2019-06-18 | 2020-12-18 | 第四范式(北京)技术有限公司 | Method and system for realizing machine learning modeling process |
CN112149836A (en) * | 2019-06-28 | 2020-12-29 | 杭州海康威视数字技术股份有限公司 | Machine learning program updating method, device and equipment |
WO2021169960A1 (en) * | 2020-02-27 | 2021-09-02 | 第四范式(北京)技术有限公司 | Configuration file recommendation method and apparatus, and system |
WO2021208685A1 (en) * | 2020-04-17 | 2021-10-21 | 第四范式(北京)技术有限公司 | Method and apparatus for executing automatic machine learning process, and device |
CN113886026A (en) * | 2021-12-07 | 2022-01-04 | 中国电子科技集团公司第二十八研究所 | Intelligent modeling method and system based on dynamic parameter configuration and process supervision |
CN114385256A (en) * | 2020-10-22 | 2022-04-22 | 华为云计算技术有限公司 | Method and device for configuring system parameters |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115470910A (en) * | 2022-10-20 | 2022-12-13 | 晞德软件(北京)有限公司 | Automatic parameter adjusting method based on Bayesian optimization and K-center sampling |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104200087A (en) * | 2014-06-05 | 2014-12-10 | 清华大学 | Parameter optimization and feature tuning method and system for machine learning |
CN105912500A (en) * | 2016-03-30 | 2016-08-31 | 百度在线网络技术(北京)有限公司 | Machine learning model generation method and machine learning model generation device |
CN106156810A (en) * | 2015-04-26 | 2016-11-23 | 阿里巴巴集团控股有限公司 | General-purpose machinery learning algorithm model training method, system and calculating node |
US20170147922A1 (en) * | 2015-11-23 | 2017-05-25 | Daniel Chonghwan LEE | Filtering, smoothing, memetic algorithms, and feasible direction methods for estimating system state and unknown parameters of electromechanical motion devices |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160110657A1 (en) * | 2014-10-14 | 2016-04-21 | Skytree, Inc. | Configurable Machine Learning Method Selection and Parameter Optimization System and Method |
CN106202431B (en) * | 2016-07-13 | 2019-06-28 | 华中科技大学 | A kind of Hadoop parameter automated tuning method and system based on machine learning |
CN107203809A (en) * | 2017-04-20 | 2017-09-26 | 华中科技大学 | A kind of deep learning automation parameter adjustment method and system based on Keras |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104200087A (en) * | 2014-06-05 | 2014-12-10 | 清华大学 | Parameter optimization and feature tuning method and system for machine learning |
CN106156810A (en) * | 2015-04-26 | 2016-11-23 | 阿里巴巴集团控股有限公司 | General-purpose machinery learning algorithm model training method, system and calculating node |
US20170147922A1 (en) * | 2015-11-23 | 2017-05-25 | Daniel Chonghwan LEE | Filtering, smoothing, memetic algorithms, and feasible direction methods for estimating system state and unknown parameters of electromechanical motion devices |
CN105912500A (en) * | 2016-03-30 | 2016-08-31 | 百度在线网络技术(北京)有限公司 | Machine learning model generation method and machine learning model generation device |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108509727A (en) * | 2018-03-30 | 2018-09-07 | 深圳市智物联网络有限公司 | Model in data modeling selects processing method and processing device |
CN108509727B (en) * | 2018-03-30 | 2022-04-08 | 深圳市智物联网络有限公司 | Model selection processing method and device in data modeling |
CN108710949A (en) * | 2018-04-26 | 2018-10-26 | 第四范式(北京)技术有限公司 | The method and system of template are modeled for creating machine learning |
CN108681487A (en) * | 2018-05-21 | 2018-10-19 | 千寻位置网络有限公司 | The distributed system and tuning method of sensing algorithm arameter optimization |
CN108681487B (en) * | 2018-05-21 | 2021-08-24 | 千寻位置网络有限公司 | Distributed system and method for adjusting and optimizing sensor algorithm parameters |
CN110766164A (en) * | 2018-07-10 | 2020-02-07 | 第四范式(北京)技术有限公司 | Method and system for performing a machine learning process |
WO2020035076A1 (en) * | 2018-08-17 | 2020-02-20 | 第四范式(北京)技术有限公司 | Method and system for visualizing data processing step of machine learning process |
CN109144648A (en) * | 2018-08-21 | 2019-01-04 | 第四范式(北京)技术有限公司 | Uniformly execute the method and system of feature extraction |
CN109284828A (en) * | 2018-09-06 | 2019-01-29 | 沈文策 | A kind of hyper parameter tuning method, device and equipment |
CN109447277A (en) * | 2018-10-19 | 2019-03-08 | 厦门渊亭信息科技有限公司 | A kind of general machine learning is super to join black box optimization method and system |
CN109447277B (en) * | 2018-10-19 | 2023-11-10 | 厦门渊亭信息科技有限公司 | Universal machine learning super-ginseng black box optimization method and system |
CN111178535A (en) * | 2018-11-12 | 2020-05-19 | 第四范式(北京)技术有限公司 | Method and device for realizing automatic machine learning |
CN109828836A (en) * | 2019-01-20 | 2019-05-31 | 北京工业大学 | A kind of batch streaming computing system dynamic state of parameters configuration method |
CN109828836B (en) * | 2019-01-20 | 2021-04-30 | 北京工业大学 | Parameter dynamic configuration method for batch streaming computing system |
CN111797990A (en) * | 2019-04-08 | 2020-10-20 | 北京百度网讯科技有限公司 | Training method, training device and training system of machine learning model |
WO2020207268A1 (en) * | 2019-04-11 | 2020-10-15 | 腾讯科技(深圳)有限公司 | Database performance adjustment method and apparatus, device, system, and storage medium |
CN112101562A (en) * | 2019-06-18 | 2020-12-18 | 第四范式(北京)技术有限公司 | Method and system for realizing machine learning modeling process |
CN112101562B (en) * | 2019-06-18 | 2024-01-30 | 第四范式(北京)技术有限公司 | Implementation method and system of machine learning modeling process |
CN112149836A (en) * | 2019-06-28 | 2020-12-29 | 杭州海康威视数字技术股份有限公司 | Machine learning program updating method, device and equipment |
CN110414689A (en) * | 2019-08-06 | 2019-11-05 | 中国工商银行股份有限公司 | Update method and device on a kind of machine learning model line |
CN110647998A (en) * | 2019-08-12 | 2020-01-03 | 北京百度网讯科技有限公司 | Method, system, device and storage medium for implementing automatic machine learning |
CN110647998B (en) * | 2019-08-12 | 2022-11-25 | 北京百度网讯科技有限公司 | Method, system, device and storage medium for implementing automatic machine learning |
CN110728371A (en) * | 2019-09-17 | 2020-01-24 | 第四范式(北京)技术有限公司 | System, method and electronic device for executing automatic machine learning scheme |
CN110838069A (en) * | 2019-10-15 | 2020-02-25 | 支付宝(杭州)信息技术有限公司 | Data processing method, device and system |
CN111047048A (en) * | 2019-11-22 | 2020-04-21 | 支付宝(杭州)信息技术有限公司 | Energized model training and merchant energizing method and device, and electronic equipment |
CN111047048B (en) * | 2019-11-22 | 2023-04-07 | 支付宝(杭州)信息技术有限公司 | Energized model training and merchant energizing method and device, and electronic equipment |
CN111191795A (en) * | 2019-12-31 | 2020-05-22 | 第四范式(北京)技术有限公司 | Method, device and system for training machine learning model |
CN111191795B (en) * | 2019-12-31 | 2023-10-20 | 第四范式(北京)技术有限公司 | Method, device and system for training machine learning model |
CN111242320A (en) * | 2020-01-16 | 2020-06-05 | 京东数字科技控股有限公司 | Machine learning method and device, electronic equipment and storage medium |
WO2021169960A1 (en) * | 2020-02-27 | 2021-09-02 | 第四范式(北京)技术有限公司 | Configuration file recommendation method and apparatus, and system |
CN111831322A (en) * | 2020-04-15 | 2020-10-27 | 中国人民解放军军事科学院战争研究院 | Machine learning parameter configuration method for multi-level user |
CN111831322B (en) * | 2020-04-15 | 2023-08-01 | 中国人民解放军军事科学院战争研究院 | Multi-level user-oriented machine learning parameter configuration method |
WO2021208685A1 (en) * | 2020-04-17 | 2021-10-21 | 第四范式(北京)技术有限公司 | Method and apparatus for executing automatic machine learning process, and device |
CN111723939A (en) * | 2020-05-15 | 2020-09-29 | 第四范式(北京)技术有限公司 | Parameter adjusting method, device, equipment and system of machine learning model |
WO2021238563A1 (en) * | 2020-05-28 | 2021-12-02 | 平安科技(深圳)有限公司 | Enterprise operation data analysis method and apparatus based on configuration algorithm, and electronic device and medium |
CN111694844A (en) * | 2020-05-28 | 2020-09-22 | 平安科技(深圳)有限公司 | Enterprise operation data analysis method and device based on configuration algorithm and electronic equipment |
CN114385256A (en) * | 2020-10-22 | 2022-04-22 | 华为云计算技术有限公司 | Method and device for configuring system parameters |
CN113886026A (en) * | 2021-12-07 | 2022-01-04 | 中国电子科技集团公司第二十八研究所 | Intelligent modeling method and system based on dynamic parameter configuration and process supervision |
Also Published As
Publication number | Publication date |
---|---|
CN107844837B (en) | 2020-04-28 |
CN111652380B (en) | 2023-12-22 |
CN111652380A (en) | 2020-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107844837A (en) | Method and system for algorithm parameter tuning for a machine learning algorithm | |
CN108090516A (en) | Method and system for automatically generating features of machine learning samples | |
CN104951425B (en) | Deep-learning-based adaptive action selection method for cloud service performance | |
CN107169573A (en) | Method and system for performing prediction using a composite machine learning model | |
WO2020259502A1 (en) | Method and device for generating neural network model, and computer-readable storage medium | |
CN107766946A (en) | Method and system for generating combined features of machine learning samples | |
CN110766164A (en) | Method and system for performing a machine learning process | |
CN107609652A (en) | Distributed system for performing machine learning and method thereof | |
CN107844784A (en) | Face recognition method and apparatus, computer device, and readable storage medium | |
CN107704871A (en) | Method and system for generating combined features of machine learning samples | |
CN108710949A (en) | Method and system for creating machine learning modeling templates | |
CN108008942A (en) | Method and system for processing data records | |
CN105260171B (en) | Method and apparatus for generating virtual items | |
US20220147877A1 (en) | System and method for automatic building of learning machines using learning machines | |
CN107871166A (en) | Feature processing method and feature processing system for machine learning | |
CN110956272A (en) | Method and system for implementing data processing | |
CN108921300A (en) | Method and apparatus for performing automatic machine learning | |
CN109445935A (en) | Adaptive configuration method for a high-performance big data analysis system in a cloud computing environment | |
CN107169574A (en) | Method and system for performing prediction using nested machine learning models | |
CN108108820A (en) | Method and system for selecting features of machine learning samples | |
US20220245131A1 (en) | Method and system for using stacktrace signatures for bug triaging in a microservice architecture | |
CN107273979A (en) | Method and system for performing machine learning prediction based on service class | |
CN110598065A (en) | Data mining method and device and computer readable storage medium | |
KR20200125890A (en) | Cloud-based transaction system and method capable of providing neural network training model in supervised state | |
CN109947462A (en) | Decision support method and apparatus for software code change integration | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||