Summary of the invention
In view of this, the present disclosure provides a hyperparameter tuning method, apparatus, server, client, and medium, thereby at least partially solving one or more of the problems caused by the limitations and defects of the related art.
A first aspect of the present disclosure provides a hyperparameter tuning method applied to a server. The method includes: receiving a hyperparameter tuning request; generating a hyperparameter tuning task based on the hyperparameter tuning request; configuring the hyperparameter tuning task according to an environment label of a user in the hyperparameter tuning request; determining, according to an identification code of the hyperparameter tuning task, a node for executing the hyperparameter tuning task, and running an algorithm model on the node to train the hyperparameter tuning task; storing, according to a user identity, log data generated when the hyperparameter tuning task is trained; and, in response to a user operation, sending the log data to a client according to a user permission.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a hyperparameter selection algorithm and threshold ranges set for the hyperparameters, and generating the hyperparameter tuning task based on the hyperparameter tuning request includes: computing over the threshold ranges of the hyperparameters using the hyperparameter selection algorithm to obtain a hyperparameter list; and generating, based on the hyperparameter list, a hyperparameter tuning task list corresponding to the hyperparameter list.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a run script of the algorithm model, and running the algorithm model on the node to train the hyperparameter tuning task includes: starting the run script of the algorithm model on the node, and running the algorithm model from the script to train the hyperparameter tuning task.
According to an embodiment of the present disclosure, storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained includes: storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained to a distributed file system.
According to an embodiment of the present disclosure, the hyperparameter tuning task includes a single-machine hyperparameter tuning task and/or a distributed hyperparameter tuning task.
According to an embodiment of the present disclosure, the log data includes any one or more of the following: metric data of the algorithm model, values of the hyperparameters, a variation trend of the metric data, a time consumption of training the algorithm model, and a comparison of the metric data.
A second aspect of the present disclosure provides a hyperparameter tuning method applied to a client. The method includes: sending a hyperparameter tuning request to a server, so that the server generates a hyperparameter tuning task based on the hyperparameter tuning request, configures the hyperparameter tuning task according to an environment label of a user in the hyperparameter tuning request, determines, according to an identification code of the hyperparameter tuning task, a node for executing the hyperparameter tuning task, runs an algorithm model on the node to train the hyperparameter tuning task, and stores, according to a user identity, log data generated when the hyperparameter tuning task is trained; and receiving and displaying the log data.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a hyperparameter selection algorithm and threshold ranges set for the hyperparameters.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a run script of the algorithm model.
According to an embodiment of the present disclosure, the log data includes any one or more of the following: metric data of the algorithm model, values of the hyperparameters, a variation trend of the metric data, a time consumption of training the algorithm model, and a comparison of the metric data.
A third aspect of the present disclosure provides a hyperparameter tuning apparatus applied to a server. The apparatus includes: a receiving module configured to receive a hyperparameter tuning request; a generation module configured to generate a hyperparameter tuning task based on the hyperparameter tuning request; a configuration module configured to configure the hyperparameter tuning task according to an environment label of a user in the hyperparameter tuning request; a training module configured to determine, according to an identification code of the hyperparameter tuning task, a node for executing the hyperparameter tuning task, run an algorithm model on the node to train the hyperparameter tuning task, and store, according to a user identity, log data generated when the hyperparameter tuning task is trained; and a sending module configured to, in response to a user operation, send the log data to a client according to a user permission.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a hyperparameter selection algorithm and threshold ranges set for the hyperparameters, and the generation module includes: a computing module configured to compute over the threshold ranges of the hyperparameters using the hyperparameter selection algorithm to obtain a hyperparameter list; and a generation submodule configured to generate, based on the hyperparameter list, a hyperparameter tuning task list corresponding to the hyperparameter list.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a run script of the algorithm model, and the training module is further configured to: start the run script of the algorithm model on the node, and run the algorithm model from the script to train the hyperparameter tuning task.
According to an embodiment of the present disclosure, storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained includes: storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained to a distributed file system.
According to an embodiment of the present disclosure, the hyperparameter tuning task includes a single-machine hyperparameter tuning task and/or a distributed hyperparameter tuning task.
According to an embodiment of the present disclosure, the log data includes any one or more of the following: metric data of the algorithm model, values of the hyperparameters, a variation trend of the metric data, a time consumption of training the algorithm model, and a comparison of the metric data.
A fourth aspect of the present disclosure provides a hyperparameter tuning apparatus applied to a client. The apparatus includes: a sending module configured to send a hyperparameter tuning request to a server, so that the server generates a hyperparameter tuning task based on the hyperparameter tuning request, configures the hyperparameter tuning task according to an environment label of a user in the hyperparameter tuning request, determines, according to an identification code of the hyperparameter tuning task, a node for executing the hyperparameter tuning task, runs an algorithm model on the node to train the hyperparameter tuning task, and stores, according to a user identity, log data generated when the hyperparameter tuning task is trained; and a receiving module configured to receive and display the log data.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a hyperparameter selection algorithm and threshold ranges set for the hyperparameters.
According to an embodiment of the present disclosure, the hyperparameter tuning request includes a run script of the algorithm model.
According to an embodiment of the present disclosure, the log data includes any one or more of the following: metric data of the algorithm model, values of the hyperparameters, a variation trend of the metric data, a time consumption of training the algorithm model, and a comparison of the metric data.
A fifth aspect of the present disclosure provides a server, including one or more processors and a storage device. The storage device is configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the hyperparameter tuning method provided in the first aspect.
A sixth aspect of the present disclosure provides a computer-readable medium having executable instructions stored thereon which, when executed by a processor, cause the processor to perform the hyperparameter tuning method provided in the first aspect.
A seventh aspect of the present disclosure provides a computer program including computer-executable instructions which, when executed, implement the hyperparameter tuning method provided in the first aspect.
An eighth aspect of the present disclosure provides a client, including one or more processors and a storage device. The storage device is configured to store one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the hyperparameter tuning method provided in the second aspect.
A ninth aspect of the present disclosure provides a computer-readable medium having executable instructions stored thereon which, when executed by a processor, cause the processor to perform the hyperparameter tuning method provided in the second aspect.
A tenth aspect of the present disclosure provides a computer program including computer-executable instructions which, when executed, implement the hyperparameter tuning method provided in the second aspect.
The hyperparameter tuning method applied to the server provided by the present disclosure has the following beneficial effects:
The technical solution provided by the embodiments of the present disclosure can generate a hyperparameter tuning task based on a hyperparameter tuning request, configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determine, according to the identification code of the hyperparameter tuning task, the node for executing the hyperparameter tuning task, and run the algorithm model on that node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in the user's own development environment. The log data generated when the hyperparameter tuning task is trained is then stored according to the user identity and, in response to a user operation, sent to the client according to the user permission, so that each user can view only the user's own model data on the user's own client and cannot view other users' model data. This ensures the security of user data and also isolates the development environments of different users; that is, the development environment used by each user is exclusive to that user.
The hyperparameter tuning method applied to the client provided by the present disclosure has the following beneficial effects:
The technical solution provided by the embodiments of the present disclosure can send a hyperparameter tuning request to the server, the request containing the environment label of the user and the identity of the user, so that the server can configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determine the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task, and run the algorithm model on the node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in the user's own development environment. The server then stores, according to the user identity, the log data generated when the hyperparameter tuning task is trained; when the user checks the training of the algorithm model on the client, the client can receive and display the log data generated during training, so that each user can view only the user's own model data on the user's own client and cannot view other users' model data. This ensures the security of user data and also isolates the development environments of different users; that is, the development environment used by each user is exclusive to that user.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Detailed description of the embodiments
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. However, it should be understood that these descriptions are merely exemplary and are not intended to limit the scope of the present disclosure. In addition, in the following description, descriptions of well-known structures and technologies are omitted to avoid unnecessarily obscuring the concepts of the present disclosure.
The terms used herein are for the purpose of describing specific embodiments only and are not intended to limit the present disclosure. The terms "include", "comprise", and the like used herein indicate the presence of the stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the meanings commonly understood by those skilled in the art. It should be noted that the terms used herein should be interpreted as having meanings consistent with the context of this specification, and should not be interpreted in an idealized or overly rigid manner.
Where an expression similar to "at least one of A, B, and C" is used, it should generally be interpreted according to the meaning commonly understood by those skilled in the art (for example, "a system having at least one of A, B, and C" shall include, but not be limited to, systems having A alone, B alone, C alone, A and B, A and C, B and C, and/or A, B, and C). It should also be understood by those skilled in the art that essentially any disjunctive word and/or phrase presenting two or more alternative items, whether in the description, the claims, or the drawings, shall be construed as contemplating the possibility of including one of the items, either of the items, or both items. For example, the phrase "A or B" should be understood to include the possibility of "A", "B", or "A and B".
Fig. 1 shows a schematic diagram of an exemplary system architecture 100 to which the hyperparameter tuning method or hyperparameter tuning apparatus of an embodiment of the present disclosure can be applied.
As shown in Fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, and 103, a network 104, and a server 105. The network 104 is a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links, fiber optic cables, and the like.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 1 are merely schematic. According to implementation needs, there may be any number of terminal devices, networks, and servers. For example, the server 105 may be a server cluster composed of multiple servers.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 through the network 104, so as to receive or send messages and the like. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to smartphones, tablet computers, portable computers, desktop computers, and the like.
The server 105 may be a server that provides various services. For example, the server 105 may obtain a hyperparameter tuning request from the terminal device 103 (which may also be the terminal device 101 or 102), generate a hyperparameter tuning task based on the hyperparameter tuning request, configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determine the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task, and run the algorithm model on that node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in the user's own development environment. The server 105 then stores, according to the user identity, the log data generated when the hyperparameter tuning task is trained and, in response to a user operation, sends the log data to the client according to the user permission, so that each user can view only the user's own model data on the user's own client and cannot view other users' model data. This ensures the security of user data and also isolates the development environments of different users; that is, the development environment used by each user is exclusive to that user.
In some embodiments, the hyperparameter tuning method provided by the embodiments of the present disclosure is generally executed by the server 105 or by a server cluster; correspondingly, the hyperparameter tuning apparatus is generally disposed in the server 105 or in the server cluster. In other embodiments, certain terminals may have functions similar to those of the server and may thus execute the method. Therefore, the hyperparameter tuning method provided by the embodiments of the present disclosure is not limited to execution on the server side.
Fig. 2 schematically shows a flowchart of the hyperparameter tuning method applied to the server according to an embodiment of the present disclosure.
As shown in Fig. 2, the hyperparameter tuning method applied to the server includes steps S110 to S150.
In step S110, a hyperparameter tuning request is received.
In step S120, a hyperparameter tuning task is generated based on the hyperparameter tuning request.
In step S130, the hyperparameter tuning task is configured according to the environment label of the user in the hyperparameter tuning request.
In step S140, the node for executing the hyperparameter tuning task is determined according to the identification code of the hyperparameter tuning task, the algorithm model is run on the node to train the hyperparameter tuning task, and the log data generated when the hyperparameter tuning task is trained is stored according to the user identity.
In step S150, in response to a user operation, the log data is sent to the client according to the user permission.
The method can generate a hyperparameter tuning task based on the hyperparameter tuning request, configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determine the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task, and run the algorithm model on that node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in the user's own development environment. The log data generated when the hyperparameter tuning task is trained is then stored according to the user identity and, in response to a user operation, sent to the client according to the user permission, so that each user can view only the user's own model data on the user's own client and cannot view other users' model data. This ensures the security of user data and also isolates the development environments of different users; that is, the development environment used by each user is exclusive to that user.
In some embodiments of the present disclosure, the hyperparameter tuning request may include the environment label of the user, a configuration file of the hyperparameters, a selection algorithm for the hyperparameters, and a run script of the algorithm model. The environment label of the user may be an identifier of the development environment in which the hyperparameter tuning task is run. The configuration file of the hyperparameters may include the threshold range of each hyperparameter. The selection algorithm of the hyperparameters may be Parse, Random, Grid, Hyper, or Bayesian. The run script of the algorithm model may be a run script specified for the algorithm model.
In some embodiments of the present disclosure, the hyperparameter tuning task is configured according to the environment label of the user in the hyperparameter tuning request. For example, the environment label of the user may be denoted user-env; the server may write the user-env obtained from the client into the initial environment of the hyperparameter tuning task, and generate a unique 26-letter code to identify the task, where the 26-letter code may serve as the identification code of the hyperparameter tuning task.
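As a minimal sketch of this configuration step (function names and the task structure are illustrative assumptions, not taken from the disclosure), writing user-env into the task's initial environment and attaching a unique 26-letter identification code might look like:

```python
import random
import string

def make_task_id(existing_ids, length=26):
    """Generate a unique lowercase 26-letter identification code for a task."""
    while True:
        code = "".join(random.choices(string.ascii_lowercase, k=length))
        if code not in existing_ids:  # retry until the code is unique
            existing_ids.add(code)
            return code

def configure_task(task, user_env, existing_ids):
    """Write the user's environment label into the task's initial
    environment and attach the generated identification code."""
    task["env"] = {"user-env": user_env}
    task["id"] = make_task_id(existing_ids)
    return task

seen = set()
task = configure_task({"name": "hp-tuning-demo"}, "alice-dev", seen)
```

Keeping a set of issued codes is one simple way to guarantee the uniqueness the text requires; a real system might instead rely on a database uniqueness constraint.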
In some embodiments of the present disclosure, after the server configures the hyperparameter tuning task, it may submit a batch task application to the container cluster (kubernetes) to run the hyperparameter tuning task on the corresponding node, and may at the same time write the corresponding content into a database (for example, the task name, the 26-letter code of the task, the algorithm type of the task, and the like). In addition, the server may respond to UI trigger requests from the client, for example, listing all hyperparameter tuning tasks and their related content.
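As a hedged illustration of this submission step (all field values, the image name, and the record layout are assumptions), the batch Job manifest built for kubernetes and the accompanying database record might be assembled as plain dictionaries before submission:

```python
def build_job_manifest(task_name, task_code, image, script):
    """Build a kubernetes batch/v1 Job manifest (as a plain dict) for a
    configured hyperparameter tuning task."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": f"{task_name}-{task_code[:8]}"},
        "spec": {
            "template": {
                "spec": {
                    "containers": [{
                        "name": task_name,
                        "image": image,
                        "command": ["sh", "-c", script],
                    }],
                    "restartPolicy": "Never",
                }
            }
        },
    }

def build_db_record(task_name, task_code, algo_type):
    """Row written to the database alongside the submission."""
    return {"task_name": task_name, "task_code": task_code, "algo_type": algo_type}

manifest = build_job_manifest(
    "hp-tune", "abcdefghijklmnopqrstuvwxyz", "train:latest", "python train.py")
record = build_db_record("hp-tune", "abcdefghijklmnopqrstuvwxyz", "tensorflow")
```

The manifest would then be handed to the cluster (for example via a kubernetes API client), while the record supports the UI listing described above.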
In some embodiments of the present disclosure, the node for executing the hyperparameter tuning task is determined according to the identification code of the hyperparameter tuning task. The node may be a server, which may be a server in the server cluster. In addition, in this embodiment, the node for executing the hyperparameter tuning task may also be determined according to the type of the hyperparameter tuning task (for example, single-machine or distributed).
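A minimal sketch of type-based node selection, assuming a simple first-fit policy over available cluster nodes (the policy itself is an illustrative assumption; the disclosure only states that the task type may influence node selection):

```python
def pick_nodes(task_type, nodes, workers=2):
    """Pick execution node(s) for a tuning task: one node for a
    single-machine task, several for a distributed task."""
    if task_type == "single":
        return nodes[:1]
    if task_type == "distributed":
        return nodes[:workers]
    raise ValueError(f"unknown task type: {task_type}")

cluster = ["node-a", "node-b", "node-c"]
```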
In some embodiments of the present disclosure, the algorithm model is run on the above node to train the hyperparameter tuning task, so as to tune the hyperparameters. The algorithm model may correspond to the hyperparameter tuning task. For example, if the hyperparameter tuning task is a single-machine hyperparameter tuning task, the corresponding algorithm model may also be a single-machine algorithm model; if the hyperparameter tuning task is a distributed hyperparameter tuning task, the corresponding algorithm model may also be a distributed algorithm model. Therefore, in this embodiment, not only single-machine algorithm frameworks but also multi-machine distributed algorithm frameworks can be supported.
In some embodiments of the present disclosure, the log data generated when the hyperparameter tuning task is trained may be stored according to the user identity. The user identity may be an identification code set for each client. The log data may include any one or more of the following: the metric data of the algorithm model, the values of the hyperparameters, the variation trend of the metric data, the time consumption of training the algorithm model, and the comparison of the metric data. In this embodiment, when the user checks on the client the data of the algorithm model training the hyperparameter tuning task, the server can retrieve the log data corresponding to the user's permission according to the login input of the user on the client, thereby ensuring the security of the log data.
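The permission-checked retrieval described above can be sketched as follows (a simplified in-memory stand-in for the real storage; the class and its policy of "owner only" access are assumptions consistent with, but not specified by, the disclosure):

```python
class LogStore:
    """Store training log data keyed by user identity; retrieval is
    permission-checked so a user sees only the user's own logs."""

    def __init__(self):
        self._logs = {}

    def store(self, user_id, entry):
        """Append a log entry under the owning user's identity."""
        self._logs.setdefault(user_id, []).append(entry)

    def fetch(self, requesting_user, owner):
        """Return the owner's logs only if the requester is the owner."""
        if requesting_user != owner:
            raise PermissionError("cannot view another user's log data")
        return self._logs.get(owner, [])

store = LogStore()
store.store("user-1", {"metric": "accuracy", "value": 0.91, "lr": 0.01})
store.store("user-2", {"metric": "accuracy", "value": 0.88, "lr": 0.1})
```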
In some embodiments of the present disclosure, the server can store the log data of the corresponding hyperparameter tuning task in the underlying storage system according to the unique 26-letter code, extract the corresponding metrics from the task log according to the model training evaluation metric configured by the user, and store them in the database.
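Extracting a user-configured evaluation metric from raw task logs might be done with a pattern match of this shape (the `name = value` log format shown is an assumed example, not one specified by the disclosure):

```python
import re

def extract_metric(log_text, metric_name):
    """Extract all values of a named evaluation metric from raw task log
    text, assuming lines of the form 'metric = value'."""
    pattern = rf"{re.escape(metric_name)}\s*=\s*([0-9.]+)"
    return [float(v) for v in re.findall(pattern, log_text)]

log = """epoch 1: loss = 0.82, accuracy = 0.61
epoch 2: loss = 0.54, accuracy = 0.74
epoch 3: loss = 0.41, accuracy = 0.80"""
```

The extracted series could then be written to the database keyed by the task's 26-letter code, supporting the variation-trend and comparison views of the log data.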
In some embodiments of the present disclosure, the hyperparameter tuning request includes the run script of the algorithm model, and running the algorithm model on the node to train the hyperparameter tuning task includes: starting the run script of the algorithm model on the node, and running the algorithm model from the script to train the hyperparameter tuning task, which further achieves the beneficial effect of multi-tenant isolation.
In some embodiments of the present disclosure, storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained includes: storing, according to the user identity, the log data generated when the hyperparameter tuning task is trained to a distributed file system; this storage mode is convenient for the user to access directly.
In some embodiments of the present disclosure, the hyperparameter tuning task includes a single-machine hyperparameter tuning task and/or a distributed hyperparameter tuning task. In this embodiment, not only the running of single-machine hyperparameter tuning tasks can be supported, but also the running of various distributed hyperparameter tuning tasks, such as distributed tensorflow, distributed spark, distributed xgboost, distributed mxnet, and the like.
Fig. 3 schematically shows a flowchart of the hyperparameter tuning method applied to the server according to another embodiment of the present disclosure.
As shown in Fig. 3, the above step S120 may specifically include step S121 and step S122.
In step S121, the threshold ranges of the hyperparameters are computed over using the hyperparameter selection algorithm to obtain a hyperparameter list.
In step S122, a hyperparameter tuning task list corresponding to the hyperparameter list is generated based on the hyperparameter list.
The method can use the hyperparameter selection algorithm to compute over the threshold ranges of the hyperparameters to obtain a hyperparameter list. In this way, the hyperparameters to be trained can be preliminarily determined from their threshold ranges; that is, the hyperparameters received from the client are preliminarily tuned, which saves time when the algorithm model later tunes the hyperparameters and improves training efficiency.
In some embodiments of the present disclosure, the threshold ranges of the hyperparameters are computed over using the hyperparameter selection algorithm to obtain the hyperparameter list. For example, the server can receive the hyperparameter configuration file sent by the client, parse it into the hyperparameter threshold ranges and the specified hyperparameter selection algorithm, and then start the corresponding selection algorithm to compute over the hyperparameter threshold ranges to obtain the hyperparameter list.
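Taking the Grid selection algorithm named above as a concrete case, expanding parsed threshold ranges into a hyperparameter list can be sketched as a Cartesian product (the range representation as discrete candidate values is an assumption for illustration):

```python
from itertools import product

def grid_candidates(ranges):
    """Expand hyperparameter threshold ranges into a list of candidate
    configurations, Grid-search style. `ranges` maps each hyperparameter
    name to its candidate values."""
    names = sorted(ranges)  # fix an ordering for reproducibility
    return [dict(zip(names, combo))
            for combo in product(*(ranges[n] for n in names))]

ranges = {"learning_rate": [0.01, 0.1], "batch_size": [32, 64]}
candidates = grid_candidates(ranges)
```

Each entry of `candidates` then seeds one task in the hyperparameter tuning task list of step S122; Random, Hyper, or Bayesian selection would sample or adaptively propose entries instead of enumerating them all.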
In some embodiments of the present disclosure, a hyperparameter tuning task list corresponding to the hyperparameter list is generated based on the hyperparameter list. For example, the server can receive a task generation request and create the corresponding task configuration environment. Specifically, the server obtains the corresponding hyperparameter list, traverses the hyperparameter list to generate the corresponding hyperparameter tuning task list, then writes the environment label user-env of the corresponding user obtained from the client into the initial environment of each hyperparameter tuning task, and generates a unique 26-letter code to identify each task. In the task configuration: for spark-type tasks, the server can automatically mount for them the software version and cluster environment corresponding to the client environment; for a tensorflow task, single-machine tensorflow can automatically mount the corresponding tensorflow image package, such as gpu or cpu, while a distributed tensorflow task can be automatically configured as a tf-operator-type task, identifying the nodes on which it can run accordingly; a pytorch task can be configured as a pytorch-operator-type task; and other types of distributed tasks can be configured as mpi-operator-type tasks.
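The framework-to-operator mapping just described can be sketched as a small dispatch function (simplified: the fallback name for a plain single-machine task is an assumption, as the disclosure does not name one):

```python
def operator_type(framework, distributed):
    """Map a task's framework and mode to the operator type used to
    configure it, following the mapping described above."""
    if framework == "tensorflow" and distributed:
        return "tf-operator"
    if framework == "pytorch":
        return "pytorch-operator"
    if distributed:
        return "mpi-operator"  # other distributed frameworks
    return "batch-job"  # assumed name for plain single-machine tasks
```

A usage example: a distributed xgboost task would fall through to `"mpi-operator"`, matching the "other types of distributed tasks" clause.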
Fig. 4 schematically shows a flowchart of the hyperparameter tuning method applied to the client according to an embodiment of the present disclosure.
As shown in Fig. 4, the hyperparameter tuning method applied to the client includes step S210 and step S220.
In step S210, a hyperparameter tuning request is sent to the server, so that the server generates a hyperparameter tuning task based on the hyperparameter tuning request, configures the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determines, according to the identification code of the hyperparameter tuning task, the node for executing the hyperparameter tuning task, runs the algorithm model on the node to train the hyperparameter tuning task, and stores, according to the user identity, the log data generated when the hyperparameter tuning task is trained.
In step S220, the log data is received and displayed.
The method can send a hyperparameter tuning request to the server, the request containing the environment label of the user and the identity of the user, so that the server can configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determine the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task, and run the algorithm model on the node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in the user's own development environment. The server is then enabled to store, according to the user identity, the log data generated when the hyperparameter tuning task is trained; when the user checks the training of the algorithm model on the client, the client can receive and display the log data generated during training, so that each user can view only the user's own model data on the user's own client and cannot view other users' model data. This ensures the security of user data and also isolates the development environments of different users; that is, the development environment used by each user is exclusive to that user.
In some embodiments of the present disclosure, comprising hyper parameter selection algorithm and for super in the hyper parameter tuning request
The threshold range of parameter setting.
In some embodiments of the present disclosure, the operation foot comprising the algorithm model in the hyper parameter tuning request
This.
In some embodiments of the present disclosure, the log data includes any one or more of the following: metric data of the algorithm model, values of the hyperparameters, a variation trend of the metric data, a training duration of the algorithm model, and a comparison of the metric data.
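As an illustration of the kinds of log data listed above, a single log record might carry the hyperparameter values, the metric data, and the training duration, from which a variation trend can be recovered across records. The field and function names below are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class TrainingLogRecord:
    """One illustrative log record for a hyperparameter tuning task."""
    hyperparameters: dict      # e.g. {"learning_rate": 0.01}
    metrics: dict              # metric data of the algorithm model
    duration_seconds: float    # how long this training run took


def metric_trend(records, metric_name):
    """Recover the variation trend of one metric across records,
    which a client UI could plot or compare."""
    return [r.metrics[metric_name] for r in records]
```

A client could pass a list of such records for two training runs to `metric_trend` to obtain the trend of, say, accuracy over time.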
Fig. 5 schematically illustrates the interaction between the server and the client according to an embodiment of the present disclosure.
As shown in Fig. 5, the interaction between the server and the client includes steps S1 to S9, as follows:
S1: the client sends a hyperparameter tuning request to the server.
S2: upon receiving the hyperparameter tuning request, the server generates a hyperparameter tuning task based on the request.
S3: the server configures the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request.
S4: the server determines the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task.
S5: the algorithm model is run on that node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in his or her own development environment.
S6: the server stores, according to the user identity, the log data generated while the hyperparameter tuning task is trained.
S7: when the user wants to check the training progress of the algorithm model on the client, the user performs a viewing operation on the UI of the client.
S8: upon receiving the viewing operation from the client, the server responds to the user operation by querying, according to the user permission, the log data corresponding to that user, so that each user can view his or her own model data on his or her own client but cannot view the model data of other users, which guarantees the security of user data.
S9: the client receives the log data returned by the server in response to the user operation.
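Steps S1 to S9 can be sketched end to end as a single round trip. The class and method names below are hypothetical, and selecting a node by hashing the task identification code merely stands in for whatever scheduling policy the server actually uses.

```python
import hashlib


class TuningServer:
    """Illustrative server-side flow for steps S2-S6 and S8."""

    def __init__(self, nodes):
        self.nodes = nodes
        self.logs = {}  # user identity -> log data

    def handle_request(self, request):
        # S2: generate a tuning task from the request.
        task = {"task_id": f"task-{request['user_id']}",
                "env_label": None, "node": None}
        # S3: configure the task with the user's environment label.
        task["env_label"] = request["env_label"]
        # S4: derive an executing node from the task identification code.
        digest = hashlib.sha256(task["task_id"].encode()).hexdigest()
        task["node"] = self.nodes[int(digest, 16) % len(self.nodes)]
        # S5/S6: "train" on the node and store logs under the identity.
        self.logs[request["user_id"]] = {
            "node": task["node"], "env": task["env_label"], "loss": 0.1}
        return task

    def view_logs(self, user_id):
        # S8: return only the requesting user's own log data.
        return self.logs.get(user_id)
```

A request carrying the user's identity and environment label yields a configured task bound to one node, and a later viewing operation by the same identity retrieves only that user's log data.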
Fig. 6 schematically illustrates a block diagram of the hyperparameter tuning apparatus applied to the server according to an embodiment of the present disclosure.
As shown in Fig. 6, the hyperparameter tuning apparatus 600 applied to the server includes a receiving module 610, a generation module 620, a configuration module 630, a training module 640 and a sending module 650.
Specifically, the receiving module 610 is configured to receive a hyperparameter tuning request.
The generation module 620 generates a hyperparameter tuning task based on the hyperparameter tuning request.
The configuration module 630 is configured to configure the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request.
The training module 640 is configured to determine, according to the identification code of the hyperparameter tuning task, the node for executing the hyperparameter tuning task, run the algorithm model on the node to train the hyperparameter tuning task, and store, according to the user identity, the log data generated while the hyperparameter tuning task is trained.
The sending module 650 sends, in response to a user operation, the log data to the client according to the user permission.
The hyperparameter tuning apparatus 600 applied to the server can generate a hyperparameter tuning task based on the hyperparameter tuning request, configure the hyperparameter tuning task according to the environment label of the user in the request, determine the node for executing the hyperparameter tuning task according to the identification code of the task, and run the algorithm model on that node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in his or her own development environment. The apparatus then stores, according to the user identity, the log data generated while the hyperparameter tuning task is trained and, in response to a user operation, sends the log data to the client according to the user permission, so that each user can view his or her own model data on his or her own client but cannot view the model data of other users. This guarantees the security of user data and also keeps the development environments of the users isolated; in other words, the development environment used by each user is exclusive to that user.
In accordance with an embodiment of the present disclosure, the hyperparameter tuning apparatus 600 applied to the server is configured to implement the hyperparameter tuning method applied to the server described in the embodiment of Fig. 2.
Fig. 7 schematically illustrates a block diagram of the hyperparameter tuning apparatus applied to the server according to another embodiment of the present disclosure.
As shown in Fig. 7, the above generation module 620 includes a computing module 621 and a generation submodule 622.
Specifically, the computing module 621 calculates over the threshold range of the hyperparameters using the hyperparameter selection algorithm to obtain a hyperparameter list.
The generation submodule 622 generates, based on the hyperparameter list, a hyperparameter tuning task list corresponding to the hyperparameter list.
The generation module 620 can thus use the hyperparameter selection algorithm to calculate over the threshold range of the hyperparameters and obtain a hyperparameter list. In this way, the hyperparameters to be trained can be preliminarily determined from the threshold range; that is, the hyperparameter tuning request received from the client is preliminarily narrowed down, which saves time when the algorithm model is later used for hyperparameter tuning and improves training efficiency.
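One simple way to realize the computing module's behavior is to sample candidate values inside each threshold range and take their cross product, yielding a hyperparameter list and one tuning task per entry. This is an illustrative grid-style sketch; the disclosure does not fix a particular selection algorithm, and the function names are hypothetical.

```python
import itertools


def build_hyperparameter_list(threshold_ranges, steps=3):
    """threshold_ranges: {name: (low, high)}. Samples `steps` evenly
    spaced candidate values inside each range, then combines them
    into a list of hyperparameter assignments."""
    grids = {}
    for name, (low, high) in threshold_ranges.items():
        if steps == 1:
            grids[name] = [low]
        else:
            step = (high - low) / (steps - 1)
            grids[name] = [low + i * step for i in range(steps)]
    names = list(grids)
    return [dict(zip(names, combo))
            for combo in itertools.product(*(grids[n] for n in names))]


def build_task_list(hyperparameter_list):
    # One tuning task per hyperparameter combination in the list.
    return [{"task_id": i, "hyperparameters": hp}
            for i, hp in enumerate(hyperparameter_list)]
```

With two hyperparameters each sampled at three points, the list contains nine combinations, and the generation submodule would emit nine corresponding tuning tasks.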
In accordance with an embodiment of the present disclosure, the generation module 620 is configured to implement the hyperparameter tuning method applied to the server described in the embodiment of Fig. 3.
It can be understood that the receiving module 610, the generation module 620, the computing module 621, the generation submodule 622, the configuration module 630, the training module 640 and the sending module 650 may be combined and implemented in one module, or any one of them may be split into multiple modules. Alternatively, at least part of the functions of one or more of these modules may be combined with at least part of the functions of other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the receiving module 610, the generation module 620, the computing module 621, the generation submodule 622, the configuration module 630, the training module 640 and the sending module 650 may be implemented at least partially as a hardware circuit, for example a field-programmable gate array (FPGA), a programmable logic array (PLA), a system on chip, a system on a substrate, a system in a package or an application-specific integrated circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or by an appropriate combination of the three implementations of software, hardware and firmware. Alternatively, at least one of the receiving module 610, the generation module 620, the computing module 621, the generation submodule 622, the configuration module 630, the training module 640 and the sending module 650 may be implemented at least partially as a computer program module which, when run by a computer, can perform the functions of the corresponding module.
Fig. 8 schematically illustrates a block diagram of the hyperparameter tuning apparatus applied to the client according to an embodiment of the present disclosure.
As shown in Fig. 8, the hyperparameter tuning apparatus 700 applied to the client includes a sending module 710 and a receiving module 720.
Specifically, the sending module 710 is configured to send a hyperparameter tuning request to the server, so that the server generates a hyperparameter tuning task based on the hyperparameter tuning request, configures the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determines the node for executing the hyperparameter tuning task according to the identification code of the hyperparameter tuning task, runs the algorithm model on the node to train the hyperparameter tuning task, and stores, according to the user identity, the log data generated while the hyperparameter tuning task is trained.
The receiving module 720 is configured to receive and display the log data.
The hyperparameter tuning apparatus 700 applied to the client can send a hyperparameter tuning request to the server; the request contains the environment label of the user and the identity of the user, so that the server can configure the hyperparameter tuning task according to the environment label of the user in the request, determine the node for executing the hyperparameter tuning task according to the identification code of the task, and run the algorithm model on the node to train the hyperparameter tuning task, so that the user can run the hyperparameter tuning task in his or her own development environment. The server is then able to store, according to the user identity, the log data generated while the hyperparameter tuning task is trained; when the user checks the training status of the algorithm model on the client, the client can receive and display the log data generated during training of the algorithm model, so that each user can view his or her own model data on his or her own client but cannot view the model data of other users. This guarantees the security of user data and also keeps the development environments of the users isolated; in other words, the development environment used by each user is exclusive to that user.
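The client side of this exchange is small: it builds the request carrying the environment label and the user identity, sends it, and later receives and displays the log data. A minimal sketch, assuming a hypothetical `transport` callable that plays the role of the server round trip:

```python
class TuningClient:
    """Illustrative client-side counterpart of the sending module 710
    and receiving module 720 (the names here are hypothetical)."""

    def __init__(self, user_id, env_label, transport):
        self.user_id = user_id
        self.env_label = env_label
        self.transport = transport

    def request_tuning(self, algorithm_script):
        # The request contains the environment label and the identity.
        return self.transport({"user_id": self.user_id,
                               "env_label": self.env_label,
                               "run_script": algorithm_script})

    def display_logs(self, log_data):
        # Receive and show the log data (here: render to a string).
        return "\n".join(f"{k}: {v}" for k, v in log_data.items())
```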
In accordance with an embodiment of the present disclosure, the hyperparameter tuning apparatus 700 applied to the client is configured to implement the hyperparameter tuning method applied to the client described in the embodiment of Fig. 4.
Fig. 9 schematically illustrates a block diagram of the computer system of the server according to an embodiment of the present disclosure.
The computer system shown in Fig. 9 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in Fig. 9, the computer system 800 of the server according to an embodiment of the present disclosure includes a processor 801, which can execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage section 808 into a random access memory (RAM) 803. The processor 801 may include, for example, a general-purpose microprocessor (such as a CPU), an instruction set processor and/or a related chipset and/or a special-purpose microprocessor (for example, an application-specific integrated circuit (ASIC)), and so on. The processor 801 may also include an onboard memory for caching purposes. The processor 801 may include a single processing unit or multiple processing units for executing the different actions of the method flows according to the embodiments of the present disclosure described with reference to Fig. 2 and Fig. 3.
The RAM 803 stores various programs and data required for the operation of the system 800. The processor 801, the ROM 802 and the RAM 803 are connected to one another through a bus 804. The processor 801 performs the various steps of the hyperparameter tuning method applied to the server described above with reference to Fig. 2 and Fig. 3 by executing programs in the ROM 802 and/or the RAM 803. It is noted that the programs may also be stored in one or more memories other than the ROM 802 and the RAM 803. The processor 801 may also perform the various steps of the hyperparameter tuning method applied to the server described above with reference to Fig. 2 and Fig. 3 by executing programs stored in the one or more memories.
In accordance with an embodiment of the present disclosure, the system 800 may also include an input/output (I/O) interface 805, which is also connected to the bus 804. The system 800 may also include one or more of the following components connected to the I/O interface 805: an input section 806 including a keyboard, a mouse and the like; an output section 807 including a cathode-ray tube (CRT), a liquid crystal display (LCD) and the like, and a loudspeaker and the like; a storage section 808 including a hard disk and the like; and a communications section 809 including a network interface card such as a LAN card or a modem. The communications section 809 performs communication processing via a network such as the Internet. A driver 810 is also connected to the I/O interface 805 as needed. A removable medium 811, such as a magnetic disk, an optical disc, a magneto-optical disk or a semiconductor memory, is mounted on the driver 810 as needed, so that a computer program read therefrom can be installed into the storage section 808 as needed.
In accordance with an embodiment of the present disclosure, the method described above with reference to the flowcharts may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from a network through the communications section 809 and/or installed from the removable medium 811. When the computer program is executed by the processor 801, the above functions defined in the system of the embodiment of the present disclosure are performed. In accordance with an embodiment of the present disclosure, the systems, devices, modules, units and the like described above may be implemented by computer program modules.
It should be noted that the computer-readable medium shown in the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above. In the present disclosure, the computer-readable storage medium may be any tangible medium containing or storing a program, and the program may be used by or in connection with an instruction execution system, apparatus or device. In the present disclosure, the computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal or any appropriate combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and that medium can send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF and the like, or any appropriate combination of the above. In accordance with an embodiment of the present disclosure, the computer-readable medium may include the ROM 802 and/or the RAM 803 described above and/or one or more memories other than the ROM 802 and the RAM 803.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architecture, functions and operations of the systems, methods and computer program products according to the various embodiments of the present disclosure. In this regard, each box in a flowchart or block diagram may represent a module, a program segment or a part of code, and the above module, program segment or part of code contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that indicated in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in a block diagram or flowchart, and combinations of boxes in a block diagram or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
As another aspect, the present disclosure further provides a computer-readable medium. The computer-readable medium may be included in the device described in the above embodiments, or may exist independently without being assembled into the device. The above computer-readable medium carries one or more programs which, when executed by the device, cause the device to execute the hyperparameter tuning method applied to the server according to the embodiments of the present disclosure. The method includes: receiving a hyperparameter tuning request; generating a hyperparameter tuning task based on the hyperparameter tuning request; configuring the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request; determining, according to the identification code of the hyperparameter tuning task, the node for executing the hyperparameter tuning task, running the algorithm model on the node to train the hyperparameter tuning task, and storing, according to the user identity, the log data generated while the hyperparameter tuning task is trained; and, in response to a user operation, sending the log data to the client according to the user permission.
Fig. 10 schematically illustrates a block diagram of the computer system of the client according to an embodiment of the present disclosure. The computer system shown in Fig. 10 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in Fig. 10, the computer system 900 of the client according to an embodiment of the present disclosure includes a processor 901, which can execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage section 908 into a random access memory (RAM) 903. The processor 901 may include, for example, a general-purpose microprocessor (such as a CPU), an instruction set processor and/or a related chipset and/or a special-purpose microprocessor (for example, an application-specific integrated circuit (ASIC)), and so on. The processor 901 may also include an onboard memory for caching purposes. The processor 901 may include a single processing unit or multiple processing units for executing the different actions of the method flow according to the embodiment of the present disclosure described with reference to Fig. 4.
The RAM 903 stores various programs and data required for the operation of the system 900. The processor 901, the ROM 902 and the RAM 903 are connected to one another through a bus 904. The processor 901 performs the various steps of the hyperparameter tuning method applied to the client described above with reference to Fig. 4 by executing programs in the ROM 902 and/or the RAM 903. It is noted that the programs may also be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform the various steps of the hyperparameter tuning method applied to the client described above with reference to Fig. 4 by executing programs stored in the one or more memories.
In accordance with an embodiment of the present disclosure, the system 900 may also include an input/output (I/O) interface 905, which is also connected to the bus 904. The system 900 may also include one or more of the following components connected to the I/O interface 905: an input section 906 including a keyboard, a mouse and the like; an output section 907 including a cathode-ray tube (CRT), a liquid crystal display (LCD) and the like, and a loudspeaker and the like; a storage section 908 including a hard disk and the like; and a communications section 909 including a network interface card such as a LAN card or a modem. The communications section 909 performs communication processing via a network such as the Internet. A driver 910 is also connected to the I/O interface 905 as needed. A removable medium 911, such as a magnetic disk, an optical disc, a magneto-optical disk or a semiconductor memory, is mounted on the driver 910 as needed, so that a computer program read therefrom can be installed into the storage section 908 as needed.
In accordance with an embodiment of the present disclosure, the method described above with reference to the flowcharts may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the method shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from a network through the communications section 909 and/or installed from the removable medium 911. When the computer program is executed by the processor 901, the above functions defined in the system of the embodiment of the present disclosure are performed. In accordance with an embodiment of the present disclosure, the systems, devices, modules, units and the like described above may be implemented by computer program modules.
It should be noted that the computer-readable medium shown in the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above. In the present disclosure, the computer-readable storage medium may be any tangible medium containing or storing a program, and the program may be used by or in connection with an instruction execution system, apparatus or device. In the present disclosure, the computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal or any appropriate combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and that medium can send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF and the like, or any appropriate combination of the above. In accordance with an embodiment of the present disclosure, the computer-readable medium may include the ROM 902 and/or the RAM 903 described above and/or one or more memories other than the ROM 902 and the RAM 903.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architecture, functions and operations of the systems, methods and computer program products according to the various embodiments of the present disclosure. In this regard, each box in a flowchart or block diagram may represent a module, a program segment or a part of code, and the above module, program segment or part of code contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that indicated in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in a block diagram or flowchart, and combinations of boxes in a block diagram or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
As another aspect, the present disclosure further provides a computer-readable medium. The computer-readable medium may be included in the device described in the above embodiments, or may exist independently without being assembled into the device. The above computer-readable medium carries one or more programs which, when executed by the device, cause the device to execute the hyperparameter tuning method applied to the client according to the embodiments of the present disclosure. The method includes: sending a hyperparameter tuning request to the server, so that the server generates a hyperparameter tuning task based on the hyperparameter tuning request, configures the hyperparameter tuning task according to the environment label of the user in the hyperparameter tuning request, determines, according to the identification code of the hyperparameter tuning task, the node for executing the hyperparameter tuning task, runs the algorithm model on the node to train the hyperparameter tuning task, and stores, according to the user identity, the log data generated while the hyperparameter tuning task is trained; and receiving and displaying the log data.
The embodiments of the present disclosure have been described above. However, these embodiments are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments have been described separately above, this does not mean that the measures in the different embodiments cannot be advantageously combined. The scope of the present disclosure is defined by the appended claims and their equivalents. Without departing from the scope of the present disclosure, those skilled in the art can make various substitutions and modifications, and all such substitutions and modifications should fall within the scope of the present disclosure.