CN111831322B - Multi-level user-oriented machine learning parameter configuration method - Google Patents

Multi-level user-oriented machine learning parameter configuration method

Info

Publication number
CN111831322B
CN111831322B CN202010296630.3A
Authority
CN
China
Prior art keywords
parameter
machine learning
stage
parameter configuration
optimal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010296630.3A
Other languages
Chinese (zh)
Other versions
CN111831322A (en)
Inventor
程钢
刘必欣
初宁
王强
张旭锋
李伯昌
薛源
刁文辉
于泓峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute of War of PLA Academy of Military Science
Original Assignee
Research Institute of War of PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute of War of PLA Academy of Military Science filed Critical Research Institute of War of PLA Academy of Military Science
Priority to CN202010296630.3A priority Critical patent/CN111831322B/en
Publication of CN111831322A publication Critical patent/CN111831322A/en
Application granted granted Critical
Publication of CN111831322B publication Critical patent/CN111831322B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/70Software maintenance or management
    • G06F8/71Version control; Configuration management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Machine Translation (AREA)
  • Stored Programmes (AREA)

Abstract

The invention addresses the need for machine learning applications to serve multi-level user requirements, since zero-basis, some-basis, and professional-level users all use the same application. The method comprises the following steps: (1) in the machine learning stage, for a number of typical scenes, each scene's data is adapted to the machine learning model and the optimal parameter configuration for each scene is determined; (2) in the parameter sorting stage, the groups of optimal parameter configurations are ordered according to a frequency-first policy to form an ordered optimal parameter set; (3) in the application running stage, for a given application scene, three parameter configuration methods (a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode) are offered to the user to support running the application, and the optimal parameters obtained through the detailed parameter configuration mode are fed back into the parameter sorting stage for iteration.

Description

Multi-level user-oriented machine learning parameter configuration method
Technical Field
The invention relates to configuring computer program parameters using machine learning technology, and in particular to a machine learning parameter configuration method that serves zero-basis, some-basis, and professional-level users in combination.
Background
Artificial intelligence technology is moving into practical use, and technologies such as image recognition, speech recognition, and natural language processing are indispensable in many human-computer interaction settings. The core algorithm behind these technologies is machine learning. However, at the current state of the art, whether in supervised learning, unsupervised learning, reinforcement learning, or the currently most popular deep learning, parameter configuration remains unavoidable. The best learning and training effect can be achieved only when the parameter configuration is reasonable.
As artificial intelligence technology enters practical use, users are no longer satisfied with computer programs whose machine learning parameters are fixed in advance; they hope to participate in configuration and achieve better application results. However, most machine learning parameters are highly specialized, there are many of them, and a parameter's magnitude bears no direct relation to the quality of the result. In addition, the application scene strongly influences parameter configuration, and one set of parameters often suits only a limited range of scenes. Therefore, if the user of a machine learning program cannot adjust parameters independently and must rely on and wait for professionals to complete parameter adjustment, the program adapts to diverse scenes inefficiently.
Several methods exist to lower the threshold of user parameter adjustment. One is the cross-validation method commonly used in machine learning, in which the model training data set is randomly divided into multiple subsets, different parameters are set when training on different subsets, and the best parameter combination is selected by comparing results. In addition, automatic machine learning has developed rapidly in the deep learning field in recent years: given a set of model and parameter search strategies, the optimal model and parameters are selected by comparing the training results of different models and parameters on a data set. However, these are essentially parameter configuration methods based on an exhaustive search strategy, which are time-consuming and computationally expensive, and it is difficult for them to obtain optimal results.
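The cross-validation idea described above can be sketched in a few lines. This is a minimal, generic illustration rather than the patent's own procedure; the data set, parameter grid, and scoring callback (`score_fn`) are hypothetical placeholders supplied by the caller:

```python
import random

def k_fold_splits(data, k):
    """Randomly partition the data into k disjoint folds."""
    shuffled = data[:]
    random.Random(0).shuffle(shuffled)  # fixed seed: reproducible folds
    return [shuffled[i::k] for i in range(k)]

def cross_validate(data, param_grid, k, score_fn):
    """Return the parameter with the best mean validation score.

    score_fn(train, validation, param) -> float, higher is better.
    """
    folds = k_fold_splits(data, k)
    best_param, best_score = None, float("-inf")
    for param in param_grid:
        scores = []
        for i in range(k):
            validation = folds[i]
            # all remaining folds form the training set
            train = [x for j, f in enumerate(folds) if j != i for x in f]
            scores.append(score_fn(train, validation, param))
        mean = sum(scores) / k
        if mean > best_score:
            best_param, best_score = param, mean
    return best_param, best_score
```

As the patent observes, this kind of exhaustive sweep over `param_grid` grows expensive quickly, which is the shortcoming the staged method is designed to avoid.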
Disclosure of Invention
The invention aims to provide a machine learning program parameter configuration method for multi-level users that meets the requirements of different types of users by: (1) providing a default parameter mode to support machine learning application operation; (2) providing a single-parameter configuration mode to support machine learning application operation; (3) providing a detailed parameter configuration mode to support machine learning application operation.
To achieve the above object, the invention proposes a compound machine learning parameter configuration solution: (1) in the machine learning stage, for a number of typical scenes, each scene's data is adapted to the machine learning model and the optimal parameter configuration for each scene is determined; (2) in the parameter sorting stage, the groups of optimal parameter configurations are ordered according to a frequency-first policy to form an ordered optimal parameter set; (3) in the application running stage, for a given application scene, three parameter configuration methods (a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode) are offered to the user to support running the application, and the optimal parameters obtained through the detailed parameter configuration mode are fed back into the parameter sorting stage for iteration.
Further, the term "parameter" in the invention denotes a parameter whose quality the program cannot judge by itself; such parameters are defined as the parameters to be configured of the machine learning program, and they include not only parameters that cannot be optimized through machine learning iteration but also parameters that the application program cannot compute by a predetermined algorithm.
Further, in step (1), adapting each scene's data to the machine learning in the machine learning stage means that the machine learning module takes the scene data as input, the learning effect is observed while the parameters are adjusted, and the parameters at which the learning effect is best are taken as the optimal parameters, completing the adaptation process.
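This adaptation loop can be sketched as follows, assuming a caller-supplied `evaluate(scene_data, config)` function that scores the learning effect (higher is better). The function names and the scoring interface are illustrative assumptions, not part of the patent:

```python
def adapt_scene(scene_data, candidate_configs, evaluate):
    """Adapt one scene: try each candidate parameter configuration
    and keep the one with the best observed learning effect."""
    best_cfg, best_score = None, float("-inf")
    for cfg in candidate_configs:
        score = evaluate(scene_data, cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg

def machine_learning_stage(scenes, candidate_configs, evaluate):
    """Step (1): determine the optimal parameter configuration per scene."""
    return [adapt_scene(data, candidate_configs, evaluate) for data in scenes]
```

The result is one optimal configuration per typical scene, which is exactly what the parameter sorting stage consumes next.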
Further, in step (2), sorting according to the frequency-first policy in the parameter sorting stage means that the parameter schemes are first ordered by frequency; if two schemes have the same frequency, another criterion is applied to decide the order of the equally frequent schemes.
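The frequency-first ranking can be sketched as below. The patent leaves the tie-breaking criterion open ("another criterion"); here parameter magnitude and then first occurrence are assumed, and the most frequent scheme is ranked first so that the default mode can select it:

```python
from collections import Counter

def rank_parameter_sets(optimal_configs):
    """Step (2): order the per-scene optimal parameter schemes
    frequency-first into an ordered optimal parameter set."""
    freq = Counter(optimal_configs)
    first_seen = {}
    for i, cfg in enumerate(optimal_configs):
        first_seen.setdefault(cfg, i)
    unique = list(dict.fromkeys(optimal_configs))  # dedupe, keep order
    # Most frequent first; ties broken by magnitude, then first occurrence.
    unique.sort(key=lambda cfg: (-freq[cfg], cfg, first_seen[cfg]))
    return unique
```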
Further, the default parameter mode provided in step (3) in the application running stage uses the highest-ranked parameter scheme in the ordered optimal parameter set obtained in the parameter sorting stage as the configuration parameters of the machine learning application program.
Further, the single-parameter configuration mode provided in step (3) in the application running stage builds a mapping between the ordered optimal parameter set obtained in the parameter sorting stage and a single monotonic value; the user selects the corresponding parameter scheme by adjusting that monotonic value.
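A sketch of the monotonic-value mapping: a single slider-like value s in [0, 1] indexes into the ordered set, so a user who understands only "one knob" can still pick a scheme. The bucketing rule below is an assumption; the patent only requires that the monotonic value and the ordered set correspond:

```python
def select_by_monotonic_value(ordered_params, s):
    """Single-parameter mode: map a user value s in [0, 1]
    onto a scheme in the ordered optimal parameter set."""
    if not 0.0 <= s <= 1.0:
        raise ValueError("s must lie in [0, 1]")
    # Divide [0, 1] into equal buckets, one per scheme; s = 1.0 maps
    # to the last scheme.
    idx = min(int(s * len(ordered_params)), len(ordered_params) - 1)
    return ordered_params[idx]
```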
Further, in the detailed parameter configuration mode provided in step (3) in the application running stage, every configurable parameter is visible to the user, and the user adjusts the parameters until a satisfactory running result is obtained.
Further, in step (3), feeding the optimal parameters obtained through the detailed parameter configuration mode back into the parameter sorting stage for iteration means that the user configures the parameters independently; when configuration is complete and the user finds the result satisfactory, or the configuration process meets a rule set by the machine learning application, the resulting parameter scheme is incorporated into the optimal parameter set, which is then re-sorted in the parameter sorting stage by parameter frequency and the other rules to form a new ordered optimal parameter set.
Applying the machine learning program to multi-level users, the invention has the following advantages:
1. A wider range of applicable scenes: through per-scene parameter configuration, the machine learning application can perform well across diverse scenes;
2. A wider range of users: users can, as needed, accept the default parameters, adjust a single parameter, or adjust all parameters;
3. The single-parameter adjustment scheme follows a definite ordering rule, so users can quickly understand and master it;
4. The parameter configurations obtained for different typical scenes during machine learning are exploited to the greatest extent;
5. After detailed parameter adjustment in the application running stage, the optimal parameters of a new typical scene are incorporated into the ordered parameter set, closing the loop of machine learning in practical application.
Drawings
Fig. 1 is a schematic illustration of the principle of operation.
Detailed Description
The implementation of the compound machine learning parameter configuration solution is described below by way of specific embodiments of multi-level user oriented K-Means machine learning parameter configuration.
1. In the machine learning stage, each scene's data is adapted to K-Means for a number of typical scenes. Because the cluster number and the maximum iteration number cannot be optimized by the automatic iteration inside the K-Means model itself, the two parameters are adjusted repeatedly while the learning effect on the scene data is observed, yielding the optimal parameter scheme {cluster number, maximum iteration number} for each scene.
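The sweep over {cluster number, maximum iteration number} can be sketched with a minimal 1-D K-Means. Using inertia (within-cluster squared distance) as the "learning effect" is an assumption made for illustration only; in the patent a human observes the effect, and in practice raw inertia alone would always favor more clusters:

```python
import random

def kmeans_1d(points, k, max_iter, seed=0):
    """Minimal 1-D K-Means; returns (centroids, inertia)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # random distinct initial centroids
    for _ in range(max_iter):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Empty clusters keep their old centroid.
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:  # converged before max_iter
            break
        centroids = new
    inertia = sum(min(abs(p - c) ** 2 for c in centroids) for p in points)
    return centroids, inertia

def best_kmeans_params(points, k_grid, iter_grid):
    """Sweep {cluster number, maximum iteration number} and keep the
    configuration with the lowest inertia (illustrative criterion)."""
    best, best_inertia = None, float("inf")
    for k in k_grid:
        for it in iter_grid:
            _, inertia = kmeans_1d(points, k, it)
            if inertia < best_inertia:
                best, best_inertia = (k, it), inertia
    return best
```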
2. In the parameter sorting stage, the set of optimal parameter schemes {cluster number, maximum iteration number} is sorted primarily by frequency; if frequencies are equal, the order is determined by the magnitude of the parameter values, and if those are also equal, by order of first occurrence. An ordered optimal parameter set is thereby obtained.
3. In the application running stage, aiming at an application scene, three parameter configuration methods of a default parameter mode, a single parameter configuration mode and a detailed parameter configuration mode are provided for a user to support the running of an application program.
4. If the user selects the default parameter mode, the highest-ranked scheme in the optimal parameter set, {cluster number_best, maximum iteration number_best}, is adopted to run the machine learning application.
5. If the user selects the single-parameter configuration mode, a design variable s ∈ [0, 1] is put in one-to-one correspondence, from small to large, with the ordered optimal parameter set. The user chooses the value of s to obtain the corresponding {cluster number, maximum iteration number}, and the machine learning application runs with that scheme.
6. If the user selects to use the detailed parameter configuration mode, the user is allowed to set the values of the clustering number and the maximum iteration number by himself, and the machine learning application program operates according to the selected parameter scheme.
7. In the detailed parameter configuration mode, there are two ways to decide when user-configured parameters count as optimal. One is for the application to interact with the user: if the user judges the {cluster number, maximum iteration number} of some scheme most appropriate, that set of parameters is defined as the optimal parameters. Alternatively, the application stops whenever the user finds the result acceptable and otherwise keeps running, so that under this assumption the last set of parameters used by the application is the optimal one. The optimal parameters are then fed back into the parameter sorting stage, and iteration enlarges the selection space of the optimal parameter set.
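The closed loop of step 7 can be sketched as follows: a parameter scheme confirmed in detailed mode is appended to the optimal parameter set, which is then re-ranked. The frequency-first re-ranking mirrors stage 2; the tie-breaking (stable first-occurrence order) is an assumption:

```python
from collections import Counter

def incorporate_user_params(optimal_set, confirmed_params):
    """Feed a user-confirmed scheme back into the optimal parameter
    set and re-rank it frequency-first (the method's closed loop)."""
    updated = optimal_set + [confirmed_params]
    freq = Counter(updated)
    unique = list(dict.fromkeys(updated))    # dedupe, keep first-seen order
    unique.sort(key=lambda cfg: -freq[cfg])  # stable sort keeps ties in order
    return updated, unique
```

Each pass through the loop both records the new evidence (`updated`) and refreshes the ranking that the default and single-parameter modes rely on.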
Finally, it should be noted that: the above embodiments are only for illustrating the technical aspects of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the above embodiments, it should be understood by those of ordinary skill in the art that: modifications and equivalents may be made to the specific embodiments of the invention without departing from the spirit and scope of the invention, which is intended to be covered by the claims.

Claims (5)

1. A multi-level user-oriented machine learning parameter configuration method, being a compound machine learning parameter configuration solution, characterized by comprising the following steps:
in a machine learning stage, for a number of typical scenes, adapting each scene's data to the machine learning and determining the optimal parameter configuration for each scene;
in a parameter sorting stage, ordering the groups of optimal parameter configurations according to a frequency-first policy to form an ordered optimal parameter set;
in an application running stage, for a given application scene, offering the user three parameter configuration methods (a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode) to support running the application, and feeding the optimal parameters obtained through the detailed parameter configuration mode back into the parameter sorting stage for iteration;
wherein the machine learning parameters, namely parameters whose quality the program cannot judge by itself, are defined as the parameters to be configured of the machine learning program, and include parameters that cannot be optimized through machine learning iteration and parameters that the application program cannot obtain by a set algorithm;
the default parameter mode provided in the application running stage uses the highest-ranked parameter scheme in the optimal parameter set obtained in the parameter sorting stage as the configuration parameters of the machine learning application program;
the single-parameter configuration mode provided in the application running stage builds a mapping between the ordered optimal parameter set obtained in the parameter sorting stage and a single monotonic value, and the user selects the corresponding parameter scheme by adjusting that monotonic value;
in the detailed parameter configuration mode provided in the application running stage, every configurable parameter is visible to the user, and the user adjusts the parameters until a satisfactory running result is obtained.
2. The machine learning parameter configuration method of claim 1, wherein adapting each scene's data to the machine learning in the machine learning stage means that the machine learning module takes the scene data as input, the learning effect is observed while the parameters are adjusted, and the parameters at which the learning effect is best are taken as the optimal parameters, completing the adaptation process.
3. The machine learning parameter configuration method of claim 1, wherein in the parameter sorting stage the optimal parameter set obtained in the machine learning stage is ordered according to the frequency-first policy.
4. The machine learning parameter configuration method of claim 3, wherein ordering according to the frequency-first policy means first sorting the parameter schemes by frequency and, if parameter frequencies are the same, applying another criterion to decide the order of the equally frequent parameter schemes.
5. The machine learning parameter configuration method of claim 1, wherein feeding the optimal parameters obtained through the detailed parameter configuration mode back into the parameter sorting stage for iteration in the application running stage means that the user configures the parameters independently; when configuration is complete and the user finds the result satisfactory, or the configuration process meets the set rules of the machine learning application, the resulting parameter scheme is incorporated into the optimal parameter set, which is re-sorted in the parameter sorting stage by parameter frequency and other rules to form a new ordered optimal parameter set.
CN202010296630.3A 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method Active CN111831322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010296630.3A CN111831322B (en) 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010296630.3A CN111831322B (en) 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method

Publications (2)

Publication Number Publication Date
CN111831322A CN111831322A (en) 2020-10-27
CN111831322B true CN111831322B (en) 2023-08-01

Family

ID=72914046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010296630.3A Active CN111831322B (en) 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method

Country Status (1)

Country Link
CN (1) CN111831322B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201207851D0 (en) * 2012-05-04 2012-06-20 Ibm Instrumentation of software applications for configuration thereof
CN107844837A (en) * 2017-10-31 2018-03-27 第四范式(北京)技术有限公司 The method and system of algorithm parameter tuning are carried out for machine learning algorithm
CN108021986A (en) * 2017-10-27 2018-05-11 平安科技(深圳)有限公司 Electronic device, multi-model sample training method and computer-readable recording medium
CN109961142A (en) * 2019-03-07 2019-07-02 腾讯科技(深圳)有限公司 A kind of Neural network optimization and device based on meta learning
CN110119271A (en) * 2018-12-19 2019-08-13 厦门渊亭信息科技有限公司 A kind of model across machine learning platform defines agreement and adaption system
CN110502213A (en) * 2019-05-24 2019-11-26 网思科技股份有限公司 A kind of artificial intelligence capability development platform
CN110895718A (en) * 2018-09-07 2020-03-20 第四范式(北京)技术有限公司 Method and system for training machine learning model
CN110930256A (en) * 2019-09-30 2020-03-27 北京九章云极科技有限公司 Quantitative analysis method and quantitative analysis system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016061283A1 (en) * 2014-10-14 2016-04-21 Skytree, Inc. Configurable machine learning method selection and parameter optimization system and method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201207851D0 (en) * 2012-05-04 2012-06-20 Ibm Instrumentation of software applications for configuration thereof
CN108021986A (en) * 2017-10-27 2018-05-11 平安科技(深圳)有限公司 Electronic device, multi-model sample training method and computer-readable recording medium
CN107844837A (en) * 2017-10-31 2018-03-27 第四范式(北京)技术有限公司 The method and system of algorithm parameter tuning are carried out for machine learning algorithm
CN110895718A (en) * 2018-09-07 2020-03-20 第四范式(北京)技术有限公司 Method and system for training machine learning model
CN110119271A (en) * 2018-12-19 2019-08-13 厦门渊亭信息科技有限公司 A kind of model across machine learning platform defines agreement and adaption system
CN109961142A (en) * 2019-03-07 2019-07-02 腾讯科技(深圳)有限公司 A kind of Neural network optimization and device based on meta learning
CN110502213A (en) * 2019-05-24 2019-11-26 网思科技股份有限公司 A kind of artificial intelligence capability development platform
CN110930256A (en) * 2019-09-30 2020-03-27 北京九章云极科技有限公司 Quantitative analysis method and quantitative analysis system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Angel+: a distributed machine learning platform based on Angel; Zhang Zhipeng; Frontiers of Data and Computing; 63-72 *
Parameters optimization of deep learning models using Particle swarm optimization; Basheer Qolomany et al.; 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC); 1285-1290 *
Research and application of machine-learning-based traffic flow prediction technology; Yao Zhiyuan; China Master's Theses Full-text Database, Engineering Science and Technology II; C034-821 *
Empirical analysis of random sequences based on machine learning algorithms; Xu Hongsen et al.; Computer Knowledge and Technology (No. 26); 166-168 *
Research progress and prospects of learning to rank; Li Jinzhong et al.; Acta Automatica Sinica (No. 08); 1345-1369 *

Also Published As

Publication number Publication date
CN111831322A (en) 2020-10-27

Similar Documents

Publication Publication Date Title
CN103927580B (en) Project constraint parameter optimizing method based on improved artificial bee colony algorithm
CN113269322A (en) Deep reinforcement learning improvement method based on self-adaptive hyper-parameters
CN104751842B (en) The optimization method and system of deep neural network
CN106709565A (en) Neural network optimization method and device
CN111582311B (en) Method for training intelligent agent by using dynamic reward example sample based on reinforcement learning
CN110378462A (en) Solve the improvement EDA algorithm with time permutation flowshop scheduling problem
CN113591298B (en) Optical structure optimization design method based on deep neural network
CN112131206B (en) Multi-model database OrientDB parameter configuration automatic tuning method
CN110555506A (en) gradient self-adaptive particle swarm optimization method based on group aggregation effect
CN109634107A (en) A kind of engine dynamic control law optimization method
CN111860789A (en) Model training method, terminal and storage medium
CN111291854A (en) Artificial bee colony algorithm optimization method based on multiple improved strategies
CN111831322B (en) Multi-level user-oriented machine learning parameter configuration method
CN117290721A (en) Digital twin modeling method, device, equipment and medium
CN114742693B (en) Dressing migration method based on self-adaptive instance normalization
CN102799749B (en) Automatic generating method and generating system for distributed music lamplight performance scheme
CN106909967A (en) A kind of simple efficient improvement artificial bee colony optimization method
CN111456958B (en) Fan rotating speed control method and computer readable storage medium
CN110070184A (en) Merge the data sampling method of sample losses and optimal speed constraint
CN114611373A (en) PSO improved algorithm based on variant sigmoid function and particle variation
CN114332101A (en) Genetic algorithm-based U-Net network model pruning method
CN112819161A (en) Variable-length gene genetic algorithm-based neural network construction system, method and storage medium
CN113553778A (en) Method for optimizing parameters of model, electronic device and computer-readable storage medium
CN113112031A (en) Unmanned aerial vehicle task allocation method based on simulated annealing algorithm
CN115995228A (en) Acoustic model topological structure selection method based on particle swarm optimization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant