CN111831322A - Machine learning parameter configuration method for multi-level user - Google Patents

Machine learning parameter configuration method for multi-level user

Info

Publication number
CN111831322A
Authority
CN
China
Prior art keywords
parameter
machine learning
optimal
stage
parameter configuration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010296630.3A
Other languages
Chinese (zh)
Other versions
CN111831322B (en)
Inventor
程钢
刘必欣
初宁
王强
张旭锋
李伯昌
薛源
刁文辉
于泓峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute of War of PLA Academy of Military Science
Original Assignee
Research Institute of War of PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute of War of PLA Academy of Military Science filed Critical Research Institute of War of PLA Academy of Military Science
Priority to CN202010296630.3A priority Critical patent/CN111831322B/en
Publication of CN111831322A publication Critical patent/CN111831322A/en
Application granted granted Critical
Publication of CN111831322B publication Critical patent/CN111831322B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/70Software maintenance or management
    • G06F8/71Version control; Configuration management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Machine Translation (AREA)
  • Stored Programmes (AREA)

Abstract

The invention addresses the problem that a machine learning application must serve multiple levels of user requirements, because zero-experience, limited-experience, and professional-level users all use the same application. The method comprises the following steps: first, in a machine learning stage, for a number of typical scenes, the data of each scene is adapted to machine learning and the optimal parameter configuration for each scene is determined; second, in a parameter ordering stage, the optimal parameter configurations of all groups are sorted in ascending order with frequency as the primary criterion, forming an ordered optimal parameter set; third, in an application running stage, three parameter configuration methods, namely a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode, are provided for the user to support running the application in its application scene, and the optimal parameters obtained in the detailed parameter configuration mode are returned to the parameter ordering stage for iteration.

Description

Machine learning parameter configuration method for multi-level user
Technical Field
The invention relates to computer program parameter configuration using machine learning technology, and in particular to a composite machine learning parameter configuration method for zero-experience, limited-experience, and professional-level users.
Background
Artificial intelligence technology is moving into practical use, and technologies such as image recognition, speech recognition, and natural language processing are essential in many human-computer interaction settings. The core algorithm behind these technologies is machine learning. At the current state of the art, however, no approach, whether supervised learning, unsupervised learning, reinforcement learning, or the currently popular deep learning, can avoid parameter configuration. The best learning and training results are achieved only when the parameters are configured reasonably.
As artificial intelligence technology is put into practice, users are no longer satisfied with computer programs whose machine learning parameters are fixed in advance; they hope to participate in the design and obtain better application results. However, most machine learning parameters are highly specialized, the parameters are numerous, and there is no direct, monotonic relationship between parameter magnitude and outcome. Moreover, the application scene strongly influences parameter configuration, and one set of parameters typically suits only a limited range of scenes. Therefore, in a machine learning computer program, if users cannot adjust parameters themselves and must rely on and wait for professionals to complete parameter tuning, adapting such a program to diverse scenes is inefficient.
Several methods exist to lower the threshold for user parameter tuning. One is cross validation, commonly used in machine learning: the model training data set is randomly divided into several subsets, different parameters are applied when training on different subsets, and the best parameter combination is selected by comparing results. In addition, automated machine learning techniques have developed rapidly in the deep learning field in recent years: starting from a set of candidate models and a parameter search strategy, the best model and parameters are chosen by comparing the training results of different models and parameters on a group of data sets. These methods, however, are essentially parameter configuration methods based on exhaustive search strategies; they consume substantial time and computing resources, and it remains difficult to guarantee an optimal result.
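For concreteness, the exhaustive cross validation baseline described above can be sketched as follows. This is an illustrative sketch only, not code from the patent: the `score` callback and the parameter grid are hypothetical stand-ins for a real training-and-evaluation routine, and each fold is scored directly rather than through a full train/validate split.

```python
import itertools
import random

def cross_validate_params(data, param_grid, score, k_folds=3):
    """Exhaustively score every parameter combination on k data folds
    and return the combination with the best mean score."""
    data = list(data)
    random.shuffle(data)
    folds = [data[i::k_folds] for i in range(k_folds)]
    best_params, best_score = None, float("-inf")
    # Every combination is tried: cost grows multiplicatively per parameter.
    for values in itertools.product(*param_grid.values()):
        candidate = dict(zip(param_grid.keys(), values))
        mean = sum(score(fold, candidate) for fold in folds) / k_folds
        if mean > best_score:
            best_params, best_score = candidate, mean
    return best_params, best_score
```

With a two-parameter grid the routine already evaluates every combination across all folds; each added parameter multiplies the cost, which is exactly the time-and-resource problem the patent points out for exhaustive search strategies.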
Disclosure of Invention
The invention aims to provide a machine learning program parameter configuration method for multi-level users that meets the requirements of different types of users: a default parameter mode to support running the machine learning application; a single-parameter configuration mode to support running the machine learning application; and a detailed parameter configuration mode to support running the machine learning application.
To achieve the above object, the invention provides a composite machine learning parameter configuration solution. First, in a machine learning stage, for a number of typical scenes, the data of each scene is adapted to machine learning and the optimal parameter configuration for each scene is determined. Second, in a parameter ordering stage, the optimal parameter configurations of all groups are sorted in ascending order with frequency as the primary criterion, forming an ordered optimal parameter set. Third, in an application running stage, three parameter configuration methods, namely a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode, are provided for the user to support running the application in its application scene, and the optimal parameters obtained in the detailed parameter configuration mode are returned to the parameter ordering stage for iteration.
Further, the term "parameter" in the present invention refers to a parameter whose quality the program cannot judge on its own; such a parameter is defined as a parameter to be configured of the machine learning program. This includes both parameters that cannot be iteratively optimized through machine learning and parameters that the application cannot compute with a predetermined algorithm.
Furthermore, in step one, adapting scene data to machine learning means that the machine learning module, given the scene data as input, observes the learning effect while the parameters are adjusted; when the learning effect is optimal, the optimal parameters are obtained and the adaptation process is complete.
In step two, the parameter ordering stage sorts in ascending order with frequency as the primary criterion: the parameter schemes are first ordered by frequency, and schemes with equal frequency are ordered by a secondary criterion that fixes the sequence among schemes of the same frequency.
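The frequency-primary ordering of step two can be sketched as follows. This is an interpretive sketch, not code from the patent: the patent's secondary tie-breaking criterion is a formula not reproduced in this text, so first-appearance order stands in for it here, and the most frequent scheme is placed first so that the default mode of step three can simply take the top entry.

```python
from collections import Counter

def order_schemes(schemes):
    """Rank parameter schemes primarily by how often each occurs.
    Ties keep their order of first appearance (a stand-in for the
    secondary numerical criterion in the patent)."""
    freq = Counter(schemes)
    first_seen = {}
    for i, s in enumerate(schemes):
        first_seen.setdefault(s, i)
    unique = list(first_seen)  # preserves first-appearance order
    # Most frequent scheme first, so index 0 is the default-mode scheme.
    return sorted(unique, key=lambda s: (-freq[s], first_seen[s]))
```

For example, if the scheme {cluster number = 3, maximum iterations = 100} was found optimal for three scenes and {5, 200} for two, the ordered set begins with (3, 100), then (5, 200), then any schemes seen once.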
In step three, the default parameter mode provided in the application running stage uses the highest-ranked parameter scheme of the ordered optimal parameter set obtained in the parameter ordering stage as the configuration parameters of the machine learning application.
In step three, the single-parameter configuration mode provided in the application running stage builds a mapping between the ordered optimal parameter set obtained in the parameter ordering stage and a single monotonic value; the user selects the corresponding parameter scheme by adjusting that value.
In step three, the detailed parameter configuration mode provided in the application running stage exposes every configurable parameter to the user, who selects a satisfactory running result by adjusting the parameters.
In step three, returning the optimal parameters obtained in the detailed parameter configuration mode to the parameter ordering stage for iteration means: the user configures parameters independently; when configuration is complete and the user is satisfied with the result, or the configuration process satisfies rules set by the machine learning application, the resulting parameter scheme is incorporated into the optimal parameter set, which is then re-ranked by parameter frequency and the other rules of the parameter ordering stage to form a new ordered optimal parameter set.
The machine learning parameter configuration method for multi-level users provided by the invention has the following advantages:
1. a wider range of applicable scenes: the machine learning application can deliver its full effectiveness through parameter configurations tailored to many scenes;
2. a wider range of users: each user can choose to use default parameters, adjust a single parameter, or adjust all parameters as needed;
3. the single-parameter adjustment scheme follows an explicit ordering rule, so users can quickly understand and master it;
4. parameter configurations obtained for different typical scenes during machine learning are exploited to the greatest extent;
5. after detailed parameter adjustment in the application running stage, the optimal parameters of a new typical scene are incorporated into the ordered parameter set, closing the machine learning loop in practical application.
Drawings
Fig. 1 is a schematic diagram of the working principle.
Detailed Description
The following describes an implementation of the composite machine learning parameter configuration solution, taking K-Means parameter configuration for multi-level users as a concrete example.
1. In the machine learning stage, the data of each of several typical scenes is adapted to K-Means. Because the cluster number and the maximum iteration number cannot be iterated automatically inside the K-Means model to reach an optimal solution, these two parameters are adjusted repeatedly while the learning effect on the scene data is observed, yielding the optimal parameter scheme {cluster number, maximum iteration number} for each scene.
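A minimal sketch of this per-scene adaptation follows. It is illustrative only: the `evaluate` callback is a hypothetical stand-in for "observing the learning effect" (for instance a clustering quality score computed after a K-Means run), and the parameter ranges are assumptions, not values from the patent.

```python
import itertools

def best_scheme_for_scene(scene_data, evaluate,
                          cluster_range=range(2, 11),
                          iter_options=(50, 100, 300)):
    """Sweep {cluster number, maximum iteration number} for one scene
    and keep the pair with the best observed learning effect."""
    best, best_score = None, float("-inf")
    for k, max_iter in itertools.product(cluster_range, iter_options):
        score = evaluate(scene_data, k, max_iter)
        if score > best_score:
            best, best_score = (k, max_iter), score
    return best
```

Running this once per typical scene produces the collection of per-scene optimal schemes that the parameter ordering stage then ranks.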
2. In the parameter ordering stage, the optimal parameter schemes {cluster number, maximum iteration number} are ranked primarily by frequency. Schemes with equal frequency are ordered by the magnitude of a numerical criterion (given by a formula in the original filing; the formula image is not reproduced here), and schemes that are still tied are ordered by first appearance. The result is an ordered optimal parameter set.
3. In the application running stage, three parameter configuration methods, namely a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode, are provided for the user to support running the application in its application scene.
4. If the user chooses the default parameter mode, the highest-ranked parameter scheme in the ordered optimal parameter set, {cluster number, maximum iteration number}, is used to run the machine learning application.
5. If the user chooses the single-parameter configuration mode, a design variable s ∈ [0, 1] is placed in one-to-one correspondence, from small to large, with the ordered optimal parameter set. The user selects the value of s, obtains the corresponding {cluster number, maximum iteration number}, and the machine learning application runs with that scheme.
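The single-parameter mode reduces to mapping s ∈ [0, 1] onto the ranks of the ordered set; a minimal sketch, assuming equal-width intervals of s per scheme:

```python
def scheme_for_s(ordered_schemes, s):
    """Map a single monotonic value s in [0, 1] onto the ordered
    optimal parameter set: s = 0 selects the first (highest-ranked)
    scheme, s = 1 the last, intermediate values select by rank."""
    if not 0.0 <= s <= 1.0:
        raise ValueError("s must lie in [0, 1]")
    idx = min(int(s * len(ordered_schemes)), len(ordered_schemes) - 1)
    return ordered_schemes[idx]
```

A user who knows nothing about cluster numbers or iteration limits can still slide a single value and always land on some scheme that was optimal for a typical scene, which is the point of this mode.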
6. If the user chooses the detailed parameter configuration mode, the user sets the values of the cluster number and the maximum iteration number directly, and the machine learning application runs with the selected scheme.
7. In the detailed parameter configuration mode, there are two ways to decide which user-configured parameters count as optimal. One is to have the application interact with the user: if the user indicates that the {cluster number, maximum iteration number} of a certain scheme is the most suitable, that scheme is defined as the optimal parameters. Alternatively, the application is simply stopped once the user finds the result acceptable, and otherwise keeps running, so that the last parameter scheme used before stopping is taken as the optimal parameters under this assumption. The optimal parameters are then returned to the parameter ordering stage, enlarging the selection space of the optimal parameter set through iteration.
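The closed loop of step 7 can be sketched as follows: a user-confirmed scheme is appended to the pool of optimal schemes and the ranking is recomputed by frequency. This is an interpretive sketch; first-appearance order again stands in for the patent's secondary tie-breaking formula, which is not reproduced in this text.

```python
from collections import Counter

def incorporate_user_scheme(history, new_scheme):
    """Append a user-confirmed scheme to the pool of optimal schemes
    and re-rank by frequency, closing the feedback loop."""
    history.append(new_scheme)
    freq = Counter(history)
    first_seen = {}
    for i, s in enumerate(history):
        first_seen.setdefault(s, i)
    # Most frequent first; ties resolved by first appearance.
    return sorted(first_seen, key=lambda s: (-freq[s], first_seen[s]))
```

Each confirmed configuration therefore strengthens or reorders the set that the default and single-parameter modes draw from, which is the "closed loop of machine learning in practical application" claimed among the advantages.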
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to these embodiments, those of ordinary skill in the art should understand that modifications and equivalents may be made without departing from the spirit and scope of the invention, and such modifications and equivalents are covered by the claims.

Claims (9)

1. A machine learning parameter configuration method for multi-level users, being a composite machine learning parameter configuration solution, characterized by comprising the following steps:
in a machine learning stage, for a number of typical scenes, adapting the data of each scene to machine learning and determining the optimal parameter configuration for each scene;
in a parameter ordering stage, sorting the optimal parameter configurations of all groups in ascending order with frequency as the primary criterion, to form an ordered optimal parameter set;
in an application running stage, providing the user with three parameter configuration methods, namely a default parameter mode, a single-parameter configuration mode, and a detailed parameter configuration mode, to support running the application in its application scene, and returning the optimal parameters obtained in the detailed parameter configuration mode to the parameter ordering stage for iteration.
2. The method according to claim 1, wherein a parameter whose quality the program cannot judge automatically is defined as a parameter to be configured of the machine learning program, including parameters that cannot be iteratively optimized through machine learning and parameters that the application cannot compute with a predetermined algorithm.
3. The method according to claim 1, wherein adapting scene data to machine learning during the machine learning stage means that the machine learning module, given the scene data as input, observes the learning effect while the parameters are adjusted, obtains the optimal parameters when the learning effect is optimal, and thereby completes the adaptation.
4. The method according to claim 1, wherein the parameter ordering stage sorts the optimal parameter sets obtained in the machine learning stage in ascending order with frequency as the primary criterion.
5. The method according to claim 4, wherein the ascending ordering with frequency as the primary criterion first sorts the parameter sets by frequency and, if parameter frequencies are equal, applies a further criterion to determine the sequence of schemes with equal frequency.
6. The method according to claim 1, wherein the default parameter mode provided in the application running stage uses the highest-ranked parameter scheme of the ordered optimal parameter set obtained in the parameter ordering stage as the configuration parameters of the machine learning application.
7. The method according to claim 1, wherein the single-parameter configuration mode provided in the application running stage builds a mapping between the ordered optimal parameter set obtained in the parameter ordering stage and a monotonic value, and the user selects the corresponding parameter scheme by adjusting the monotonic value.
8. The method according to claim 1, wherein the detailed parameter configuration mode provided in the application running stage exposes every configurable parameter to the user, who selects a satisfactory running result by adjusting the parameters.
9. The method according to claim 1, wherein returning the optimal parameters obtained in the detailed parameter configuration mode to the parameter ordering stage for iteration means: the user configures parameters independently; when configuration is complete and the user is satisfied with the result, or the configuration process satisfies rules set by the machine learning application, the resulting parameter scheme is incorporated into the optimal parameter set and re-ranked according to parameter frequency and the other rules of the parameter ordering stage to form a new ordered optimal parameter set.
CN202010296630.3A 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method Active CN111831322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010296630.3A CN111831322B (en) 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method


Publications (2)

Publication Number Publication Date
CN111831322A true CN111831322A (en) 2020-10-27
CN111831322B CN111831322B (en) 2023-08-01

Family

ID=72914046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010296630.3A Active CN111831322B (en) 2020-04-15 2020-04-15 Multi-level user-oriented machine learning parameter configuration method

Country Status (1)

Country Link
CN (1) CN111831322B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201207851D0 (en) * 2012-05-04 2012-06-20 Ibm Instrumentation of software applications for configuration thereof
US20160110657A1 (en) * 2014-10-14 2016-04-21 Skytree, Inc. Configurable Machine Learning Method Selection and Parameter Optimization System and Method
CN107844837A (en) * 2017-10-31 2018-03-27 第四范式(北京)技术有限公司 The method and system of algorithm parameter tuning are carried out for machine learning algorithm
CN108021986A (en) * 2017-10-27 2018-05-11 平安科技(深圳)有限公司 Electronic device, multi-model sample training method and computer-readable recording medium
CN109961142A (en) * 2019-03-07 2019-07-02 腾讯科技(深圳)有限公司 A kind of Neural network optimization and device based on meta learning
CN110119271A (en) * 2018-12-19 2019-08-13 厦门渊亭信息科技有限公司 A kind of model across machine learning platform defines agreement and adaption system
CN110502213A (en) * 2019-05-24 2019-11-26 网思科技股份有限公司 A kind of artificial intelligence capability development platform
CN110895718A (en) * 2018-09-07 2020-03-20 第四范式(北京)技术有限公司 Method and system for training machine learning model
CN110930256A (en) * 2019-09-30 2020-03-27 北京九章云极科技有限公司 Quantitative analysis method and quantitative analysis system


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Basheer Qolomany et al.: "Parameters optimization of deep learning models using Particle swarm optimization", 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC), pages 1285-1290 *
姚致远 (Yao Zhiyuan): "Research and Application of Machine Learning Based Traffic Flow Prediction Technology", China Excellent Master's Theses Full-text Database, Engineering Science and Technology II, pages 034-821 *
张智鹏 (Zhang Zhipeng): "Angel+: A Distributed Machine Learning Platform Based on Angel", Frontiers of Data and Computing, pages 63-72 *
李金忠 (Li Jinzhong) et al.: "Advances and Prospects in Learning-to-Rank Research", Acta Automatica Sinica, no. 08, pages 1345-1369 *
许鸿森 (Xu Hongsen) et al.: "Empirical Analysis of Random Sequences Based on Machine Learning Algorithms", Computer Knowledge and Technology, no. 26, pages 166-168 *

Also Published As

Publication number Publication date
CN111831322B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN106709565A (en) Neural network optimization method and device
JP2023523029A (en) Image recognition model generation method, apparatus, computer equipment and storage medium
CN110909787A (en) Method and system for multi-objective batch scheduling optimization based on clustering evolutionary algorithm
CN112000772A (en) Sentence-to-semantic matching method based on semantic feature cube and oriented to intelligent question and answer
CN113011337B (en) Chinese character library generation method and system based on deep meta learning
CN108304920A (en) A method of multiple dimensioned learning network is optimized based on MobileNets
CN109947940A (en) File classification method, device, terminal and storage medium
CN112000770A (en) Intelligent question and answer oriented sentence-to-sentence matching method based on semantic feature map
CN113657421A (en) Convolutional neural network compression method and device and image classification method and device
CN117454824B (en) Chip circuit design method based on double-layer multi-objective optimization
Mac Parthaláin et al. Fuzzy-rough set bireducts for data reduction
CN111651220A (en) Spark parameter automatic optimization method and system based on deep reinforcement learning
WO2023155783A1 (en) Model adjustment method and apparatus, service processing method and apparatus, and device and storage medium
CN110956277A (en) Interactive iterative modeling system and method
CN110727969A (en) Method, device and equipment for automatically adjusting workflow and storage medium
CN109634107A (en) A kind of engine dynamic control law optimization method
CN117290721A (en) Digital twin modeling method, device, equipment and medium
CN116881641A (en) Pre-training model adjustment method and device, storage medium and computing equipment
CN113885845B (en) Calculation map generation method, system, equipment and medium of deep learning compiler
CN114239237A (en) Power distribution network simulation scene generation system and method supporting digital twinning
CN111831322A (en) Machine learning parameter configuration method for multi-level user
JP2002230012A (en) Document clustering device
CN113762514B (en) Data processing method, device, equipment and computer readable storage medium
US20070094163A1 (en) Genetic algorithm-based tuning engine
CN113313250A (en) Neural network training method and system adopting mixed precision quantification and knowledge distillation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant