CN115469851A - Automatic parameter tuning method for a compiler - Google Patents

Automatic parameter tuning method for a compiler

Info

Publication number
CN115469851A
CN115469851A
Authority
CN
China
Prior art keywords
optimization
candidate
compiler
model
optimization option
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211287488.1A
Other languages
Chinese (zh)
Inventor
林锦坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zede Software Beijing Co ltd
Original Assignee
Zede Software Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zede Software Beijing Co ltd filed Critical Zede Software Beijing Co ltd
Priority to CN202211287488.1A priority Critical patent/CN115469851A/en
Publication of CN115469851A publication Critical patent/CN115469851A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/37 Compiler construction; Parser generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning

Abstract

The invention discloses an automatic parameter tuning method for a compiler, comprising the following steps: 1) selecting a target program to be compiled, setting a tuning time budget t, and generating an initial set of optimization option groups Φ_init from the compiler's optimization option information; 2) the compiler compiles and runs the target program with each optimization option group in Φ_init to obtain a training set S; 3) building a surrogate model M from the training set S; 4) using the model M, generating a set of candidate optimization option groups Φ_candidate from the compiler's optimization option information and predicting the performance of each candidate group; 5) selecting one candidate group as the compiler configuration, compiling and running the target program to obtain the corresponding performance, and adding it to the training set S, then training the model M with the updated training set S; 6) repeating steps 4) to 5) until the tuning time t is reached, and then outputting the optimization option group corresponding to the best performance as the compiler configuration.

Description

Automatic parameter tuning method for a compiler
Technical Field
The invention belongs to the technical field of computer software, and particularly relates to an automatic compiler parameter tuning method based on a hybrid prediction model and optimization option importance prediction.
Background
Bayesian optimization is a black-box optimization algorithm for finding the extrema of a function whose expression is unknown. From the function values at a set of sampled points, the algorithm predicts the probability distribution of the function value at any point; this is realized by Gaussian process regression. Gini impurity: the probability that a randomly selected sample in a sample set would be misclassified.
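As a concrete illustration (not part of the patent text), the Gini impurity of a label set is one minus the sum of squared class probabilities, which matches the misclassification interpretation above:

```python
from collections import Counter

def gini_impurity(labels):
    """Probability that a randomly chosen sample would be misclassified
    if labeled according to the class distribution of the set."""
    n = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# A pure set has impurity 0; a 50/50 binary split has impurity 0.5.
print(gini_impurity([1, 1, 1, 1]))  # 0.0
print(gini_impurity([0, 1, 0, 1]))  # 0.5
```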
Patent application publication No. CN114861781A provides an automatic parameter tuning and optimization method comprising: determining, according to a tuning instruction, historical system parameters and the resource consumption corresponding to them as training samples; extracting features from the training samples; training a parameter prediction model with the extracted features; predicting the system parameters of the next time period with the trained model to obtain predicted system parameters; and updating the system parameters according to the predictions.
Most existing tools for tuning compiler optimization options are general-purpose automatic parameter tuners, which have the following limitations:
1. Existing model-based automatic tuning tools mostly use a single model, so their performance is unstable.
2. Existing automatic tuning tools perform poorly in the compiler optimization option scenario, where the number of options to configure can exceed one hundred and running the target program is expensive.
3. In the compiler optimization option scenario, most optimization options have only two states, enabled or disabled, and existing automatic tuning tools do not exploit this property well.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an automatic compiler parameter tuning method based on a hybrid prediction model and optimization option importance prediction. A compiler has a large number of selectable optimization options, and their settings can greatly affect the performance of the compiled target program. The invention can effectively accelerate tuning and find better optimization option configurations in less time.
The technical scheme of the invention is as follows:
a compiler automatic parameter adjusting method comprises the following steps:
1) Selecting a target program to be compiled, setting a tuning time budget t, and generating an initial set of optimization option groups Φ_init according to the optimization option information of the compiler. The initial set Φ_init = {Θ_1, …, Θ_b, …, Θ_B}, where Θ_b denotes the b-th optimization option group of the compiler. Each optimization option group Θ = {θ_1, …, θ_i, …, θ_d}, where d denotes the total number of configurable optimization options of the compiler and θ_i indicates whether the i-th optimization option is enabled: 1 if enabled, 0 otherwise;
2) The compiler compiles the target program with each optimization option group in the initial set Φ_init, and each compiled target program is run to obtain a training set S. The training set S = {(Θ_1, o_1), …, (Θ_b, o_b), …, (Θ_B, o_B)}, where (Θ_b, o_b) denotes that compiling the target program with option group Θ_b and running the resulting executable yields the performance index value o_b;
3) Constructing a surrogate model M from the training set S;
4) Using the surrogate model M, generating a set of candidate optimization option groups Φ_candidate according to the optimization option information of the compiler, and predicting the performance obtained when each candidate group in Φ_candidate is used as the compiler configuration;
5) According to the predicted performance of each candidate group, selecting a candidate optimization option group Θ_selected as the compiler configuration, compiling and running the target program to obtain the corresponding performance o_selected, and adding (Θ_selected, o_selected) to the training set S; then training the surrogate model M with the updated training set S;
6) Repeating steps 4) to 5) until the tuning time budget t is reached, and then outputting the optimization option group Θ_best corresponding to the best performance o_best, with which the compiler compiles the target program.
Further, a hybrid prediction model is trained with the training set S to obtain the surrogate model M; the surrogate model M comprises a plurality of prediction models.
Further, generating the candidate set Φ_candidate comprises: using the first prediction model in the surrogate model M to predict the influence of the compiler's d optimization options, taking the top k options by influence as key optimization options and the remaining d−k options as non-key optimization options; then enumerating all 2^k combinations of the k key options, randomly generating c_i combinations of the d−k non-key options, and combining them to produce the candidate set
Φ_candidate = {Θ_1, …, Θ_(c_i·2^k)}.
Further, the influence of an optimization option θ is
impact(θ) = (1/u) · Σ_{i=1}^{u} G(d_i)
wherein the random forest algorithm generates Q decision trees from the training set S; G(d_i) is the Gini coefficient of node d_i in a decision tree; d_i is the node split on option θ in the i-th decision tree; and u is the total number of decision trees that split on option θ.
Further, a Gaussian decay function is used to determine the number of non-key optimization option groups sampled in each iteration.
Further, in step 5), the tuner selects the candidate optimization option group with the largest ei value as Θ_selected; wherein the ei value of the c-th candidate group Θ_c is
ei_c = (o_c − o_incumbent − 0.1) · cdf(z_c) + std_c · pdf(z_c),  z_c = (o_c − o_incumbent − 0.1) / std_c
where o_c is the predicted performance of candidate group Θ_c, std_c is the standard deviation of the performance prediction for Θ_c among the candidates in Φ_candidate, and o_incumbent is the best performance obtained so far.
Further, the hybrid prediction model consists of a random forest model, an XGBoost model, a Gaussian process model, and a LightGBM model.
Further, each model in the hybrid prediction model is trained with the training set S, and the weighted average of the models' predictions is used as the prediction of the surrogate model M.
A server, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for carrying out the steps of the above method.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned method.
The invention has the following advantages:
1) Based on the Bayesian method, the invention provides a dedicated automatic parameter tuning algorithm for the compiler scenario, which tunes faster than existing general-purpose tuning algorithms.
2) The proposed hybrid prediction model effectively improves the stability of the surrogate model.
3) The method of generating the candidate optimization option group set based on optimization option importance prediction effectively accelerates the tuning process.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The invention will be described in further detail with reference to the drawings, which are given by way of example only to illustrate the invention and are not intended to limit its scope.
The overall flow of the scheme is shown in FIG. 1. The user inputs the target program source code, the compiler optimization options, and the tuning time budget; the tuning algorithm runs until the time budget is used up, and finally outputs the best optimization option group found and its performance.
The detailed method comprises the following steps:
1. input the method
a) Target program source code
Any corresponding compiler-compilable object source code. When compiling the target program, the compiler generally selects some optimization options to optimize the target program, so that the efficiency of the compiled executable file is higher; for example, the gcc compiler can select optimization options such as-fdce, -fmerge-constants, etc.
b) Compiler optimization options
The compiler may configure compiler optimization options.
2. Tuning algorithm
(Algorithm 1, the pseudocode of the tuning procedure, appears as an image in the original publication.)
As shown in Algorithm 1, after receiving the target program source code, the compiler optimization option information, and the tuning time budget t from the user, the tuning algorithm first generates an initial set of optimization option groups Φ_init. It then compiles and runs the target program with each option group in the set to obtain a training set S, and builds a surrogate model M from S. Next, it generates candidate optimization option groups according to option importance judged by the Gini coefficient, feeds the candidates to the surrogate model to predict their performance, and selects one of them to compile and run the target program, adding the new data point to the training set. This loop repeats until the tuning time t is used up; finally the best option group Θ_best found and the corresponding target program performance o_best are output, and Θ_best is configured as the compiler's optimization options for compiling the target program. Wherein:
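The loop just described can be sketched as follows. This is an illustrative simplification, not the patent's implementation: the names (`fit_surrogate`, `compile_and_run`, `tune`) are invented, a toy distance-weighted surrogate stands in for the hybrid model, and greedy selection stands in for the EI criterion:

```python
import random
import time

def fit_surrogate(S):
    """Toy surrogate standing in for the hybrid prediction model:
    predict by Hamming-distance-weighted average of observed performances."""
    def predict(theta):
        num = den = 0.0
        for t, o in S:
            w = 1.0 / (1 + sum(a != b for a, b in zip(theta, t)))
            num += w * o
            den += w
        return num / den
    return predict

def tune(d, compile_and_run, budget_s=0.2, n_init=8, n_cand=32, seed=0):
    """Simplified tuning loop in the spirit of Algorithm 1 (illustrative only)."""
    rng = random.Random(seed)
    # Steps 1-2: random initial option groups, compiled and run to build S.
    S = [(theta, compile_and_run(theta))
         for theta in (tuple(rng.randint(0, 1) for _ in range(d))
                       for _ in range(n_init))]
    deadline = time.time() + budget_s
    while time.time() < deadline:            # step 6: loop until time budget t
        model = fit_surrogate(S)             # steps 3/5: (re)build surrogate M
        cands = [tuple(rng.randint(0, 1) for _ in range(d))
                 for _ in range(n_cand)]     # step 4: candidate groups (simplified)
        best = max(cands, key=model)         # selection (the patent uses EI here)
        S.append((best, compile_and_run(best)))  # step 5: evaluate, extend S
    return max(S, key=lambda x: x[1])        # best (theta, performance) found

# Synthetic stand-in for "compile and run": performance counts helpful options.
helpful = {0, 2, 3}
perf = lambda theta: sum(theta[i] for i in helpful)
theta_best, o_best = tune(8, perf)
```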
(1) A compiler optimization option group Θ = {θ_1, …, θ_i, …, θ_d}, where d indicates that the compiler has d configurable optimization options, and θ_i indicates whether the i-th option is enabled: 1 if enabled, 0 otherwise;
(2) The initial set of optimization option groups Φ_init = {Θ_1, …, Θ_b, …, Θ_B}, where Θ_b denotes the b-th optimization option group;
(3) The training set S = {(Θ_1, o_1), …, (Θ_m, o_m), …, (Θ_M, o_M)}, where (Θ_m, o_m) denotes that compiling and running the target program with the m-th option group Θ_m yields the performance index value o_m.
The details of each step are as follows:
(1) Generating an initial optimization option group set:
The optimization option group set Φ_init = {Θ_1, …, Θ_b, …, Θ_B} is obtained by random sampling within the parameter value ranges.
(2) Building the surrogate model:
The tuning algorithm uses a hybrid prediction model as the surrogate model M. The features are the optimization option values (1 if enabled, 0 otherwise), and the label is the performance of compiling and running the target program. The surrogate model M is trained on the training set S of historical run data; when a new optimization option group Θ_new is input, it predicts the performance o_new of compiling and running the target program with that option group.
Using a hybrid prediction model improves the stability of the surrogate model M's predictions. The hybrid model consists of a Random Forest (RF), XGBoost (XGB), a Gaussian Process (GP), and LightGBM (LGB). The models are trained independently; at prediction time, each model predicts independently and the predictions are combined by a weighted average:
o_w = w_RF · o_RF + w_XGB · o_XGB + w_GP · o_GP + w_LGB · o_LGB
where o_w is the weighted value of the model predictions; w_RF is the random forest weight and o_RF its predicted performance value, and likewise for the other models.
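The weighted-average combination above can be sketched directly (the model names follow the patent; the numeric predictions and weights below are invented for illustration):

```python
def hybrid_predict(preds, weights):
    """Weighted average o_w = sum_m w_m * o_m over the individual model
    predictions (RF, XGB, GP, LGB in the patent's hybrid model)."""
    assert set(preds) == set(weights), "one weight per model"
    return sum(weights[m] * preds[m] for m in preds)

# Invented example values: four model predictions with equal weights.
preds = {"RF": 1.20, "XGB": 1.10, "GP": 1.00, "LGB": 1.30}
weights = {"RF": 0.25, "XGB": 0.25, "GP": 0.25, "LGB": 0.25}
o_w = hybrid_predict(preds, weights)  # 1.15 with equal weights
```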
The weights are updated after each iteration; for example, the random forest weight w_RF is updated as follows (the updates of w_XGB, w_GP, and w_LGB are analogous):
(the weight-update formulas appear only as images in the original publication)
wherein:
w_RF^(i): the weight after the i-th iteration;
w_RF^(i−1): the weight after the (i−1)-th iteration;
o_real: the performance value actually measured by compiling and running the target program;
r: the weight-update coefficient, 0.05 by default.
(3) Generating the candidate optimization option group set Φ_candidate:
With d optimization options there are 2^d possible option groups, far too many to enumerate. To address this, in each iteration the method uses the random forest in the surrogate model to predict the influence of the compiler's d optimization options, taking the top k options by influence as key optimization options and the remaining d−k as non-key options (a key optimization option is one with a large impact on the performance of the target program compiled by the compiler). It then enumerates all 2^k combinations of the k key options, and randomly generates c_i combinations of the d−k non-key options (the d−k non-key options have 2^(d−k) possible combinations in total, from which c_i are sampled at random). Combining the two yields c_i · 2^k candidate optimization option groups:
Φ_candidate = {Θ_1, …, Θ_(c_i·2^k)}.
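The candidate construction just described can be illustrated as follows (a sketch with invented function and variable names; the indices of key and non-key options are assumed to be given by the importance prediction):

```python
import itertools
import random

def gen_candidates(key_idx, nonkey_idx, d, c_i, seed=0):
    """Build c_i * 2**k candidate option groups: every combination of the
    k key options, paired with c_i random settings of the non-key options."""
    rng = random.Random(seed)
    nonkey_settings = [tuple(rng.randint(0, 1) for _ in nonkey_idx)
                       for _ in range(c_i)]
    cands = []
    for key_vals in itertools.product((0, 1), repeat=len(key_idx)):
        for nk_vals in nonkey_settings:
            theta = [0] * d
            for i, v in zip(key_idx, key_vals):    # place key option values
                theta[i] = v
            for i, v in zip(nonkey_idx, nk_vals):  # place non-key values
                theta[i] = v
            cands.append(tuple(theta))
    return cands

# d = 6 options, k = 2 key options (indices 0 and 3), c_i = 3 random settings.
cands = gen_candidates([0, 3], [1, 2, 4, 5], 6, 3)
print(len(cands))  # 3 * 2**2 = 12
```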
The influence is computed as follows (the formulas appear as images in the original publication; they are reconstructed here from the surrounding definitions):
impact(θ) = (1/u) · Σ_{i=1}^{u} G(d_i)
G(d_i) = (n_d / N) · (I(d_i) − w_left · I(d_i,left) − w_right · I(d_i,right))
I(d_i) = 1 − Σ_{j=1}^{C} p_j²
wherein:
impact(θ) is the influence of optimization option θ; G(d_i) is the Gini coefficient of node d_i in a decision tree; d_i is the node split on option θ in the i-th decision tree; u is the total number of decision trees that split on option θ; and Q is the total number of decision trees in the random forest. The random forest consists of multiple decision trees; before each tree is generated, a subset of all features is randomly sampled as that tree's features, and this sampling is performed Q times to generate Q trees. w_left denotes the weight of the left subtree of the node d_i split on option θ; w_right denotes the weight of the right subtree of node d_i; n_d denotes the number of optimization option groups at the current node, and N is the total number of option groups in the training set; C is the number of classes at node d_i (C = 2 in this scenario); p_j denotes the probability of the j-th class, where j = 0 means option θ is not enabled and j = 1 means it is enabled; I(d_i) is the Gini impurity of decision tree node d_i, and I(d_i,left) and I(d_i,right) are the impurities of its left and right children.
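A small worked example of the Gini computation described above, following the standard Gini-importance form (function names are illustrative; the binary labels play the role of the two classes at a node):

```python
def gini(labels):
    """Gini impurity I(d) = 1 - p0^2 - p1^2 for binary labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 * p1 - (1 - p1) * (1 - p1)

def node_gini_coefficient(node_labels, left_labels, right_labels, N):
    """G(d) = (n_d/N) * (I(d) - w_left*I(left) - w_right*I(right))."""
    n_d = len(node_labels)
    w_left = len(left_labels) / n_d
    w_right = len(right_labels) / n_d
    return (n_d / N) * (gini(node_labels)
                        - w_left * gini(left_labels)
                        - w_right * gini(right_labels))

def impact(gs):
    """impact(theta) = mean of G(d_i) over the u trees that split on theta."""
    return sum(gs) / len(gs)

# A perfect split of a 50/50 node removes all impurity: G = (4/4)*(0.5 - 0).
g = node_gini_coefficient([0, 0, 1, 1], [0, 0], [1, 1], N=4)
print(impact([g, g]))  # 0.5
```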
c_i is the number of non-key optimization option groups sampled in the i-th iteration; the scheme computes it with a Gaussian decay function. The formulas appear as images in the original publication; the standard form of the referenced decay function is
c_i = c_1 · exp(−(max(0, (i − 1) − offset))² / (2σ²)),  with σ² = −scale² / (2 · ln(decay))
wherein c_1 is the number of non-key option groups sampled in the first iteration, and offset, scale, and decay are three constants that control the descent rate of the Gaussian decay function (see https://blog.csdn.net/lijingjingchn/article/details/106428445).
(4) Performance prediction and option group selection:
The surrogate model M first predicts the performance o_i of each candidate group in Φ_candidate; the standard deviation std_i is obtained from the prediction of the Gaussian process model GP in the surrogate model. Then ei_i is computed as
ei_i = (o_i − o_incumbent − 0.1) · cdf(z_i) + std_i · pdf(z_i),  z_i = (o_i − o_incumbent − 0.1) / std_i
wherein o_incumbent is the performance of the best option group found so far, and cdf and pdf denote the cumulative distribution function and probability density function of the standard normal distribution. Finally, the option group Θ_selected with the largest ei value is selected; the target program is compiled and run with it to obtain the performance o_selected, and (Θ_selected, o_selected) is added to the training set S.
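The ei-based selection above can be sketched as follows, using the standard normal cdf/pdf via `math.erf` (the candidate means and standard deviations below are invented values):

```python
import math

def norm_pdf(z):
    """Standard normal probability density function."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(o, std, o_incumbent, margin=0.1):
    """ei = (o - o_incumbent - margin)*cdf(z) + std*pdf(z), with
    z = (o - o_incumbent - margin)/std, as in the patent's formula."""
    diff = o - o_incumbent - margin
    if std == 0.0:
        return max(diff, 0.0)  # no uncertainty: improvement or nothing
    z = diff / std
    return diff * norm_cdf(z) + std * norm_pdf(z)

# Three candidates as (predicted performance, std); incumbent is 1.0.
cands = [(1.05, 0.20), (1.30, 0.01), (1.10, 0.30)]
best = max(range(3), key=lambda i: expected_improvement(*cands[i], 1.0))
```

The second candidate wins here: its predicted mean clears the incumbent plus margin, which dominates the uncertainty bonus of the others.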
3. Output
When the tuning time t is used up, the tuning process ends, outputting the best optimization option group Θ_best found in this run and the performance o_best of compiling and running the target program with this option group.
In addition to the hybrid prediction model, the surrogate model may also be a single regression model, such as a random forest or XGBoost.
Besides random sampling, the initial option group sampling may use other methods, such as Monte Carlo sampling.
Besides ei, the option group selection may rank candidates by other indexes, such as the predicted performance index value of the target program.
Although specific embodiments of the invention have been disclosed for purposes of illustration, and for purposes of aiding in the understanding of the contents of the invention and its implementation, those skilled in the art will appreciate that: various substitutions, changes and modifications are possible without departing from the spirit and scope of the present invention and the appended claims. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (10)

1. An automatic compiler parameter tuning method, comprising the following steps:
1) Selecting a target program to be compiled, setting a tuning time budget t, and generating an initial set of optimization option groups Φ_init according to the optimization option information of the compiler; the initial set Φ_init = {Θ_1, …, Θ_b, …, Θ_B}, where Θ_b denotes the b-th optimization option group of the compiler; each optimization option group Θ = {θ_1, …, θ_i, …, θ_d}, where d denotes the total number of configurable optimization options of the compiler and θ_i indicates whether the i-th optimization option is enabled: 1 if enabled, 0 otherwise;
2) Compiling the target program with each optimization option group in the initial set Φ_init, and running each compiled target program to obtain a training set S; the training set S = {(Θ_1, o_1), …, (Θ_b, o_b), …, (Θ_B, o_B)}, where (Θ_b, o_b) denotes that compiling the target program with option group Θ_b and running the resulting executable yields the performance index value o_b;
3) Constructing a surrogate model M from the training set S;
4) Using the surrogate model M, generating a set of candidate optimization option groups Φ_candidate according to the optimization option information of the compiler, and predicting the performance obtained when each candidate group in Φ_candidate is used as the compiler configuration;
5) According to the predicted performance of each candidate group, selecting a candidate optimization option group Θ_selected as the compiler configuration, compiling and running the target program to obtain the corresponding performance o_selected, and adding (Θ_selected, o_selected) to the training set S; then training the surrogate model M with the updated training set S;
6) Repeating steps 4) to 5) until the tuning time budget t is reached, and then outputting the optimization option group Θ_best corresponding to the best performance o_best, with which the compiler is configured to compile the target program.
2. The method of claim 1, wherein a hybrid prediction model is trained with the training set S to obtain the surrogate model M; the surrogate model M comprises a plurality of prediction models.
3. The method according to claim 2, wherein generating the candidate set Φ_candidate comprises: using the first prediction model in the surrogate model M to predict the influence of the compiler's d optimization options, taking the top k options by influence as key optimization options and the remaining d−k options as non-key optimization options; then enumerating all 2^k combinations of the k key options, randomly generating c_i combinations of the d−k non-key options, and combining them to produce the candidate set Φ_candidate = {Θ_1, …, Θ_(c_i·2^k)}.
4. The method according to claim 3, wherein the influence of an optimization option θ is
impact(θ) = (1/u) · Σ_{i=1}^{u} G(d_i)
wherein the random forest algorithm generates Q decision trees from the training set S; G(d_i) is the Gini coefficient of node d_i in a decision tree; d_i is the node split on option θ in the i-th decision tree; and u is the total number of decision trees that split on option θ.
5. The method of claim 3, wherein a Gaussian decay function is used to determine the number of non-key optimization option groups sampled in each iteration.
6. The method as claimed in claim 1, wherein in step 5) the tuner selects the candidate optimization option group with the largest ei value as Θ_selected; wherein the ei value of the c-th candidate group Θ_c is
ei_c = (o_c − o_incumbent − 0.1) · cdf(z_c) + std_c · pdf(z_c),  z_c = (o_c − o_incumbent − 0.1) / std_c
where o_c is the predicted performance of candidate group Θ_c, std_c is the standard deviation of the performance prediction for Θ_c among the candidates in Φ_candidate, and o_incumbent is the best performance obtained so far.
7. The method of claim 2, wherein the hybrid prediction model consists of a random forest model, an XGBoost model, a Gaussian process model, and a LightGBM model.
8. The method of claim 7, wherein each model in the hybrid prediction model is trained using the training set S, and the weighted average of the models' predictions is used as the prediction of the surrogate model M.
9. A server, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for carrying out the steps of the method according to any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
CN202211287488.1A 2022-10-20 2022-10-20 Automatic parameter adjusting method for compiler Pending CN115469851A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211287488.1A CN115469851A (en) 2022-10-20 2022-10-20 Automatic parameter adjusting method for compiler

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211287488.1A CN115469851A (en) 2022-10-20 2022-10-20 Automatic parameter adjusting method for compiler

Publications (1)

Publication Number Publication Date
CN115469851A true CN115469851A (en) 2022-12-13

Family

ID=84337090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211287488.1A Pending CN115469851A (en) 2022-10-20 2022-10-20 Automatic parameter adjusting method for compiler

Country Status (1)

Country Link
CN (1) CN115469851A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116521176A (en) * 2023-05-06 2023-08-01 东莞理工学院 Compilation optimization option optimization method and device, intelligent terminal and storage medium
CN116860259A (en) * 2023-09-05 2023-10-10 之江实验室 Method, device and equipment for model training and automatic optimization of compiler
CN116931955A (en) * 2023-09-18 2023-10-24 之江实验室 Compiler automatic optimization method and device based on artificial intelligence
CN116991429A (en) * 2023-09-28 2023-11-03 之江实验室 Compiling and optimizing method, device and storage medium of computer program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200167930A1 (en) * 2017-06-16 2020-05-28 Ucl Business Ltd A System and Computer-Implemented Method for Segmenting an Image
CN112150209A (en) * 2020-06-19 2020-12-29 南京理工大学 Construction method of CNN-LSTM time sequence prediction model based on clustering center
CN113407185A (en) * 2021-03-10 2021-09-17 天津大学 Compiler optimization option recommendation method based on Bayesian optimization
CN113591944A (en) * 2021-07-14 2021-11-02 中国海洋大学 Parameter selection optimization method, system and equipment in random forest model training

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200167930A1 (en) * 2017-06-16 2020-05-28 Ucl Business Ltd A System and Computer-Implemented Method for Segmenting an Image
CN112150209A (en) * 2020-06-19 2020-12-29 南京理工大学 Construction method of CNN-LSTM time sequence prediction model based on clustering center
CN113407185A (en) * 2021-03-10 2021-09-17 天津大学 Compiler optimization option recommendation method based on Bayesian optimization
CN113591944A (en) * 2021-07-14 2021-11-02 中国海洋大学 Parameter selection optimization method, system and equipment in random forest model training

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JUNJIE CHEN et al.: "Efficient Compiler Autotuning via Bayesian Optimization" *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116521176A (en) * 2023-05-06 2023-08-01 东莞理工学院 Compilation optimization option optimization method and device, intelligent terminal and storage medium
CN116521176B (en) * 2023-05-06 2023-12-29 东莞理工学院 Compilation optimization option optimization method and device, intelligent terminal and storage medium
CN116860259A (en) * 2023-09-05 2023-10-10 之江实验室 Method, device and equipment for model training and automatic optimization of compiler
CN116860259B (en) * 2023-09-05 2023-12-19 之江实验室 Method, device and equipment for model training and automatic optimization of compiler
CN116931955A (en) * 2023-09-18 2023-10-24 之江实验室 Compiler automatic optimization method and device based on artificial intelligence
CN116931955B (en) * 2023-09-18 2024-01-09 之江实验室 Compiler automatic optimization method and device based on artificial intelligence
CN116991429A (en) * 2023-09-28 2023-11-03 之江实验室 Compiling and optimizing method, device and storage medium of computer program
CN116991429B (en) * 2023-09-28 2024-01-16 之江实验室 Compiling and optimizing method, device and storage medium of computer program

Similar Documents

Publication Publication Date Title
CN115469851A (en) Automatic parameter adjusting method for compiler
US8825573B2 (en) Controlling quarantining and biasing in cataclysms for optimization simulations
CN112035116B (en) Agent modeling method for multi-target compiling optimization sequence selection
Rueda et al. Testing MVMO on learning-based real-parameter single objective benchmark optimization problems
CN111612528A (en) Method, device and equipment for determining user classification model and storage medium
EP3779616B1 (en) Optimization device and control method of optimization device
JP2020086821A (en) Optimization device and control method thereof
CN113407185B (en) Compiler optimization option recommendation method based on Bayesian optimization
CN108052696B (en) Three-value FPRM circuit area and delay optimization method using particle swarm optimization
Abdel-Basset et al. MOEO-EED: A multi-objective equilibrium optimizer with exploration–exploitation​ dominance strategy
CN113821983B (en) Engineering design optimization method and device based on proxy model and electronic equipment
WO2005048184A1 (en) Active learning method and system
CN110738362A (en) method for constructing prediction model based on improved multivariate cosmic algorithm
Doerr et al. Royal road functions and the (1+ λ) evolutionary algorithm: Almost no speed-up from larger offspring populations
Cristescu et al. Surrogate-based multiobjective optimization: ParEGO update and test
JP7256378B2 (en) Optimization system and method of controlling the optimization system
CN112232401A (en) Data classification method based on differential privacy and random gradient descent
JP2022015503A (en) Information processing system, information processing method and program
Pei et al. AlphaSyn: Logic synthesis optimization with efficient monte carlo tree search
Wang et al. A Variable-Length Chromosome Evolutionary Algorithm for Reversible Circuit Synthesis.
CN113220437B (en) Workflow multi-target scheduling method and device
CN113554144A (en) Self-adaptive population initialization method and storage device for multi-target evolutionary feature selection algorithm
US11307867B2 (en) Optimizing the startup speed of a modular system using machine learning
Minhaz et al. Solution of a Classical Cryptarithmetic Problem by using parallel genetic algorithm
Punnathanam et al. Reduced Yin-Yang-Pair optimization and its performance on the CEC 2016 expensive case

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20221213
