CN114881359A - GBDT and XGboost fused road surface IRI prediction method - Google Patents
GBDT and XGboost fused road surface IRI prediction method Download PDFInfo
- Publication number
- CN114881359A CN114881359A CN202210625570.4A CN202210625570A CN114881359A CN 114881359 A CN114881359 A CN 114881359A CN 202210625570 A CN202210625570 A CN 202210625570A CN 114881359 A CN114881359 A CN 114881359A
- Authority
- CN
- China
- Prior art keywords
- model
- prediction
- layer
- iri
- tree
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 28
- 238000004422 calculation algorithm Methods 0.000 claims abstract description 20
- 238000012549 training Methods 0.000 claims abstract description 16
- 230000004927 fusion Effects 0.000 claims abstract description 12
- 238000007637 random forest analysis Methods 0.000 claims abstract description 11
- 238000012935 Averaging Methods 0.000 claims abstract 2
- 238000011156 evaluation Methods 0.000 claims description 8
- 230000009467 reduction Effects 0.000 claims description 8
- 238000010187 selection method Methods 0.000 claims description 6
- 230000008569 process Effects 0.000 claims description 4
- 239000000126 substance Substances 0.000 claims description 4
- 239000012535 impurity Substances 0.000 claims description 3
- 230000008901 benefit Effects 0.000 abstract description 10
- 238000012423 maintenance Methods 0.000 abstract description 4
- 238000012544 monitoring process Methods 0.000 abstract description 2
- 230000006870 function Effects 0.000 description 24
- 238000003066 decision tree Methods 0.000 description 8
- 238000001514 detection method Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 238000012360 testing method Methods 0.000 description 3
- 238000012417 linear regression Methods 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000007726 management method Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 238000007500 overflow downdraw method Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000012512 characterization method Methods 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000010219 correlation analysis Methods 0.000 description 1
- 238000002790 cross-validation Methods 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 201000010099 disease Diseases 0.000 description 1
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 238000004806 packaging method and process Methods 0.000 description 1
- 230000001737 promoting effect Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000010200 validation analysis Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/2148—Generating training patterns; Bootstrap methods, e.g. bagging or boosting characterised by the process organisation or structure, e.g. boosting cascade
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/24323—Tree-organised classifiers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/01—Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06393—Score-carding, benchmarking or key performance indicator [KPI] analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/08—Construction
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Economics (AREA)
- Strategic Management (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Entrepreneurship & Innovation (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Software Systems (AREA)
- Development Economics (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Marketing (AREA)
- Mathematical Physics (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Computing Systems (AREA)
- Game Theory and Decision Science (AREA)
- Educational Administration (AREA)
- Computational Linguistics (AREA)
- Medical Informatics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention relates to a pavement IRI prediction method fusing GBDT and XGboost, and belongs to the technical field of pavement monitoring. The method comprises the following steps: s1: acquiring pavement characteristic data, and selecting a characteristic data set by adopting a random forest algorithm; s2: constructing a Stacking fusion model: dividing the characteristic data set of the step S1 into a plurality of sub data sets, inputting the sub data sets into each base learner of the first layer prediction model, and outputting a respective prediction result by each base learner; then, taking the model output of the first layer and the characteristic data set of the step S1 as the input of the second layer, training a meta-learner of a prediction model of the second layer, and averaging the model output of the second layer to obtain a final prediction result; the first layer of prediction models comprise a GBDT model and an XGboost model. The invention improves the road IRI prediction precision, greatly improves the maintenance fund planning benefit and realizes the goal of optimal cost benefit.
Description
Technical Field
The invention belongs to the technical field of pavement monitoring, and relates to a pavement IRI prediction method fusing GBDT and XGboost.
Background
The existing pavement evenness evaluation index (IRI) prediction method mainly comprises two main types, one type is a prediction method based on a time sequence, and the other type is a learning type prediction method based on characteristic parameters.
The prediction method based on the time series mainly predicts the decay of performance indexes in the whole life cycle of a road surface and is used for maintenance planning and fund allocation, and the utilized data mainly comprises structural characteristic data, traffic environment data and historical detection data and often needs enough historical detection data for modeling. The method does not fully utilize bottom layer detection data including disease information, characteristic parameters and the like. The main problem is that the prediction model has poor precision and can only be applied to network level road management decision.
The learning type prediction method based on the characteristic parameters is mainly divided into two types, one type is used for assisting the modeling of a time sequence model, and the other type is used for predicting other performance indexes based on certain detection data. The precision is relatively high, but a large amount of data is needed as a support, and the portability of different project management modes is poor.
Therefore, a new road surface IRI prediction method is needed to improve the prediction accuracy.
Disclosure of Invention
In view of the above, the invention aims to provide a road surface IRI prediction method fusing GBDT and XGboost, which solves the problem of feature selection of the existing learning type prediction model; and the fusion model is adopted, so that the road surface IRI prediction precision is improved, the maintenance fund planning benefit can be greatly improved, and the goal of optimal cost benefit is realized.
In order to achieve the purpose, the invention provides the following technical scheme:
a road surface IRI prediction method fusing GBDT and XGboost specifically comprises the following steps:
s1: acquiring pavement characteristic data, and selecting a characteristic data set by adopting a random forest algorithm;
s2: constructing a Stacking fusion model: dividing the characteristic data set of the step S1 into a plurality of sub data sets, inputting the sub data sets into each base learner of the first layer prediction model, and outputting a respective prediction result by each base learner; then, the model output of the first layer and the feature data set of step S1 are used as the input of the second layer, the meta-learner of the second layer prediction model is trained, and the model outputs at the second layer are averaged to obtain the final prediction result. The first layer of prediction model comprises a GBDT model and an XGboost model; and the meta-learner of the second layer of prediction model adopts a Bagging model.
Further, in step S1, the random forest algorithm specifically includes: and selecting the features by using an average impurity degree reduction index, using classification or regression precision as a criterion function and using a sequence backward selection method and a generalized sequence backward selection method.
Further, in step S1, the average impure reduction index MDI is calculated by the formula:
wherein, IMP R 、IMP L The overall purity and the purity of the left and right sides of the bifurcation, N, are indicated t 、Respectively representing the number of samples of the global, left and right nodes, N being the sample size.
Further, in step S2, constructing a GBDT model specifically includes: input training sample set T { (x) 1 ,y 1 ),…,(x i ,y i ),…,(x N ,y N )},Where X is the input sample space, X i Is an index for evaluation of the properties of the specimen,y is the performance condition and the loss function is L (Y) i ,f(x i ) The output is regressionTree (R)The specific training process of the GBDT model is as follows:
1) initializing an estimation function to minimize a loss function;
wherein f is 0 (x) Is a tree with only one root node, L (y) i C) is a loss function, c is a constant that minimizes the loss function;
2) let M be 1,2, …, M representing the maximum number of iterations;
calculating the negative gradient of the loss function for the ith sample, i is 1,2, …, N, and using the negative gradient as a residual error estimation;
wherein r is mi Represents the residual estimate of the ith sample;
② fitting residual error pairs rm j Generating a regression tree to estimate regression leaf node region to obtain mth tree node region R mj Wherein J is 1,2, …, J and J represents the number of leaf nodes;
(iii) for J ═ 1,2, …, J, the values of the leaf node regions are estimated using a linear search, minimizing the loss function:
fourthly, updating the learner f m (x i ):
3) All c's in the same leaf node area mj And accumulating the values to obtain a final regression tree:
further, in step S2, constructing an XGBoost model specifically includes: input training sample set T { (x) 1 ,y 1 ),…,(x i ,y i ),…,(x N ,y N )},Where X is the input sample space, X i Is an index for evaluation of the properties of the specimen,y is the performance condition;
in the XGboost, each tree needs to be added one by one, so that the effect can be improved;
wherein t is the number of trees;
if the leaves have too many nodes, the risk of over-fitting the model increases. So at the targetAdding penalty term omega (f) into function t ) Limiting the number of leaf nodes;
wherein gamma is punishment, T is the number of leaves, omega is the weight of a leaf node, lambda is an adjustable parameter, and omega is j A leaf node score set is set for each tree;
complete objective function Obj (t) Comprises the following steps:
Solving the optimal solution of the objective function:
the above formula can be used as the cotyledon fraction of the tree, and the structure of the tree is excellent along with the increase of the fraction; and once the post-splitting result is less than the maximum resultant value for the given parameter, the algorithm will stop growing the cotyledon depth.
The invention has the beneficial effects that:
(1) the invention solves the problem of feature selection of the existing learning type prediction, and the selection of the feature index is inspected by using the support degree based on feature learning instead of using a simple correlation analysis technology (such as a Pearson correlation coefficient). The accuracy can be greatly improved, and the model has strong application scene transplanting capability.
(2) The invention adopts the fusion model, improves the road surface IRI prediction precision compared with a single model, and overcomes the problem that the traditional prediction model only solves the average fitting precision.
(3) The invention can be widely applied to different areas through improving the precision and the transplanting performance. By improving the prediction precision, the maintenance fund planning benefit can be greatly improved, and the aim of optimizing the cost benefit is fulfilled.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the means of the instrumentalities and combinations particularly pointed out hereinafter.
Drawings
For the purposes of promoting a better understanding of the objects, aspects and advantages of the invention, reference will now be made to the following detailed description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic flow chart of a road surface IRI prediction method fusing GBDT and XGboost according to the invention;
FIG. 2 is a schematic diagram of feature importance scoring based on a random forest algorithm;
FIG. 3 is a comparison graph of the prediction effect of the Stacking fusion model of the present invention and the existing multivariate linear regression, SVM, GBDT, XGboost models.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention in a schematic way, and the features in the following embodiments and examples may be combined with each other without conflict.
Wherein the showings are for the purpose of illustrating the invention only and not for the purpose of limiting the same, and in which there is shown by way of illustration only and not in the drawings in which there is no intention to limit the invention thereto; to better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there is an orientation or positional relationship indicated by terms such as "upper", "lower", "left", "right", "front", "rear", etc., based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of description, but it is not an indication or suggestion that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes, and are not to be construed as limiting the present invention, and the specific meaning of the terms may be understood by those skilled in the art according to specific situations.
Referring to fig. 1 to 3, a road surface IRI prediction method fusing GBDT and XGBoost mainly includes the following steps:
1) all the characteristic data (according to actual conditions) are obtained, and specific characteristic variables are shown in table 1.
TABLE 1 characterization of characteristic variables
2) Feature selection
And (4) performing feature selection by adopting a machine learning mode, wherein the used model is a random forest. The random forest feature selection is to use a random forest algorithm to automatically select features with high correlation. The method is a random forest based packaging type feature selection algorithm, takes a random forest algorithm as a basic tool, takes classification or regression precision as a criterion function, and adopts a sequence backward selection method and a generalized sequence backward selection method to select features.
The average impurity reduction index is used for feature selection,
wherein, IMP R 、IMP L Respective purity, N t 、Respectively, the number of samples of the node, and N is the sample size.
The input port is used for receiving the data set transmitted by the front node, the output port is used for outputting the data set added with the discrete fields, the characteristic index results with the characteristic quantity not more than 20 and the purity reduction gradient result more than 0.02 are selected and shown in fig. 2.
3) GBDT model
The GBDT algorithm is essentially a combination of a large number of simple models, and the core of the GBDT algorithm is that, starting from the second decision tree, the input of each decision tree is the sum of the outputs of all the previous decision trees, and based on the promotion idea, the conclusions of a plurality of decision trees are accumulated to obtain the final output.
GBDT is a decision tree algorithm trained based on the Gradient Boosting strategy, and mainly comprises three parts of Gradient Boosting, decision tree algorithm and reduction. The core idea of GBDT is to reduce the residual error, each iteration of which is to reduce the residual error generated by the previous iteration. And when the model prediction result is inconsistent with the actual observed value, generating a new decision tree in the gradient direction of residual error reduction so as to reduce the residual error of the last time, and continuously and repeatedly iterating until the output result is basically consistent with the actual observed value. One sign of continuous optimization and improvement of the model is iterative descent of the loss function of the model, and the GBDT algorithm is to construct a new model in the gradient descent direction of the loss function.
Input training sample set T { (x) 1 ,y 1 ),…,(x i ,y i ),…,(x N ,y N )},Where X is the input sample space, X i Is an index of evaluation of the amount of the substance,y is the performance condition and the loss function is L (Y) i ,f(x i ) The output is a regression treeThe specific training process of the GBDT model is as follows:
1) initializing an estimation function to minimize a loss function;
wherein f is 0 (x) Is a tree with only one root node, L (y) i C) is a loss function, c is a constant that minimizes the loss function;
2) let M be 1,2, …, M representing the maximum number of iterations;
calculating the negative gradient of the loss function for the ith sample, i is 1,2, …, N, and using the negative gradient as a residual error estimation;
wherein r is mi Represents the residual estimate of the ith sample;
② fitting residual error pairs rm j Generating a regression tree to estimate regression leaf node region to obtain mth tree node region R mj Wherein J is 1,2, …, J and J represents the number of leaf nodes;
(iii) for J ═ 1,2, …, J, the values of the leaf node regions are estimated using a linear search, minimizing the loss function:
fourthly, updating the learner f m (x i ):
3) All c's in the same leaf node area mj And accumulating the values to obtain a final regression tree:
4) XGboost model
The XGboost is an iterative tree algorithm, combines a plurality of weak classifiers into a strong classifier together, and is an implementation of a Gradient Boosting Decision Tree (GBDT). XGboost is a powerful sequential integration technique with a modular structure for parallel learning to achieve fast computations, which prevents overfitting by regularization and can generate a weighted quantile sketch that processes weighted data.
The specific algorithm steps are as follows:
in the XGboost, each tree needs to be added one by one, so that the effect can be improved;
wherein t is the number of trees;
if the leaves have too many nodes, the risk of over-fitting the model increases. So a penalty term omega (f) is added to the objective function t ) Limiting the number of leaf nodes;
wherein gamma is punishment, T is the number of leaves, omega is the weight of a leaf node, lambda is an adjustable parameter, and omega is j A leaf node score set is set for each tree;
complete objective function Obj (t) Comprises the following steps:
Solving the optimal solution of the objective function:
the above formula can be used as the cotyledon fraction of the tree, and the structure of the tree is excellent along with the increase of the fraction; and once the post-splitting result is less than the maximum resultant value for the given parameter, the algorithm will stop growing the cotyledon depth.
5) Stacking fusion model
The Stacking model fusion method includes the steps that firstly, an original characteristic data set is divided into a plurality of sub data sets, the sub data sets are input into each base learner of a first-layer prediction model, and each base learner outputs a prediction result. Then, the output of the first layer is used as the input of the second layer, the meta-learner of the prediction model of the second layer is trained, and the model positioned at the second layer outputs the final prediction result. The Stacking model fusion method can improve the overall prediction precision by generalizing the output results of a plurality of models.
In the first stage, an original data set is firstly segmented and divided into a training set and a test set according to a certain proportion, then a proper base learner is selected to train the training set in a cross validation mode, each trained base learner predicts the validation set and the test set, and a machine learning model with excellent prediction performance is selected in the first stage, and meanwhile diversification among the models is guaranteed; in the second stage, the prediction result of the base learner is respectively used as feature data for training and predicting the meta learner, the meta learner performs model construction by combining the features obtained in the last stage and the labels of the original training set as sample data, and outputs the final Stacking model prediction result, and the meta learner in the stage generally selects a simple model with better stability to play a role in integrally improving the model performance.
As shown in fig. 1, the Stacking fusion model: and using two different integrated model algorithms of GBDT and XGboost as a base learner to obtain two groups of prediction results, then applying the three groups of prediction results to a second layer of a using element learner, and selecting a Bagging model for training so as to obtain a final prediction result.
6) Stacking fusion model prediction result and evaluation
In order to verify the superiority of the Stacking fusion model in road surface IRI prediction, the same training set and test set are selected, and the Stacking fusion model is compared and analyzed with the existing 4 prediction models (multivariate linear regression, SVM, GBDT and XGboost), as shown in FIG. 3.
Taking the LTPP data in the united states as an example, spanning 62 states, cities, including: traffic volume size (76989 pieces of data), crack length (12964 pieces of data), traffic open date (1817 pieces of data), rut depth (18128 pieces of data), IRI size (97535 pieces of data), texture information (18735 pieces of data), total 6 tables, and total 226128 pieces of data.
As shown in Table 1 and FIG. 3, on the basis of road surface IRI prediction results, the accuracy of the Stacking fusion model is further improved on GBDT and XGboost models which are good in performance, the final RMSE is 0.040, the final MAE is 0.013, and the R is 2 0.996, and meets the high-precision requirement in road surface prediction.
TABLE 1 evaluation index of each prediction model
Finally, the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions, and all of them should be covered by the claims of the present invention.
Claims (6)
1. A road surface IRI prediction method fusing GBDT and XGboost is characterized by comprising the following steps:
s1: acquiring pavement characteristic data, and selecting a characteristic data set by adopting a random forest algorithm;
s2: constructing a Stacking fusion model: dividing the characteristic data set of the step S1 into a plurality of sub data sets, inputting the sub data sets into each base learner of the first layer prediction model, and outputting a respective prediction result by each base learner; then, taking the model output of the first layer and the characteristic data set of the step S1 as the input of the second layer, training a meta-learner of a prediction model of the second layer, and averaging the model output of the second layer to obtain a final prediction result; the first layer of prediction models comprise a GBDT model and an XGboost model.
2. The road surface IRI prediction method according to claim 1, wherein in step S1, the random forest algorithm specifically comprises: and selecting the features by using an average impurity degree reduction index, using classification or regression precision as a criterion function and using a sequence backward selection method and a generalized sequence backward selection method.
3. The road surface IRI prediction method according to claim 2, wherein in step S1, the average impure degree reduction index MDI is calculated by the following formula:
4. The road surface IRI prediction method according to claim 1, wherein in step S2, constructing the GBDT model specifically includes: input training sample set T { (x) 1 ,y 1 ),…,(x i ,y i ),…,(x N ,y N )},Where X is the input sample space, X i Is an index for evaluation of the properties of the specimen,y is the performance condition and the loss function is L (Y) i ,f(x i ) The output is a regression treeThe specific training process of the GBDT model is as follows:
1) initializing an estimation function to minimize a loss function;
wherein f is 0 (x) Is a tree with only one root node, L (y) i C) is a loss function, c is a constant that minimizes the loss function;
2) let M be 1,2, …, M representing the maximum number of iterations;
calculating the negative gradient of the loss function for the ith sample, i is 1,2, …, N, and using the negative gradient as a residual error estimation;
wherein r is mi Represents the residual estimate of the ith sample;
② fitting residual error pairs rm i Generating a regression tree to estimate regression leaf node region to obtain mth tree node region R mj Wherein J is 1,2, …, J and J represents the number of leaf nodes;
(iii) for J ═ 1,2, …, J, the values of the leaf node regions are estimated using a linear search, minimizing the loss function:
fourthly, updating the learner f m (x i ):
3) In the same leafNode region will all c mj And accumulating the values to obtain a final regression tree:
5. the road surface IRI prediction method according to claim 1, wherein in step S2, an XGboost model is constructed, specifically comprising: input training sample set T { (x) 1 ,y 1 ),…,(x i ,y i ),…,(x N ,y N )},Where X is the input sample space, X i Is an index for evaluation of the properties of the specimen,y is the performance condition;
in XGboost, each tree is added one by one;
wherein t is the number of trees;
adding penalty term omega (f) into the objective function t ) Limiting the number of leaf nodes;
wherein gamma is punishment, T is the number of leaves, omega is the weight of a leaf node, lambda is an adjustable parameter, and omega is j A leaf node score set is set for each tree;
complete objective function Obj (t) Comprises the following steps:
Solving the optimal solution of the objective function:
the above formula can be used as the cotyledon fraction of the tree, and the structure of the tree is excellent along with the increase of the fraction; and once the post-splitting result is less than the maximum resultant value for the given parameter, the algorithm will stop growing the cotyledon depth.
6. The road surface IRI prediction method according to claim 1, wherein in step S2, the meta-learner of the second layer prediction model employs a Bagging model.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210625570.4A CN114881359B (en) | 2022-06-02 | 2022-06-02 | Road surface IRI prediction method fusing GBDT and XGBoost |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210625570.4A CN114881359B (en) | 2022-06-02 | 2022-06-02 | Road surface IRI prediction method fusing GBDT and XGBoost |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114881359A true CN114881359A (en) | 2022-08-09 |
CN114881359B CN114881359B (en) | 2024-05-14 |
Family
ID=82679849
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210625570.4A Active CN114881359B (en) | 2022-06-02 | 2022-06-02 | Road surface IRI prediction method fusing GBDT and XGBoost |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114881359B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117235679A (en) * | 2023-11-15 | 2023-12-15 | 长沙金码测控科技股份有限公司 | LUCC-based tensile load and compressive load evaluation method and system for foundation pit monitoring |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007047137A (en) * | 2005-08-05 | 2007-02-22 | Kumataka Engineering:Kk | Road surface behavior measuring device |
CN107292060A (en) * | 2017-07-28 | 2017-10-24 | 成都智建新业建筑设计咨询有限公司 | Basement roadway applications method based on BIM technology |
CN112232526A (en) * | 2020-09-28 | 2021-01-15 | 中山大学 | Geological disaster susceptibility evaluation method and system based on integration strategy |
CN112288191A (en) * | 2020-11-19 | 2021-01-29 | 国家海洋信息中心 | Ocean buoy service life prediction method based on multi-class machine learning method |
CN112733442A (en) * | 2020-12-31 | 2021-04-30 | 交通运输部公路科学研究所 | Road surface long-term performance prediction model based on deep learning and construction method thereof |
CN112906298A (en) * | 2021-02-05 | 2021-06-04 | 重庆邮电大学 | Blueberry yield prediction method based on machine learning |
CN113159364A (en) * | 2020-12-30 | 2021-07-23 | 中国移动通信集团广东有限公司珠海分公司 | Passenger flow prediction method and system for large-scale traffic station |
-
2022
- 2022-06-02 CN CN202210625570.4A patent/CN114881359B/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007047137A (en) * | 2005-08-05 | 2007-02-22 | Kumataka Engineering:Kk | Road surface behavior measuring device |
CN107292060A (en) * | 2017-07-28 | 2017-10-24 | 成都智建新业建筑设计咨询有限公司 | Basement roadway applications method based on BIM technology |
CN112232526A (en) * | 2020-09-28 | 2021-01-15 | 中山大学 | Geological disaster susceptibility evaluation method and system based on integration strategy |
CN112288191A (en) * | 2020-11-19 | 2021-01-29 | 国家海洋信息中心 | Ocean buoy service life prediction method based on multi-class machine learning method |
CN113159364A (en) * | 2020-12-30 | 2021-07-23 | 中国移动通信集团广东有限公司珠海分公司 | Passenger flow prediction method and system for large-scale traffic station |
CN112733442A (en) * | 2020-12-31 | 2021-04-30 | 交通运输部公路科学研究所 | Road surface long-term performance prediction model based on deep learning and construction method thereof |
CN112906298A (en) * | 2021-02-05 | 2021-06-04 | 重庆邮电大学 | Blueberry yield prediction method based on machine learning |
Non-Patent Citations (3)
Title |
---|
ZHIYUAN LUO 等: "Prediction of International Roughness Index Based on Stacking Fusion Model", 《SUSTAINABILITY》, 7 June 2022 (2022-06-07), pages 1 - 13 * |
袁明园 等: "路面养护管理决策系统的开发与应用研究", 《中国公路学会养护与管理分会第十届学术年会论文集》, 9 January 2020 (2020-01-09), pages 384 - 387 * |
骆志元: "基于性能预测的路面养护智能决策研究", 《中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑》, no. 3, 15 March 2024 (2024-03-15), pages 034 - 201 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117235679A (en) * | 2023-11-15 | 2023-12-15 | 长沙金码测控科技股份有限公司 | LUCC-based tensile load and compressive load evaluation method and system for foundation pit monitoring |
Also Published As
Publication number | Publication date |
---|---|
CN114881359B (en) | 2024-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104798043B (en) | A kind of data processing method and computer system | |
CN107862173A (en) | A kind of lead compound virtual screening method and device | |
CN105096614B (en) | Newly-built crossing traffic flow Forecasting Methodology based on generation moldeed depth belief network | |
CN101694652A (en) | Network resource personalized recommended method based on ultrafast neural network | |
CN108846526A (en) | A kind of CO2 emissions prediction technique | |
CN109086900B (en) | Electric power material guarantee and allocation platform based on multi-target particle swarm optimization algorithm | |
Mu et al. | Multi-objective ant colony optimization algorithm based on decomposition for community detection in complex networks | |
CN109215740A (en) | Full-length genome RNA secondary structure prediction method based on Xgboost | |
CN104881689A (en) | Method and system for multi-label active learning classification | |
CN105976070A (en) | Key-element-based matrix decomposition and fine tuning method | |
CN109101629A (en) | A kind of network representation method based on depth network structure and nodal community | |
CN116050670B (en) | Road maintenance decision method and system based on data driving | |
CN110147808A (en) | A kind of novel battery screening technique in groups | |
CN108681739A (en) | One kind recommending method based on user feeling and time dynamic tourist famous-city | |
CN115186097A (en) | Knowledge graph and reinforcement learning based interactive recommendation method | |
CN106202377A (en) | A kind of online collaborative sort method based on stochastic gradient descent | |
CN108062566A (en) | A kind of intelligent integrated flexible measurement method based on the potential feature extraction of multinuclear | |
Pumpuang et al. | Comparisons of classifier algorithms: Bayesian network, C4. 5, decision forest and NBTree for Course Registration Planning model of undergraduate students | |
CN108764280A (en) | A kind of medical data processing method and system based on symptom vector | |
CN114881359A (en) | GBDT and XGboost fused road surface IRI prediction method | |
Elayidom et al. | A generalized data mining framework for placement chance prediction problems | |
CN104966106A (en) | Biological age step-by-step predication method based on support vector machine | |
CN106203616A (en) | Neural network model training devices and method | |
CN113326919A (en) | Traffic travel mode selection prediction method based on computational graph | |
CN109919374A (en) | Prediction of Stock Price method based on APSO-BP neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |