LU505928B1 - A decision tree-based inference method for a full-section tunnel blasting plan - Google Patents
- Publication number
- LU505928B1 (publication) · LU505928A (application)
- Authority
- LU
- Luxembourg
- Prior art keywords
- decision tree
- inference
- information
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/04—Inference or reasoning models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/24323—Tree-organised classifiers
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention discloses a decision tree-based inference method for a full-section tunnel blasting plan. The specific steps include: obtaining a blasting data set and preprocessing it; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module for the decision tree based on the node information of the constructed decision tree; and performing logical analysis and automated inference on the blasting plan, based on the constructed decision tree and inference program module, to obtain a final inference plan.
The present invention constructs a decision tree based on the C4.5 algorithm from an acquired data set and, after constructing the decision tree and inference program module, performs logical analysis and automated inference, thereby obtaining the ideal blasting plan required by the user.
Description
BL-5800
LU505928
A DECISION TREE-BASED INFERENCE METHOD FOR A FULL-SECTION TUNNEL BLASTING PLAN
The invention relates to the technical field of tunnel blasting plans, and in particular to a decision tree-based inference method for a full-section tunnel blasting plan.
In recent years, with the rapid development of infrastructure construction industries such as railways, highways, and urban rail transit at home and abroad, tunnel excavation and construction tasks have become increasingly demanding. In many tunnel excavation projects, the drill and blast method plays a very important role. The blasting design work of drill and blast construction has therefore become an important factor restricting development.
The shortcoming of the existing technology is that there are prominent problems in the design of tunnel blasting plans: plans are designed based only on experience, so errors vary considerably with the experience level of the designer; novices lack auxiliary tools for blasting plan design; and the calculation workload of blasting plan design is very large, resulting in low design speed and efficiency. There is therefore an urgent need for a convenient and efficient automated blasting plan inference solution, and decision trees based on expert systems can solve this problem very well.
As an intelligent computer program system, an expert system uses artificial intelligence and computer technology to conduct logical analysis and judgment on actual production problems in a specific field, based on the theoretical knowledge or production experience provided by one or more experts in that field, in order to solve problems that otherwise only human experts can solve.
The purpose of the present invention is to overcome the shortcomings of the existing technology by providing a decision tree-based inference method for a full-section tunnel blasting plan.
A decision tree-based inference method for a full-section tunnel blasting plan, including the specific steps of: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module for the decision tree based on the node information of the constructed decision tree; and performing logical analysis and automated inference on the blasting plan, based on the constructed decision tree and inference program module, to obtain a final inference plan.
As a further solution of the present invention, the specific method of obtaining and preprocessing the blasting data set is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
As a further solution of the present invention, the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm include: first obtaining the proportion p_k (k = 1, 2, 3, ...) of the k-th category of samples in the data set D, and obtaining the expected information of the data set D as:
Ent(D) = -\sum_{k=1}^{|y|} p_k \log_2 p_k

where Ent(D) is the information entropy; the smaller the information entropy, the higher the purity of the data set D. The attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D are then used to divide it, obtaining x branch nodes, and the information entropy Ent(D^v) of each branch is calculated according to the information entropy formula; the proportion weight of branches with a large number of samples is then calculated
according to an information gain formula. The information gain formula is:
Gain(D, a) = Ent(D) - \sum_{v=1}^{V} \frac{|D^v|}{|D|} Ent(D^v)

where D^v represents the set of all samples whose value on attribute a is a_v, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a; then the attributes with high information gain obtained from the information gain formula are used to make a judgment and selection based on the information gain rate.
The information gain rate formula is:
Gain\_ratio(D, a) = \frac{Gain(D, a)}{IV(a)}

where

IV(a) = -\sum_{v=1}^{V} \frac{|D^v|}{|D|} \log_2 \frac{|D^v|}{|D|}
As a further solution of the present invention, the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: first comparing the calculated information gain values of the attributes of the data set, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data subset under the classification according to the attribute values of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of a node is zero, using that node as a leaf node, and completing the construction; and finally verifying the decision tree with post-pruning processing.
As a further solution of the present invention, the specific steps of using a programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree;
using a programming language to establish a decision-tree-based inference program module based on the above information; and outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree.
As a further solution of the present invention, the specific steps of performing logical analysis and automated inference on the blasting plan to obtain the final inference plan based on the constructed decision tree and inference program module include: the inference program module sequentially operating on all the information of the current blasting plan according to the decision tree, and sequentially outputting the information at each stage of the decision tree for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.
Compared with the prior art, the present invention has the following technical effects:
By using the above technical solution, a decision tree is constructed and verified by calculating the expected information, information gain, and information gain rate of the data set. The node information of the decision tree is then used to develop the inference program module, so as to conduct logical analysis and reasoning on actual problems arising in practice and to obtain the optimal solution based on the knowledge of experts.
Fig. 1 is a schematic diagram of the steps of the inference method for the full-section tunnel blasting plan according to some embodiments of the present application;
Fig. 2 is a flow chart of decision tree generation according to some embodiments of the present application.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, rather than all of them.
Referring to Figs. 1 and 2, in an embodiment of the present invention, a decision tree-based inference method for a full-section tunnel blasting plan is provided, which includes the following specific steps:
S1. obtaining a blasting data set and performing preprocessing;
The specific method is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
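Step S1 can be illustrated with a minimal sketch. The patent does not fix a file format, so CSV is assumed here purely for illustration, and the field names, records, and file name below are hypothetical:

```python
import csv

# Hypothetical attribute/category layout; the actual blasting data set and
# output format would be chosen according to project needs.
fields = ["days", "weather", "temperature", "humidity", "windy", "sport"]
records = [
    {"days": 1, "weather": "sunny", "temperature": "hot",
     "humidity": "high", "windy": "no", "sport": "no"},
    {"days": 3, "weather": "cloudy", "temperature": "hot",
     "humidity": "high", "windy": "no", "sport": "yes"},
]

# Organize the records by attribute columns and write them out
# in the chosen file format.
with open("blasting_dataset.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)
```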
S2. calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm;
In this embodiment, the calculation is explained through the data set of the relationship between weather and sport in Table 1 below:
Table 1 Data set of the relationship between weather and sport

days        weather        temperature    humidity       windy          sport
(data set)  (attribute 1)  (attribute 2)  (attribute 3)  (attribute 4)  (category)
1           sunny          hot            high           no             no
2           sunny          hot            high           yes            no
3           cloudy         hot            high           no             yes
4           rainy          warm           high           no             yes
5           rainy          cold           normal         no             yes
6           rainy          cold           normal         yes            no
7           cloudy         cold           normal         yes            yes
8           sunny          warm           high           no             no
9           sunny          cold           normal         no             yes
10          rainy          cold           normal         no             yes
11          sunny          warm           normal         yes            yes
12          cloudy         warm           high           yes            yes
13          cloudy         hot            normal         no             yes
14          rainy          warm           high           yes            no
As shown in Table 1 above, in the relationship between weather and sport, the days column is the data set; weather, temperature, humidity, and windiness are the individual attributes; and whether to play sports is the final conclusion. The conclusions are divided into two categories, yes or no, and are determined by the preceding attributes. The decision tree reaches the final conclusion through successive decisions on the attribute values. But the question that needs
to be considered is which attribute is most effective to place at the start of the decision process. This is confirmed through the following steps.
The specific steps include: calculating the information gain. Information gain is an attribute selection metric and also the most commonly used index for measuring data sets. The proportion p_k (k = 1, 2, 3, ...) of the k-th category of samples in the data set D is obtained first, and the expected information of the data set D is obtained as:
Ent(D) = -\sum_{k=1}^{|y|} p_k \log_2 p_k

where Ent(D) is the information entropy; the smaller the information entropy, the higher the purity of the data set D. The attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D are then used to divide it, obtaining x branch nodes, and the information entropy Ent(D^v) of each branch is calculated according to the information entropy formula.
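As a concrete check of the entropy formula, the sport column of Table 1 contains 9 "yes" and 5 "no" samples, so Ent(D) = -(9/14)log2(9/14) - (5/14)log2(5/14) ≈ 0.940. A minimal sketch (the function name `entropy` is illustrative):

```python
from math import log2

def entropy(labels):
    """Ent(D) = -sum(p_k * log2(p_k)) over the class proportions of labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Category column of Table 1: 9 "yes" and 5 "no" out of 14 days.
sport = ["no", "no", "yes", "yes", "yes", "no", "yes",
         "no", "yes", "yes", "yes", "yes", "yes", "no"]
print(f"{entropy(sport):.3f}")  # 0.940
```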
Specifically, if attribute a is set to have x different values a1, a2, a3, ..., ax, and attribute a is used to divide the data set D, x branch nodes will be generated, and each branch node is a collection of samples with the same value. According to the information entropy formula, the information entropy Ent(D^v) of each branch can be calculated separately. Because the number of samples contained in each branch differs, different weights need to be assigned so that a branch with a larger number of samples has a matching weight. The information gain formula reflects this feature well.
The proportion weight of branches with a large number of samples is calculated according to an information gain formula. The information gain formula is:
Gain(D, a) = Ent(D) - \sum_{v=1}^{V} \frac{|D^v|}{|D|} Ent(D^v)

where D^v represents the set of all samples whose value on attribute a is a_v, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a.
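Applying the information gain formula to the "weather" attribute of Table 1: the sunny and rainy branches each hold 5 samples with entropy 0.971, and the cloudy branch holds 4 pure samples, giving Gain(D, weather) = 0.940 - (5/14)·0.971 - 0 - (5/14)·0.971 ≈ 0.247. A sketch with illustrative function names:

```python
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((labels.count(v) / n) * log2(labels.count(v) / n)
                for v in set(labels))

def info_gain(rows, attr, target):
    """Gain(D, a) = Ent(D) - sum(|D_v|/|D| * Ent(D_v)) over values of a."""
    n = len(rows)
    gain = entropy([r[target] for r in rows])
    for value in set(r[attr] for r in rows):
        subset = [r[target] for r in rows if r[attr] == value]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Weather and sport columns of Table 1, in day order.
rows = [{"weather": w, "sport": s} for w, s in [
    ("sunny", "no"), ("sunny", "no"), ("cloudy", "yes"), ("rainy", "yes"),
    ("rainy", "yes"), ("rainy", "no"), ("cloudy", "yes"), ("sunny", "no"),
    ("sunny", "yes"), ("rainy", "yes"), ("sunny", "yes"), ("cloudy", "yes"),
    ("cloudy", "yes"), ("rainy", "no")]]
print(f"{info_gain(rows, 'weather', 'sport'):.3f}")  # 0.247
```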
Generally, the greater the information gain, the better the effect of using the attribute as a node splitting attribute. However, simply using information gain as an attribute selection metric has the disadvantage of favoring attributes with a large number of values during the decision tree branching process. That is, the more distinct values an attribute has, the more likely it is to be used as a splitting attribute.
Because of this shortcoming of using information gain alone as an attribute selection metric, the information gain rate can be used as the criterion for judging splitting attributes.
The attributes with high information gain obtained from the information gain formula are then used to make a judgment and selection based on the information gain rate.
The information gain rate formula is:
Gain\_ratio(D, a) = \frac{Gain(D, a)}{IV(a)}

where

IV(a) = -\sum_{v=1}^{V} \frac{|D^v|}{|D|} \log_2 \frac{|D^v|}{|D|}
It should be noted that the information gain rate is biased towards attributes with a small number of values when judging and analyzing attributes. Therefore, the C4.5 algorithm does not directly select the information gain rate as the attribute selection metric. Instead, it first selects attributes with higher information gain among the attributes to be divided, and then makes a judgment selection based on the information gain rate among the selected attributes.
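The gain rate can be checked on the "weather" attribute of Table 1: IV(weather) = -(5/14)log2(5/14) - (4/14)log2(4/14) - (5/14)log2(5/14) ≈ 1.577, so the gain rate is about 0.247 / 1.577 ≈ 0.156. A sketch with illustrative function names:

```python
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((labels.count(v) / n) * log2(labels.count(v) / n)
                for v in set(labels))

def gain_and_ratio(rows, attr, target):
    """Return (Gain(D, a), Gain_ratio(D, a)) with Gain_ratio = Gain / IV(a)."""
    n = len(rows)
    gain = entropy([r[target] for r in rows])
    iv = 0.0
    for value in set(r[attr] for r in rows):
        subset = [r[target] for r in rows if r[attr] == value]
        w = len(subset) / n
        gain -= w * entropy(subset)
        iv -= w * log2(w)          # IV(a) = -sum over branches of w_v * log2(w_v)
    return gain, gain / iv

# Weather and sport columns of Table 1, in day order.
weather = ["sunny", "sunny", "cloudy", "rainy", "rainy", "rainy", "cloudy",
           "sunny", "sunny", "rainy", "sunny", "cloudy", "cloudy", "rainy"]
sport = ["no", "no", "yes", "yes", "yes", "no", "yes",
         "no", "yes", "yes", "yes", "yes", "yes", "no"]
rows = [{"weather": w, "sport": s} for w, s in zip(weather, sport)]
gain, ratio = gain_and_ratio(rows, "weather", "sport")
print(f"{gain:.3f} {ratio:.3f}")  # 0.247 0.156
```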
S3. constructing and verifying a decision tree based on the calculation results of the data set. The specific steps include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node;
repeating the above steps until the information gain value of the node is zero, that is, there is only one category, using this node as a leaf node, and completing the construction; finally verifying the decision tree with post-pruning processing.
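The recursive construction in step S3 can be sketched as follows. This is a simplified illustration in which a node becomes a leaf once its samples belong to a single category (i.e., its information gain would be zero); for brevity it splits on information gain alone and omits the gain-rate refinement and post-pruning, and all names are hypothetical:

```python
import math
from collections import Counter

def _entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def _gain(rows, labels, a):
    """Information gain of splitting the sample set on attribute column a."""
    n = len(labels)
    parts = {}
    for row, y in zip(rows, labels):
        parts.setdefault(row[a], []).append(y)
    return _entropy(labels) - sum(len(p) / n * _entropy(p) for p in parts.values())

def build_tree(rows, labels, attrs, names):
    """Recursively split on the best attribute; a pure node (only one
    category left) becomes a leaf, as in step S3."""
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]   # leaf node
    best = max(attrs, key=lambda a: _gain(rows, labels, a))
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[best], ([], []))
        groups[row[best]][0].append(row)
        groups[row[best]][1].append(y)
    rest = [a for a in attrs if a != best]
    return {"attr": names[best],
            "children": {v: build_tree(rs, ys, rest, names)
                         for v, (rs, ys) in groups.items()}}
```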
S4. using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree. The specific steps include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree, and then using a programming language to establish a decision tree-based inference program module from this information.
In specific implementations, the programming language can be C, C++, or another language suitable for programming modules.
The information of the decision tree is output by the obtained inference program module. The output can be text in TXT format, and finally the current decision tree is generated.
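Writing the tree out as plain text might look like the following sketch, assuming the tree is represented as nested dicts with hypothetical "attr"/"children" keys and string leaves (the representation is ours, not prescribed by the patent):

```python
def dump_tree(node, indent=0, lines=None):
    """Render a nested-dict decision tree as indented plain text,
    suitable for writing to a TXT file."""
    if lines is None:
        lines = []
    pad = "  " * indent
    if not isinstance(node, dict):          # leaf: the inference conclusion
        lines.append(pad + "-> " + str(node))
        return lines
    for value, child in node["children"].items():
        lines.append(pad + node["attr"] + " = " + str(value))
        dump_tree(child, indent + 1, lines)
    return lines
```

Joining the returned lines with newlines and writing them to a `.txt` file yields the text output described above.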
S5. performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module. The specific steps include:
According to the decision tree and inference program module, all the information of the current blasting plan is processed sequentially by the inference program module, and the information at each stage of the decision tree is output in turn for logical analysis and automated inference, until a blasting plan that meets the user's requirements is obtained.
Automated inference for blasting plans:
In this embodiment, specific cutting form selection regarding inclined cutting is taken as an example. In actual projects, if it is known that the tunnel is being tunneled horizontally, the rock mass has horizontal joints, and the specific cutting form of the inclined cutting needs to be determined, then only this decision tree needs to be selected. For example, in the first attribute "Whether the rock mass is tunneled downwards", select "No" and click Next; then in the second attribute "Whether the rock mass is horizontally jointed", select "Yes" and click Next. The inference conclusion can then be drawn: adopt the form of vertical wedge-shaped cutting. From the decision tree
representation diagram below, the attribute selection branches and results of each step can be seen intuitively.
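The question-and-answer walk in this example can be sketched as a simple loop over a nested-dict tree representation (the attribute names and dict layout here are illustrative assumptions, not the patent's data structure), descending one level per answered attribute until a leaf conclusion is reached:

```python
def infer(tree, answers):
    """Descend the decision tree using the user's answers
    (attribute name -> chosen value) until a leaf conclusion is reached."""
    node = tree
    while isinstance(node, dict):       # internal node: ask its attribute
        node = node["children"][answers[node["attr"]]]
    return node                         # leaf: the inferred conclusion
```

For the walk described above — "tunneled downwards" = No, "horizontally jointed" = Yes — the loop would return the leaf holding the vertical wedge-shaped cutting conclusion.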
The final decision trees are formed, as shown in Table 2 below. A total of 12 decision trees are generated in the overall program. The left column indicates the function that each decision tree achieves, and the right column indicates the attributes that need to be referenced. They are, respectively:
The first decision tree is "determining tunnel excavation method". Based on the surrounding rock, tunnel cross-sectional area, tunnel length, and surrounding rock grade, the basic method of tunnel excavation can be determined.
The second decision tree is "determining the form of wedge-shaped cut based on cyclic footage". On the premise of determining the wedge-shaped cut, the number of wedge-shaped cuts can be determined based on lithology and cyclic footage.
The third decision tree is "determining the blast hole depth based on the air leg rock drill". On the premise of using the air leg rock drill for excavation, the blast hole depth range can be determined based on the rock solidity coefficient and the excavation cross-sectional area.
The fourth decision tree is "determining the parameters of wedge cut based on the rock solidity coefficient". On the premise of determining the wedge cut, the number of blast holes in the wedge cut, the angle between the blast holes and the working surface, and the distance between the two rows of blast holes can be determined based on the rock solidity coefficient.
The fifth decision tree is "selecting specific cut forms for inclined cut". The specific cut form can be determined based on the determination of using inclined cut.
The sixth decision tree is "determining the distance beyond the contour line of the hole bottom based on the rock solidity coefficient in light blasting", which allows the distance beyond the contour line of the hole bottom to be queried based on the rock solidity coefficient.
The seventh decision tree is "determining the parameters of hollow large-diameter straight hole cutting based on the surrounding rock level", which, on the premise of straight hole cutting, can give specific parameters of the hollow hole based on the surrounding rock level.
The eighth decision tree is "determining the control height of the entrance slope and heading slope based on the surrounding rock grade", which can determine the specific control height based on the surrounding rock grade and slope.
The ninth decision tree is "determining noise specified limit", which can provide a query of the maximum noise limit based on the rock drill type and weight.
The tenth decision tree is "determining the filling length", which can deduce the filling length of blast mud in the blast hole.
The eleventh decision tree is "determining the parameters of the tapered cut hole based on the rock solidity coefficient". On the premise of determining the use of the tapered cut, the appropriate inclination angle and hole bottom spacing of the blast holes for the cut hole can be obtained based on the rock solidity coefficient.
The twelfth decision tree is "determining the photoblast parameters based on data from the Ma'anshan Mining Research Institute", which can determine the specific data of photoblast holes based on the span of the rock mass.
Table 2 Summary of Decision Trees

Function of the decision tree | Attributes referenced
---|---
determining the tunnel excavation method | surrounding rock, tunnel cross-sectional area, tunnel length, surrounding rock grade
determining the form of wedge-shaped cutting | lithology, cyclic footage
determining the blast hole depth based on air leg rock drill | rock solidity coefficient, excavation cross-section area
parameters of the wedge-shaped cutting | rock solidity coefficient
selection of specific cutting forms for inclined cutting | excavation direction, horizontal or vertical rock mass joints
distance beyond the outline of the hole bottom | rock solidity coefficient
determining the parameters of medium hole and large diameter straight hole cutting | surrounding rock grade
determining the control height of the entrance slope and heading slope | surrounding rock grade and slope
rock drill noise regulations limits | rock drill type, rock drill weight
determining packing length | whether it is light blast hole, pre-split blast hole, blast hole depth
determining the parameters of the tapered cut hole | rock solidity coefficient
determining the photoblast parameters based on data from Ma'anshan mining research institute | rock quality, excavation location, excavation span
Although the embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions and variants can be made to these embodiments without departing from the principles and spirit of the invention.
The scope of the present invention is defined by the appended claims and their equivalents, and such modifications should all fall within the protection scope of the present invention.
Claims (6)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210467781.XA CN114841349A (en) | 2022-04-29 | 2022-04-29 | Inference method of tunnel full-section blasting scheme based on decision tree |
Publications (2)
Publication Number | Publication Date |
---|---|
LU505928A1 LU505928A1 (en) | 2024-01-09 |
LU505928B1 true LU505928B1 (en) | 2024-04-29 |
Family
ID=82568546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
LU505928A LU505928B1 (en) | 2022-04-29 | 2023-03-15 | A decision tree-based inference method for a full-section tunnel blasting plan |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN114841349A (en) |
LU (1) | LU505928B1 (en) |
WO (1) | WO2023207387A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114841349A (en) * | 2022-04-29 | 2022-08-02 | 中铁四局集团有限公司 | Inference method of tunnel full-section blasting scheme based on decision tree |
CN117592163B (en) * | 2023-12-04 | 2024-04-16 | 南宁轨道交通建设有限公司 | Auxiliary decision method for treating longitudinal differential settlement of shield tunnel |
CN117973044B (en) * | 2024-02-04 | 2024-06-14 | 中南大学 | Tunnel intelligent blasting design method and laser positioning equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113191771A (en) * | 2021-03-19 | 2021-07-30 | 青岛檬豆网络科技有限公司 | Buyer account period risk prediction method |
CN114841349A (en) * | 2022-04-29 | 2022-08-02 | 中铁四局集团有限公司 | Inference method of tunnel full-section blasting scheme based on decision tree |
2022
- 2022-04-29: CN application CN202210467781.XA filed (active, pending)
2023
- 2023-03-15: WO application PCT/CN2023/081528 filed (status unknown)
- 2023-03-15: LU application LU505928A filed (active, IP right granted)
Also Published As
Publication number | Publication date |
---|---|
WO2023207387A1 (en) | 2023-11-02 |
CN114841349A (en) | 2022-08-02 |
LU505928A1 (en) | 2024-01-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FG | Patent granted |
Effective date: 20240429 |