LU505928B1 - A decision tree-based inference method for a full-section tunnel blasting plan - Google Patents


Info

Publication number
LU505928B1
Authority
LU
Luxembourg
Prior art keywords
decision tree
information
inference
Prior art date
Application number
LU505928A
Other languages
French (fr)
Other versions
LU505928A1 (en)
Inventor
Xianming Lin
Yong Wang
Zhichao Xu
Wenyin Chen
Cheng Yu
Zhongjie Yang
Xifei Deng
Xinghuo Xu
Gaofeng Zhao
Original Assignee
The Seventh Eng Co Ltd Of China Tiesiju Civil Eng Group
Univ Tianjin
China Tiesiju Civil Eng Group Co Ltd
Application filed by The Seventh Eng Co Ltd Of China Tiesiju Civil Eng Group, Univ Tianjin, and China Tiesiju Civil Eng Group Co Ltd
Publication of LU505928A1
Application granted
Publication of LU505928B1


Classifications

    • G06N 5/04: Inference or reasoning models (computing arrangements using knowledge-based models)
    • G06F 18/214: Generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/24323: Tree-organised classifiers
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Devices For Executing Special Programs (AREA)
  • Monitoring And Testing Of Transmission In General (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a decision tree-based inference method for a full-section tunnel blasting plan. The specific steps include: obtaining a blasting data set and preprocessing it; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; and performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module.
The present invention constructs a decision tree based on the C4.5 algorithm from the obtained data set and, after constructing the decision tree and inference program module, performs logical analysis and automated inference, thereby obtaining the ideal blasting plan required by the user.

Description

BL-5800 1

LU505928

A DECISION TREE-BASED INFERENCE METHOD FOR A FULL-SECTION TUNNEL BLASTING PLAN

Technical field

The invention relates to the technical field of tunnel blasting plans, and in particular to a decision tree-based inference method for a full-section tunnel blasting plan.

Background

In recent years, with the rapid development of infrastructure construction industries such as railways, highways and urban rail transit at home and abroad, the tasks of tunnel excavation and construction have become increasingly arduous. At present, in many tunnel excavation projects, the drill and blast method plays a very important role. Therefore, the blasting design work of drill and blast construction has become an important factor restricting its development.

The shortcoming of the existing technology is that there are some prominent problems in the design of tunnel blasting plans. For example, blasting plans are designed based only on experience, so there are relatively large errors due to the different experience levels of designers, and novices lack auxiliary tools for blasting plan design. The calculation workload of blasting plan design is also very large, resulting in low design speed and efficiency. Therefore, there is an urgent need for a convenient and efficient automated blasting plan inference solution, and decision trees based on expert systems can solve this problem very well.
As an intelligent computer program system, the expert system uses artificial intelligence and computer technology to conduct logical analysis and judgment on actual production problems in a specific field, based on the theoretical knowledge or production experience provided by one or more experts in that field, in order to solve problems that otherwise only human experts can solve.

Summary of invention

The purpose of the present invention is to overcome the shortcomings of the existing technology and to provide a decision tree-based inference method for a full-section tunnel blasting plan.


A decision tree-based inference method for a full-section tunnel blasting plan includes the following specific steps: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; and performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module.

As a further solution of the present invention, the specific method of obtaining and preprocessing the blasting data set is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.

As a further solution of the present invention, the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm include: obtaining the proportion p_k (k = 1, 2, 3, ...) of the k-th category samples in the data set D first, and obtaining the expected information of the data set D as:

$$\mathrm{Ent}(D) = -\sum_{k=1}^{|\mathcal{Y}|} p_k \log_2 p_k$$

where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set D; then using the attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D to divide it, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of different branches according to the information entropy formula; calculating the proportion weight of branches with a large number of samples

according to an information gain formula. The information gain formula is:

$$\mathrm{Gain}(D,a) = \mathrm{Ent}(D) - \sum_{v=1}^{V} \frac{|D^v|}{|D|}\,\mathrm{Ent}(D^v)$$

where D^v represents the set of all samples whose value on attribute a is a_v, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a; then using the attributes with high information gain obtained from the information gain formula to make judgment and selection based on the information gain rate.

The information gain rate formula is:

$$\mathrm{Gain\_ratio}(D,a) = \frac{\mathrm{Gain}(D,a)}{\mathrm{IV}(a)}$$

where

$$\mathrm{IV}(a) = -\sum_{v=1}^{V} \frac{|D^v|}{|D|} \log_2 \frac{|D^v|}{|D|}$$

As a further solution of the present invention, the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each subset under the classification according to the attribute values of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of a node is zero, using this node as a leaf node, and completing the construction; finally, verifying the decision tree by post-pruning processing.

As a further solution of the present invention, the specific steps of using a programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree;

using a programming language to establish a decision tree-based inference program module from the above information; and outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree.

As a further solution of the present invention, the specific steps of performing logical analysis and automated inference on the blasting plan to obtain the final inference plan based on the constructed decision tree and inference program module include: processing all the information of the current blasting plan sequentially with the inference program module according to the decision tree, and outputting the information at each stage of the decision tree sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.

Compared with the prior art, the present invention has the following technical effects:

By using the above technical solution, a decision tree is constructed and verified by calculating the expected information, information gain, and information gain rate of the data set. Then the node information of the decision tree is used to develop the inference program module, so as to conduct logical analysis and reasoning on actual production problems and obtain the optimal solution based on expert knowledge.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the steps of the inference method of the full-section tunnel blasting plan according to some embodiments of the present application;

Fig. 2 is a flow chart of decision tree generation according to some embodiments of the present application.

Detailed Description of the Embodiments

The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only

some of the embodiments of the present invention, rather than all the embodiments.

Referring to Figs. 1 and 2, in the embodiment of the present invention, a decision tree-based inference method for a full-section tunnel blasting plan is provided, which includes the following specific steps:

S1. obtaining a blasting data set and performing preprocessing;

The specific method is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
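As a minimal sketch of this preprocessing step, the code below organizes collected records into fixed attribute and category columns and writes them out in one concrete file format (CSV). The column names follow the Table 1 example used later in this embodiment; the sample record and file name are illustrative assumptions, not data from the patent.

```python
import csv

# Attribute and category columns, borrowed from the Table 1 example below;
# a real blasting data set would have its own columns.
ATTRIBUTES = ["weather", "temperature", "humidity", "windy"]
CATEGORY = "sport"

def preprocess(records, path):
    """Organize collected records by attribute/category and write one
    output file format (CSV here) according to actual needs."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=ATTRIBUTES + [CATEGORY])
        writer.writeheader()
        for rec in records:
            # keep only the expected fields, in a fixed column order
            writer.writerow({k: rec[k] for k in ATTRIBUTES + [CATEGORY]})

preprocess(
    [{"weather": "sunny", "temperature": "hot", "humidity": "high",
      "windy": "no", "sport": "no"}],
    "blasting_dataset.csv",
)
```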

S2. calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm;

In this embodiment, the calculation is explained through the data set of the relationship between weather and sport in Table 1 below:

Table 1 Dataset of the relationship between weather and sport

days        weather        temperature    humidity       windy          sport
(data set)  (attribute 1)  (attribute 2)  (attribute 3)  (attribute 4)  (category)
1           sunny          hot            high           no             no
2           sunny          hot            high           yes            no
3           cloudy         hot            high           no             yes
4           rainy          warm           high           no             yes
5           rainy          cold           normal         no             yes
6           rainy          cold           normal         yes            no
7           cloudy         cold           normal         yes            yes
8           sunny          warm           high           no             no
9           sunny          cold           normal         no             yes
10          rainy          cold           normal         no             yes
11          sunny          warm           normal         yes            yes
12          cloudy         warm           high           yes            yes
13          cloudy         hot            normal         no             yes
14          rainy          warm           high           yes            no

As shown in Table 1 above, in the relationship between weather and sport, the number of days is a data set, while weather, temperature, humidity and wind are individual attributes, and whether to play sports is the final conclusion. Conclusions are divided into two categories based on yes or no, and the conclusions are determined by the previous attributes. The decision tree is to reach the final conclusion through continuous decision-making of previous attribute values. But the question that needs

to be considered is which attribute is the most effective to put at the beginning of the decision. This is confirmed through the following steps.

Specific steps include calculating the information gain. Information gain is an attribute selection metric and is also the most commonly used index for measuring data sets. The proportion p_k (k = 1, 2, 3, ...) of the k-th category samples in the data set D is obtained first, and the expected information of the data set D is obtained as:

$$\mathrm{Ent}(D) = -\sum_{k=1}^{|\mathcal{Y}|} p_k \log_2 p_k$$

where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set D; then using the attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D to divide it, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of different branches according to the information entropy formula.

Specifically, if attribute a is set to have x different values a1, a2, a3, ..., ax, and attribute a is used to divide the data set D, x branch nodes will be generated, and each branch node is a collection of samples with the same value. According to the information entropy formula, the information entropy Ent(D^v) of the different branches can be calculated separately. Because the number of samples contained in each branch is different, different weights need to be assigned so that a branch with a larger number of samples has a matching weight. The information gain formula reflects this feature well.
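The entropy formula above can be checked numerically on the Table 1 data set, where 9 of the 14 days have sport = "yes" and 5 have "no". The sketch below is an illustration of the formula, not part of the patent; the helper name is assumed.

```python
from math import log2

def entropy(class_counts):
    """Ent(D) = -sum_k p_k * log2(p_k), taken over the class proportions."""
    total = sum(class_counts)
    return -sum((c / total) * log2(c / total) for c in class_counts if c > 0)

# Table 1: 9 days with sport = "yes", 5 days with sport = "no"
ent_d = entropy([9, 5])
print(round(ent_d, 3))  # → 0.94
```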

The proportion weight of branches with a large number of samples is calculated according to an information gain formula. The information gain formula is:

$$\mathrm{Gain}(D,a) = \mathrm{Ent}(D) - \sum_{v=1}^{V} \frac{|D^v|}{|D|}\,\mathrm{Ent}(D^v)$$

where D^v represents the set of all samples whose value on attribute a is a_v, V

represents the number of branches under attribute a, and v represents the v-th branch under attribute a.

Generally, the greater the information gain, the better the effect of using this attribute as a node splitting attribute. Simply using information gain as an attribute selection metric also has the disadvantage of favoring attributes with a large number of values during the decision tree branching process. That is, the more different values there are in an attribute, the more likely it is that the attribute will be used as a splitting attribute.
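To make the information gain formula concrete, the sketch below evaluates it for the "weather" attribute of Table 1, whose branches have class counts sunny 2 yes/3 no, cloudy 4 yes/0 no, and rainy 3 yes/2 no; the function names are illustrative assumptions.

```python
from math import log2

def entropy(counts):
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def information_gain(parent_counts, branch_counts):
    """Gain(D, a) = Ent(D) - sum_v (|D^v| / |D|) * Ent(D^v)."""
    total = sum(parent_counts)
    weighted = sum(sum(b) / total * entropy(b) for b in branch_counts)
    return entropy(parent_counts) - weighted

# Table 1, split on "weather": sunny, cloudy, rainy branches
gain_weather = information_gain([9, 5], [[2, 3], [4, 0], [3, 2]])
print(round(gain_weather, 3))  # → 0.247
```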

Due to the shortcomings of using only the information gain as an attribute selection metric, the information gain rate can be selected as the method for judging splitting attributes.

The attributes with high information gain obtained from the information gain formula are then used to make a judgment and selection based on the information gain rate.

The information gain rate formula is:

$$\mathrm{Gain\_ratio}(D,a) = \frac{\mathrm{Gain}(D,a)}{\mathrm{IV}(a)}$$

where

$$\mathrm{IV}(a) = -\sum_{v=1}^{V} \frac{|D^v|}{|D|} \log_2 \frac{|D^v|}{|D|}$$

It should be noted that the information gain rate is biased towards attributes with a small number of values when judging and analyzing attributes. Therefore, the C4.5 algorithm does not directly select the information gain rate as the attribute selection metric. Instead, it first selects attributes with higher information gain among the attributes to be divided, and then makes a judgment selection based on the information gain rate among the selected attributes.
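The two-stage selection described here can be sketched as follows: keep only the attributes whose information gain is at least the average, then pick the highest gain ratio among them. The gain and IV numbers are assumed, rounded values for the four Table 1 attributes, computed from the earlier formulas.

```python
def c45_select(attributes):
    """C4.5-style selection: keep attributes whose information gain is at
    least the average, then pick the highest information gain rate."""
    avg_gain = sum(g for g, _ in attributes.values()) / len(attributes)
    candidates = {a: (g, iv) for a, (g, iv) in attributes.items()
                  if g >= avg_gain}
    # gain ratio = Gain / IV; guard against IV == 0 (single-valued attribute)
    return max(candidates,
               key=lambda a: candidates[a][0] / candidates[a][1]
               if candidates[a][1] > 0 else 0.0)

# Assumed (rounded) gain / IV values for the four Table 1 attributes
attrs = {
    "weather":     (0.247, 1.577),
    "temperature": (0.029, 1.577),
    "humidity":    (0.152, 1.000),
    "windy":       (0.048, 0.985),
}
print(c45_select(attrs))  # → weather
```

Note that "humidity" survives the first stage here, but "weather" wins on gain ratio.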

S3. constructing and verifying a decision tree based on the calculation results of the data set. The specific steps include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each subset under the classification according to the attribute values of the root node, and using the attribute with the largest information gain rate as the next-level node;

repeating the above steps until the information gain value of the node is zero, that is, there is only one category, and using this node as a leaf node, and completing the construction; finally verifying the post-pruning processing of the decision tree.
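The construction loop of step S3 can be sketched as a recursive procedure. For brevity this illustration splits on information gain only (the method described above additionally ranks non-root candidates by information gain rate) and assumes non-empty, conflict-free training rows; the data structures are illustrative assumptions, not the patent's implementation.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <memory>
#include <set>
#include <string>
#include <vector>

// A training row: attribute-name -> value, plus a class label.
struct Row { std::map<std::string, std::string> attrs; std::string label; };

struct Node {
    std::string attribute;                                  // empty on leaves
    std::string label;                                      // set on leaves
    std::map<std::string, std::unique_ptr<Node>> children;  // value -> child
};

static double entropy(const std::vector<Row>& rows) {
    std::map<std::string, int> counts;
    for (const auto& r : rows) counts[r.label]++;
    double ent = 0.0;
    for (const auto& [l, n] : counts) {
        double p = static_cast<double>(n) / rows.size();
        ent -= p * std::log2(p);
    }
    return ent;
}

// Recursive construction: when the node is pure (any further split would have
// zero information gain) emit a leaf, as in step S3.
std::unique_ptr<Node> build(const std::vector<Row>& rows,
                            std::set<std::string> attrs) {
    auto node = std::make_unique<Node>();
    if (entropy(rows) == 0.0 || attrs.empty()) {  // only one category left
        node->label = rows.front().label;
        return node;
    }
    // pick the attribute with the largest information gain
    std::string best; double bestGain = -1.0;
    for (const auto& a : attrs) {
        std::map<std::string, std::vector<Row>> parts;
        for (const auto& r : rows) parts[r.attrs.at(a)].push_back(r);
        double rem = 0.0;
        for (const auto& [v, sub] : parts)
            rem += static_cast<double>(sub.size()) / rows.size() * entropy(sub);
        double gain = entropy(rows) - rem;
        if (gain > bestGain) { bestGain = gain; best = a; }
    }
    node->attribute = best;
    attrs.erase(best);
    std::map<std::string, std::vector<Row>> parts;
    for (const auto& r : rows) parts[r.attrs.at(best)].push_back(r);
    for (const auto& [v, sub] : parts)
        node->children[v] = build(sub, attrs);
    return node;
}
```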

S4. using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree. The specific steps include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree; then using a programming language to establish an inference program module for the decision tree from the above information.

In specific implementations, the programming language can be C, C++, or another language suitable for the program modules.

The information of the decision tree is output by the obtained inference program module. The output can be text in TXT format, and finally the current decision tree is generated.
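A minimal sketch of such a per-node record and its TXT-style dump is shown below in C++, one of the languages the embodiment names. The exact field layout and the tab-separated format are assumptions for illustration, not the format used by the actual module.

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Per-node record as described in step S4: the attribute value on the
// incoming branch, the attribute name asked at the node, a discrimination
// number, and the discrimination numbers of the child nodes.
struct NodeInfo {
    std::string branch_value;
    std::string attribute;
    int id = 0;
    std::vector<int> children;
};

// Emit the flat node table as plain text, one node per line, suitable for a
// TXT-format dump of the current decision tree.
std::string dump(const std::vector<NodeInfo>& nodes) {
    std::ostringstream out;
    for (const auto& n : nodes) {
        out << n.id << '\t' << n.attribute << '\t' << n.branch_value << '\t';
        for (size_t i = 0; i < n.children.size(); ++i)
            out << (i ? "," : "") << n.children[i];
        out << '\n';
    }
    return out.str();
}
```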

S5. performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module. The specific steps include:

According to the decision tree and inference program module, all the information of the current blasting plan is processed sequentially by the inference program module, and the information at each stage of the decision tree is output sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.

Automated inference for blasting plans:

In this embodiment, the selection of a specific cutting form for inclined cutting is taken as an example. In actual projects, if it is known that the tunnel is being tunneled horizontally, the rock mass has horizontal joints, and the specific cutting form of the inclined cutting needs to be determined, then only this decision tree needs to be selected. For example, for the first attribute "Whether the rock mass is tunneled downwards", select "No" and click Next; then, for the second attribute "Whether the rock mass is horizontally jointed", select "Yes" and click Next. The inference conclusion can then be drawn: adopt the form of vertical wedge-shaped cutting. From the decision tree representation diagram below, the attribute selection branches and results of each step can be seen intuitively.
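The question-and-answer traversal just described can be sketched as follows. The node structure and the attribute strings are hypothetical stand-ins for the program's actual data; in an interactive session the answers would come from the user's "Next" clicks rather than a prefilled map.

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>

// One decision-tree node: a question attribute with value-labelled children,
// or a leaf holding the inferred conclusion.
struct Node {
    std::string attribute;   // empty on a leaf
    std::string conclusion;  // set only on leaves
    std::map<std::string, std::shared_ptr<Node>> children;  // answer -> child
};

// Walk the tree with the user's answers until a leaf is reached.
// Throws std::out_of_range if an answer is missing or has no branch.
std::string infer(const Node& root,
                  const std::map<std::string, std::string>& answers) {
    const Node* node = &root;
    while (!node->attribute.empty()) {
        const std::string& a = answers.at(node->attribute);
        node = node->children.at(a).get();
    }
    return node->conclusion;
}
```

Mirroring the worked example above, answering "No" to the tunneling-direction attribute and "Yes" to the horizontal-joints attribute walks the tree to the vertical wedge-shaped cutting conclusion.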

The final decision trees are formed, as shown in Table 2 below. A total of 12 decision trees are generated in the overall program. The left column indicates the function that each decision tree achieves, and the right column indicates the attributes that need to be referenced. They are, respectively:

The first decision tree is "determining tunnel excavation method". Based on the surrounding rock, tunnel cross-sectional area, tunnel length, and surrounding rock grade, the basic method of tunnel excavation can be determined.

The second decision tree is "determining the form of wedge-shaped cut based on cyclic footage". On the premise of determining the wedge-shaped cut, the number of wedge-shaped cuts can be determined based on lithology and cyclic footage.

The third decision tree is "determining the blast hole depth based on the air leg rock drill". On the premise of using the air leg rock drill for excavation, the blast hole depth range can be determined based on the rock solidity coefficient and the excavation cross-sectional area.

The fourth decision tree is "determining the parameters of wedge cut based on the rock solidity coefficient". On the premise of determining the wedge cut, the number of blast holes in the wedge cut, the angle between the blast holes and the working surface, and the distance between the two rows of blast holes can be determined based on the rock solidity coefficient.

The fifth decision tree is "selecting specific cut forms for inclined cut". The specific cut form can be determined based on the determination of using inclined cut.

The sixth decision tree, "determining the distance beyond the contour line of the hole bottom based on the rock solidity coefficient in light explosion", can query the distance beyond the contour line of the hole bottom based on the rock solidity coefficient.

The seventh decision tree, "determining the parameters of hollow large-diameter straight hole cutting based on the surrounding rock level", can give the specific parameters of the hollow hole based on the surrounding rock level, on the premise of straight hole cutting.

The eighth decision tree, "determining the control height of the entrance slope and heading slope based on the surrounding rock grade", can determine the specific control height based on the surrounding rock grade and slope.

The ninth decision tree, "determining noise specified limit", can provide a query about the maximum noise limit based on the rock drill type and quality.

The tenth decision tree, "determining the filling length", can deduce the filling length of blast mud in the blast hole.

The eleventh decision tree is "determining the parameters of the tapered cut hole based on the rock solidity coefficient". On the premise of determining the use of tapered cutouts, the appropriate inclination angle and hole bottom spacing of the blast hole for the cutout hole can be obtained based on the rock solidity coefficient.

The twelfth decision tree, "determining the photoblast parameters based on data from the Ma'anshan Mining Research Institute", can determine the specific data of photoblast holes based on the span of the rock mass.

Table 2 Summary of Decision Trees

determining the tunnel excavation method: surrounding rock, tunnel cross-sectional area, tunnel length, surrounding rock grade
determining the form of wedge-shaped cutting: lithology, cyclic footage
determining the blast hole depth based on the air leg rock drill: rock solidity coefficient, excavation cross-section area
parameters of the wedge-shaped cutting: rock solidity coefficient
selection of specific cutting forms for inclined cutting: excavation direction, horizontal or vertical rock mass joints
distance beyond the outline of the hole bottom: rock solidity coefficient
determining the parameters of medium hole and large diameter straight hole cutting: surrounding rock grade
determining the control height of the entrance slope and heading slope: surrounding rock grade and slope
rock drill noise regulation limits: rock drill type, rock drill weight
determining packing length: whether it is a light blast hole or pre-split blast hole, blast hole depth
determining the parameters of the tapered cut hole: rock solidity coefficient
determining the photoblast parameters based on data from the Ma'anshan mining research institute: rock quality, excavation location, excavation span

Although the embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions, and variants can be made to these embodiments without departing from the principles and spirit of the invention.

The scope of the present invention is defined by the appended claims and their equivalents, and all such changes, modifications, substitutions, and variants fall within the protection scope of the present invention.

Claims (6)

BL-5800 12 Claims LU505928BL-5800 12 Claims LU505928 1. À decision tree-based inference method for a tunnel full-section blasting plan, characterized in that it includes specific steps: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set: using programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module.1. A decision tree-based inference method for a tunnel full-section blasting plan, characterized in that it includes specific steps: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set: using programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module. 2. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific method of obtaining the blasting data set and preprocessing is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.2. 
The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific method of obtaining the blasting data set and preprocessing is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs. 3. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the3. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm include: i nn P,(L23...) D : obtaining the proportion £* of the data set and its k-th category samples first, and obtaining the expected information of the data set P as: Ly Ent(D)= > Pr log, (p,) k=1 where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set; thenC4.5 algorithm include: i nn P,(L23...) 
D: obtaining the proportion £* of the data set and its k-th category samples first, and obtaining the expected information of the data set P as: Ly Ent (D)= > Pr log, (p,) k=1 where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set; then BL-5800 13 LU505928 using the attributes à € fal,a2,a3...ax} obtained in the data set D to divide, obtaining x branch nodes, and calculating the information entropy Ent(D,) of different branches according to the information entropy formula; calculating the proportion weight of branches with a large number of samples according to an information gain formula, which is: v D* Gain(D,a)= Ent(D)-Y Eni(D) v=l where D" represents the set of all samples whose value is ax in attribute a, V represents the number of branches under attribute a, and V represents the v-th branch under attribute a, and then using the attributes with high information gain obtained from the information gain formula to make judgment and selection based on the information gain rate; wherein the information gain rate formula is: Gain ratio(D, a) = Gain(D,a) 2) a IV(a) V D" D” IV(a) = Do, where vel D D .BL-5800 13 LU505928 using the attributes à € fal,a2,a3...ax} obtained in the data set D to divide, obtaining x branch nodes, and calculating the information entropy Ent(D,) of different branches according to the information entropy formula; calculating the proportion weight of branches with a large number of samples according to an information gain formula, which is: v D* Gain(D,a)= Ent(D)-Y Eni(D) v=l where D" represents the set of all samples whose value is ax in attribute a, V represents the number of branches under attribute a, and V represents the v-th branch under attribute a, and then using the attributes with high information gain obtained from the information gain formula to make judgment and selection based on the information gain rate; in which the information gain rate formula is: 
Gain ratio(D, a) = Gain(D,a) 2) a IV(a) , where vel D D . 4. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of the node is zero, and using this node as a leaf node, and completing the construction; finally4. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of the node is zero, and using this node as a leaf node, and completing the construction; finally BL-5800 14 verifying the post-pruning processing of the decision tree. 0505528BL-5800 14 verifying the post-pruning processing of the decision tree. 0505528 5. 
The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of using a programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree; using programming language to establish an inference program module based on decision tree based on the above information; outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree.5. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of using a programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree; using programming language to establish an inference program module based on decision tree based on the above information; outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree. 6. 
The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module include: operating all the information of the current blasting plan sequentially by the inference program module according to the decision tree and inference program module, and outputting the information at each stage of the decision tree sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.6. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module include: operating all the information of the current blasting plan sequentially by the inference program module according to the decision tree and inference program module, and outputting the information at each stage of the decision tree sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained. BL-5800 1 LU505928 AnsprücheBL-5800 1 LU505928 Ansprüche 1. 
Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt, dadurch gekennzeichnet, dass es bestimmte Schritte umfasst: Erhalten eines Sprengdatensatzes und Durchführen einer Vorverarbeitung; Berechnen der erwarteten Informationen, des Informationsgewinns und der Informationsgewinnrate des Datensatzes basierend auf einem C4.5-Algorithmus; Erstellen und Uberprifen eines Entscheidungsbaums basierend auf den Berechnungsergebnissen des Datensatzes; Verwenden einer Programmiersprache, um ein Inferenzprogrammmodul des Entscheidungsbaums basierend auf den Knoteninformationen des erstellten Entscheidungsbaums zu entwickeln; Durchführen einer logischen Analyse und automatisierten Schlussfolgerungen für den Sprengplan, um einen endgültigen Inferenzplan basierend auf dem erstellten Entscheidungsbaum und dem Inferenzprogrammmodul zu erhalten.1. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt, dadurch gekennzeichnet, dass es bestimmte Schritte umfasst: Erhalten eines Sprengdatensatzes und Durchführen einer Vorverarbeitung; Berechnen der erwarteten Informationen, des Informationsgewinns und der Informationsgewinnrate des Datatensatzes base auf einem C4.5-Algorithmus; Erstellen und Uberprifen eines Entscheidungsbaums basierend auf den Berechnungsergebnissen des Datensatzes; Verwenden einer Programmiersprache, um ein Inferenzprogrammmodul des Entscheidungsbaums basierend auf den Knoteninformationen des erstellten Entscheidungsbaums zu entwickeln; Durchführen einer logischen Analyze und automatisierten Schlussfolgerungen für den Sprengplan, um einen endgültigen Inferenzplan basierend auf dem erstellten Entscheidungsbaum und dem Inferenzprogrammmodul zu erhalten. 2. 
Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt nach Anspruch 1, dadurch gekennzeichnet, dass die spezifische Methode zum Erhalten des Sprengdatensatzes und zur Vorverarbeitung darin besteht, die gesammelten Datensätze nach verschiedenen Attributen und Kategorien zu organisieren und das entsprechende Dateiformat entsprechend den tatsächlichen Anforderungen auszugeben.2. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt nach Anspruch 1, dassurch gekennzeichnet, dass die spezifische Methode zum Erhalten des Sprengdatensatzes und zur Verarbeitung darin besteht, die gesammelten Datensätze nach verschiedenen At Tributes and categories of organizers and speakers of DateiFormat are presented in the texts Anforderungen auszugeben. 3. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt nach Anspruch 1, dadurch gekennzeichnet, dass die spezifischen Schritte zur Berechnung der erwarteten Informationen, des Informationsgewinns und der Informationsgewinnrate des Datensatzes basierend auf dem C4.5-Algorithmus Folgendes umfassen: Ermitteln des Anteils p.(1,2,3...) des Datensatzes D und seiner Stichproben der k-ten Kategorie und Ermitteln der erwarteten Informationen des Datensatzes D als:3. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt nach Anspruch 1, dassurch gekennzeichnet, dass die spezifischen Schritte zur Berechnung der erwarteten Informationen, des Informationsgewinns und der Informationsgewinnnrate des Datensatzes base auf C4.5 -Algorithmus Folgendes umfassen: Hermitteln des Anteils p .(1,2,3...) 
of the Dates of D and within the Stichproben of the K-ten Category and Ermitteln of the Erwarteten Information of the Dates of D als: BL-5800 2 HN LU505928 Ent(D) = > Pr log, (p,) k=1 wobei Ert(D) ist die Informationsentropie, was bedeutet, dass die Reinheit des Datensatzes D umso höher ist, je kleiner die Informationsentropie ist; dann Verwenden der im Datensatz D erhaltenen Attribute 2 € 1.a2.a3,.ax} zum Teilen, Erhalten von x Zweigknoten und Berechnen der Informationsentropie Ent(D,) verschiedener Zweige gemäß der Informationsentropieformel; Berechnen des Anteilsgewichts von Zweigen mit einer großen Anzahl von Stichproben gemäß einer Informationsgewinnformel, die lautet: V ID” Gain(D,a)= Ent(D)- > mp) v=l wobei DD" die Menge aller Stichproben darstellt, deren Wert ax im Attribut a ist, V die Anzahl der Zweige unter Attribut a darstellt und ” den v-ten Zweig unter Attribut a darstellt, und dann Verwenden der Attribute mit hohem Informationsgewinn, die aus der Informationsgewinnformel ermittelt werden, um eine Beurteilung und Auswahl auf der Grundlage der Informationsgewinnrate vorzunehmen; wobei die Formel für die Informationsgewinnrate lautet: Gain ratio(D,a) = Cein(D.a) = 1V(a) LP V(a)=-> — og, — wobei vel D D .BL-5800 2 HN LU505928 Ent(D) = > Pr log, (p,) k=1 wobei Ert(D) ist die Informationsentropie, was bedeutet, dass die Reinheit des Datensatzes D umso höher ist, je kleiner die Informationentropie ist; dann Verwenden der im Datensatz D erhaltenen Attribute 2 € 1.a2.a3,.ax} zum Teilen, Erhalten von x Zweigknoten und Berechnen der Informationsentropie Ent(D,) verschiedener Zweige gemäß der Informationentropieformel; Berechnen des Anteilsgewichts von Zweigen mit einer großen Anzahl von Stichproben gemäß einer Informationsgewinnformel, die lautet: V ID” Gain(D,a)= Ent(D)- > mp) v=l wobei DD" die Menge aller Stichproben darstellt, deren Wert ax im Attribute a ist, V die Anzahl der Zweige unter Attribut a darstellt und “den v-ten Zweig unter Attribute a 
represents, and then using the attributes with high information gain determined from the information gain formula to make a judgment and selection on the basis of the information gain ratio; wherein the formula for the information gain ratio is:

$\mathrm{Gain\_ratio}(D,a) = \dfrac{\mathrm{Gain}(D,a)}{IV(a)}$, where $IV(a) = -\sum_{v=1}^{V} \dfrac{|D^v|}{|D|}\log_2\dfrac{|D^v|}{|D|}$.

4. The decision tree-based inference method for a full-section tunnel blasting plan according to claim 1, characterized in that the specific steps of constructing and verifying a decision tree on the basis of the calculation results for the data set comprise: first comparing the calculated information gain values of each attribute of the data set, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data subset under the classification according to the attribute values of the root node, and using the attribute with the largest information gain rate as the node of the next level; repeating the above steps until the information gain value of a node is zero, using that node as a leaf node and completing the construction; and finally verifying the decision tree by post-pruning.

5. The decision tree-based inference method for a full-section tunnel blasting plan according to claim 1, characterized in that the specific steps of using a programming language to develop the inference program module of the decision tree on the basis of the node information of the constructed decision tree comprise: determining the attribute value, the attribute name, the discrimination number and the child-node information of each node of the decision tree; using a programming language to build an inference program module based on the decision tree from the above information; and outputting the information of the decision tree according to the resulting inference program module and generating the current decision tree.

6. The decision tree-based inference method for a full-section tunnel blasting plan according to claim 1, characterized in that the specific steps of performing logical analysis and automated inference on the blasting plan to obtain a final inference plan on the basis of the constructed decision tree and the inference program module comprise: sequentially processing all information of the current blasting plan with the inference program module according to the decision tree, and outputting the information at each stage of the decision tree in turn for logical analysis and automated inference, until a blasting plan is obtained that meets the user's requirements.
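The gain-ratio criterion in claims 3 and 4 is the standard C4.5-style attribute-selection rule. A minimal sketch of how it could be computed in Python (the attribute columns and toy blasting-plan labels below are illustrative assumptions, not data from the patent):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy Ent(D) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr_index):
    """Gain_ratio(D, a) = Gain(D, a) / IV(a) for splitting the data set D
    on the attribute at position attr_index."""
    n = len(labels)
    # Partition the labels by the attribute's values (the subsets D^v).
    parts = {}
    for row, label in zip(rows, labels):
        parts.setdefault(row[attr_index], []).append(label)
    # Gain(D, a) = Ent(D) - sum_v (|D^v|/|D|) * Ent(D^v)
    gain = entropy(labels) - sum(len(p) / n * entropy(p) for p in parts.values())
    # IV(a) = -sum_v (|D^v|/|D|) * log2(|D^v|/|D|)
    iv = -sum(len(p) / n * math.log2(len(p) / n) for p in parts.values())
    return gain / iv if iv > 0 else 0.0

# Toy data set: two hypothetical attributes (rock grade, section size)
# and a binary blasting-plan label.
rows = [("III", "large"), ("III", "small"), ("IV", "large"), ("IV", "small")]
labels = ["plan_A", "plan_A", "plan_B", "plan_B"]

# Per claim 4, the attribute with the largest score becomes the root node.
best = max(range(2), key=lambda i: gain_ratio(rows, labels, i))
```

On this toy data the rock-grade attribute separates the two plans perfectly, so it would be chosen as the root node; the next level would then be selected by recomputing the gain ratio within each subset, repeating until a node's gain is zero (a leaf), as the claim describes.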
LU505928A 2022-04-29 2023-03-15 A decision tree-based inference method for a full-section tunnel blasting plan LU505928B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210467781.XA CN114841349A (en) 2022-04-29 2022-04-29 Inference method of tunnel full-section blasting scheme based on decision tree

Publications (2)

Publication Number Publication Date
LU505928A1 LU505928A1 (en) 2024-01-09
LU505928B1 true LU505928B1 (en) 2024-04-29

Family

ID=82568546

Family Applications (1)

Application Number Title Priority Date Filing Date
LU505928A LU505928B1 (en) 2022-04-29 2023-03-15 A decision tree-based inference method for a full-section tunnel blasting plan

Country Status (3)

Country Link
CN (1) CN114841349A (en)
LU (1) LU505928B1 (en)
WO (1) WO2023207387A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114841349A (en) * 2022-04-29 2022-08-02 中铁四局集团有限公司 Inference method of tunnel full-section blasting scheme based on decision tree
CN117592163B (en) * 2023-12-04 2024-04-16 南宁轨道交通建设有限公司 Auxiliary decision method for treating longitudinal differential settlement of shield tunnel
CN117973044B (en) * 2024-02-04 2024-06-14 中南大学 Tunnel intelligent blasting design method and laser positioning equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191771A (en) * 2021-03-19 2021-07-30 青岛檬豆网络科技有限公司 Buyer account period risk prediction method
CN114841349A (en) * 2022-04-29 2022-08-02 中铁四局集团有限公司 Inference method of tunnel full-section blasting scheme based on decision tree

Also Published As

Publication number Publication date
WO2023207387A1 (en) 2023-11-02
CN114841349A (en) 2022-08-02
LU505928A1 (en) 2024-01-09

Similar Documents

Publication Publication Date Title
LU505928B1 (en) A decision tree-based inference method for a full-section tunnel blasting plan
CN111639237B (en) Electric power communication network risk assessment system based on clustering and association rule mining
CN111625922B (en) Large-scale oil reservoir injection-production optimization method based on machine learning agent model
CN104573106A (en) Intelligent urban construction examining and approving method based on case-based reasoning technology
CN109635461A (en) A kind of application carrys out the method and system of automatic identification Grades of Surrounding Rock with brill parameter
CN107918830B (en) Power distribution network running state evaluation method based on big data technology
CN106066873A (en) A kind of travel information based on body recommends method
CN111365015A (en) Shield tunneling parameter feature extraction and attitude deviation prediction method based on XGboost
CN104615680B (en) The method for building up of web page quality model and device
CN109409647A (en) A kind of analysis method of the salary level influence factor based on random forests algorithm
CN103984788A (en) Automatic intelligent design and optimization system for anchor bolt support of coal tunnel
CN105868887A (en) Building comprehensive energy efficiency analysis method based on subentry measure
CN106227828A (en) A kind of isomorphism hierarchical data contrast visual analysis methods and applications
CN115796702A (en) Evaluation method and system for ecological restoration effect of comprehensive treatment of red soil land
CN115481577A (en) Automatic oil reservoir history fitting method based on random forest and genetic algorithm
CN113656868A (en) BIM technology-based hospital construction collaborative management platform
CN108763164A (en) Evaluation method for coal and gas outburst inversion similarity
CN115439012A (en) GIS-based method for carrying out detailed evaluation on suitability of regional town construction in county level of arid oasis
CN106599511A (en) Method for optimally selecting thin-coal-seam long-wall fully-mechanized mining-face coal mining method
CN106096733A (en) A kind of water conservancy big data, services A+E model
Tan Carbon Emission Prediction with Macroeconomic Variables and Machine Learning
Pham et al. Assessment of plant species for the roadside at Vung Tau city of Vietnam using multi-criteria analysis
He et al. A study on evaluation of farmland fertility levels based on optimization of the decision tree algorithm
Yang et al. The visual simulation technology in formatting forest management plan at unit level based on WF
CN118133104A (en) Rapid identification method for lithofacies of deep sea-phase shale gas well

Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20240429