LU505928A1 - A decision tree-based inference method for a tunnel full-section blasting plan


Info

Publication number
LU505928A1
Authority
LU
Luxembourg
Prior art keywords
decision tree
information
inference
Prior art date
Application number
LU505928A
Other languages
English (en)
Other versions
LU505928B1 (fr)
Inventor
Xianming Lin
Yong Wang
Zhichao Xu
Wenyin Chen
Cheng Yu
Zhongjie Yang
Xifei Deng
Xinghuo Xu
Gaofeng Zhao
Original Assignee
The Seventh Eng Co Ltd Of China Tiesiju Civil Eng Group
Univ Tianjin
China Tiesiju Civil Eng Group Co Ltd
Application filed by The Seventh Eng Co Ltd Of China Tiesiju Civil Eng Group, Univ Tianjin, China Tiesiju Civil Eng Group Co Ltd filed Critical The Seventh Eng Co Ltd Of China Tiesiju Civil Eng Group
Publication of LU505928A1
Application granted
Publication of LU505928B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/04 Inference or reasoning models
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F18/24 Classification techniques
    • G06F18/243 Classification techniques relating to the number of classes
    • G06F18/24323 Tree-organised classifiers
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Devices For Executing Special Programs (AREA)
  • Monitoring And Testing Of Transmission In General (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a decision tree-based inference method for a tunnel full-section blasting plan. The specific steps include obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module. The present invention constructs a decision tree based on the C4.5 algorithm by obtaining a data set, and after constructing the decision tree and inference program modules, it performs logical analysis and automated inference, thereby obtaining the ideal blasting plan required by the user.

Description

BL-5800 1
LU505928
A DECISION TREE-BASED INFERENCE METHOD FOR A TUNNEL FULL-SECTION BLASTING PLAN
Technical field
The invention relates to the technical field of tunnel blasting plans, and in particular to a decision tree-based inference method for a tunnel full-section blasting plan.
Background
In recent years, with the rapid development of infrastructure construction industries such as railways, highways and urban rail transit at home and abroad, the tasks of tunnel excavation and construction have become increasingly arduous. At present, in many tunnel excavation projects, the drill and blast method plays a very important role. Therefore, the blasting design work of drill and blast construction has become an important factor restricting its development.
The shortcoming of the existing technology is that there are some prominent problems in the design of tunnel blasting plans, such as: blasting plans are designed based only on experience, and there are relatively large errors due to different experience levels of designers. For some novices, there is a lack of auxiliary tools for blasting plan design. The calculation workload of blasting plan design is very large, resulting in low design speed and efficiency. Therefore, there is an urgent need for a convenient and efficient automated blasting plan inference solution. Decision trees based on expert systems can solve this problem very well. As an intelligent computer program system, the expert system uses artificial intelligence and computer technology to conduct logical analysis and judgment on actual production problems in a specific field based on the theoretical knowledge or production experience provided by one or more experts in a certain field, to solve problems that only human experts can solve.
Summary of invention
The purpose of the present invention is to overcome the shortcomings of the existing technology and adopt a decision tree-based inference method for a tunnel full-section blasting plan.
A decision tree-based inference method for a tunnel full-section blasting plan, including specific steps: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module.
As a further solution of the present invention, the specific method of obtaining and preprocessing the blasting data set is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
As a further solution of the present invention, the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm include: obtaining the proportion p_k (k = 1, 2, 3, ...) of the k-th category of samples in the data set D first, and obtaining the expected information of the data set D as:

Ent(D) = -Σ_{k=1}^{|Y|} p_k log2(p_k)

where Ent(D) is the information entropy; the smaller the information entropy, the higher the purity of the data set D; then using the attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D to divide, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of the different branches according to the information entropy formula; calculating the proportion weight of branches with a large number of samples according to an information gain formula. The information gain formula is:

Gain(D, a) = Ent(D) - Σ_{v=1}^{V} (|D^v| / |D|) Ent(D^v)

where D^v represents the set of all samples that take the value a_v on attribute a, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a; then using the attributes with high information gain obtained from the information gain formula to make a judgment and selection based on the information gain rate.

The information gain rate formula is:

Gain_ratio(D, a) = Gain(D, a) / IV(a)

where IV(a) = -Σ_{v=1}^{V} (|D^v| / |D|) log2(|D^v| / |D|).
As a further solution of the present invention, the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of the node is zero, and using this node as a leaf node, and completing the construction; finally verifying the post-pruning processing of the decision tree.
As a further solution of the present invention, the specific steps of using programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree;
using a programming language to establish a decision tree-based inference program module based on the above information; outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree.
As a further solution of the present invention, the specific steps of performing logical analysis and automated inference on the blasting plan to obtain the final inference plan based on the constructed decision tree and inference program module include: operating all the information of the current blasting plan sequentially by the inference program module according to the decision tree and inference program module, and outputting the information at each stage of the decision tree sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.
Compared with the prior art, the present invention has the following technical effects:
By using the above technical solution, a decision tree is constructed by calculating the expected information, information gain, and information gain rate of the data set, and is then verified. The node information of the decision tree is then used to develop the inference program module, so as to conduct logical analysis and reasoning on actual production problems and obtain the optimal solution based on the knowledge of experts.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of the steps of the inference method of the tunnel full-section blasting plan according to some embodiments of the present application;
Fig. 2 is a flow chart of decision tree generation according to some embodiments of the present application.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, rather than all the embodiments.
Referring to Figs. 1 and 2, in the embodiment of the present invention, a decision tree-based inference method for a tunnel full-section blasting plan is provided, which includes the following specific steps:
S1. obtaining a blasting data set and performing preprocessing;
The specific method is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
S2. calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm;
In this embodiment, the calculation is explained through the data set of the relationship between weather and sport in Table 1 below:
Table 1 Dataset of the relationship between weather and sport

days        weather        temperature    humidity       windy          sport
(data set)  (attribute 1)  (attribute 2)  (attribute 3)  (attribute 4)  (category)
1           sunny          hot            high           no             no
2           sunny          hot            high           yes            no
3           cloudy         hot            high           no             yes
4           rainy          warm           high           no             yes
5           rainy          cold           normal         no             yes
6           rainy          cold           normal         yes            no
7           cloudy         cold           normal         yes            yes
8           sunny          warm           high           no             no
9           sunny          cold           normal         no             yes
10          rainy          cold           normal         no             yes
11          sunny          warm           normal         yes            yes
12          cloudy         warm           high           yes            yes
13          cloudy         hot            normal         no             yes
14          rainy          warm           high           yes            no
As shown in Table 1 above, in the relationship between weather and sport, the number of days is the data set, while weather, temperature, humidity and windiness are the individual attributes, and whether sport is played is the final conclusion. Conclusions are divided into two categories based on yes or no, and the conclusions are determined by the preceding attributes. The decision tree reaches the final conclusion through successive decisions on the preceding attribute values. But the question that needs to be considered is which attribute is the most effective to place at the beginning of the decision. This is confirmed through the following steps.
Specific steps include:

calculating the information gain: information gain is an attribute selection metric and is also the most commonly used index for measuring data sets; obtaining the proportion p_k (k = 1, 2, 3, ...) of the k-th category of samples in the data set D first, and obtaining the expected information of the data set D as:

Ent(D) = -Σ_{k=1}^{|Y|} p_k log2(p_k)

where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set D; then using the attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D to divide, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of the different branches according to the information entropy formula.
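To make the entropy formula concrete, here is a minimal Python sketch (illustrative only; the function name is ours, and Python stands in for whatever language an implementation would use) that computes Ent(D) for the category column of the Table 1 data set:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Ent(D) = -sum_k p_k * log2(p_k) over the category proportions."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# "sport" column of Table 1: 9 "yes" days and 5 "no" days
sport = ["no", "no", "yes", "yes", "yes", "no", "yes",
         "no", "yes", "yes", "yes", "yes", "yes", "no"]
print(round(entropy(sport), 3))  # 0.94
```

A pure data set (all samples in one category) gives an entropy of 0, matching the statement that lower entropy means higher purity.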
Specifically, if attribute a is set to have x different values a1, a2, a3, ..., ax, and attribute a is used to divide the data set D, x branch nodes will be generated, and each branch node is a collection of samples with the same value. According to the information entropy formula, the information entropy Ent(D,) of different branches can be calculated separately. Because the number of samples contained in each branch is different, different weights need to be assigned so that the branch with a larger number of samples has a matching weight. The information gain formula can well reflect this feature.
The proportion weight of branches with a large number of samples is calculated according to an information gain formula. The information gain formula is:

Gain(D, a) = Ent(D) - Σ_{v=1}^{V} (|D^v| / |D|) Ent(D^v)

where D^v represents the set of all samples that take the value a_v on attribute a, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a.
Generally, the greater the information gain, the better the effect of using this attribute as the node splitting attribute. However, using information gain alone as the attribute selection metric has the disadvantage of favoring attributes with a large number of values during the decision tree branching process. That is, the more distinct values an attribute has, the more likely it is to be chosen as a splitting attribute.
Because of this shortcoming of using information gain alone as the attribute selection metric, the information gain rate can be used instead as the criterion for judging splitting attributes.
Then use the attributes with high information gain obtained from the information gain formula to make judgment and selection based on the information gain rate;
The information gain rate formula is:

Gain_ratio(D, a) = Gain(D, a) / IV(a)

where IV(a) = -Σ_{v=1}^{V} (|D^v| / |D|) log2(|D^v| / |D|).
It should be noted that the information gain rate is biased towards attributes with a small number of values when judging and analyzing attributes. Therefore, the C4.5 algorithm does not directly select the information gain rate as the attribute selection metric. Instead, it first selects attributes with higher information gain among the attributes to be divided, and then makes a judgment selection based on the information gain rate among the selected attributes.
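This two-stage C4.5 selection rule can be sketched in Python as follows (an illustrative sketch, not the patent's implementation; the data set is Table 1 and the function names are ours):

```python
from collections import Counter
from math import log2

# Table 1 rows: (weather, temperature, humidity, windy, sport)
DATA = [
    ("sunny", "hot", "high", "no", "no"),     ("sunny", "hot", "high", "yes", "no"),
    ("cloudy", "hot", "high", "no", "yes"),   ("rainy", "warm", "high", "no", "yes"),
    ("rainy", "cold", "normal", "no", "yes"), ("rainy", "cold", "normal", "yes", "no"),
    ("cloudy", "cold", "normal", "yes", "yes"), ("sunny", "warm", "high", "no", "no"),
    ("sunny", "cold", "normal", "no", "yes"), ("rainy", "cold", "normal", "no", "yes"),
    ("sunny", "warm", "normal", "yes", "yes"), ("cloudy", "warm", "high", "yes", "yes"),
    ("cloudy", "hot", "normal", "no", "yes"), ("rainy", "warm", "high", "yes", "no"),
]
ATTRS = {"weather": 0, "temperature": 1, "humidity": 2, "windy": 3}

def entropy(rows):
    """Ent(D) over the category (last column) proportions."""
    n = len(rows)
    return -sum((c / n) * log2(c / n) for c in Counter(r[-1] for r in rows).values())

def gain(rows, i):
    """Gain(D, a) = Ent(D) - sum_v |D^v|/|D| * Ent(D^v) for attribute column i."""
    n = len(rows)
    return entropy(rows) - sum(
        (c / n) * entropy([r for r in rows if r[i] == v])
        for v, c in Counter(r[i] for r in rows).items())

def gain_ratio(rows, i):
    """Gain_ratio(D, a) = Gain(D, a) / IV(a)."""
    n = len(rows)
    iv = -sum((c / n) * log2(c / n) for c in Counter(r[i] for r in rows).values())
    return gain(rows, i) / iv

# C4.5's two-stage rule: keep attributes with above-average information gain,
# then choose the one with the largest gain ratio among them.
gains = {a: gain(DATA, i) for a, i in ATTRS.items()}
average = sum(gains.values()) / len(gains)
candidates = [a for a in ATTRS if gains[a] >= average]
best = max(candidates, key=lambda a: gain_ratio(DATA, ATTRS[a]))
print(best, round(gains[best], 3))  # weather 0.247
```

On the Table 1 data only "weather" and "humidity" have above-average gain, and "weather" wins on gain ratio, so it would become the root node.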
S3. constructing and verifying a decision tree based on the calculation results of the data set. The specific steps include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node;
repeating the above steps until the information gain value of the node is zero, that is, there is only one category, and using this node as a leaf node, and completing the construction; finally verifying the post-pruning processing of the decision tree.
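The recursive construction described in S3 can be sketched as follows. This is a simplified illustration, not the patent's program: for brevity it splits on maximum information gain at every level (whereas the text combines gain and gain rate) and omits post-pruning.

```python
from collections import Counter
from math import log2

# Table 1 rows: (weather, temperature, humidity, windy, sport)
DATA = [
    ("sunny", "hot", "high", "no", "no"),     ("sunny", "hot", "high", "yes", "no"),
    ("cloudy", "hot", "high", "no", "yes"),   ("rainy", "warm", "high", "no", "yes"),
    ("rainy", "cold", "normal", "no", "yes"), ("rainy", "cold", "normal", "yes", "no"),
    ("cloudy", "cold", "normal", "yes", "yes"), ("sunny", "warm", "high", "no", "no"),
    ("sunny", "cold", "normal", "no", "yes"), ("rainy", "cold", "normal", "no", "yes"),
    ("sunny", "warm", "normal", "yes", "yes"), ("cloudy", "warm", "high", "yes", "yes"),
    ("cloudy", "hot", "normal", "no", "yes"), ("rainy", "warm", "high", "yes", "no"),
]
ATTRS = {"weather": 0, "temperature": 1, "humidity": 2, "windy": 3}

def entropy(rows):
    n = len(rows)
    return -sum((c / n) * log2(c / n) for c in Counter(r[-1] for r in rows).values())

def gain(rows, i):
    n = len(rows)
    return entropy(rows) - sum(
        (c / n) * entropy([r for r in rows if r[i] == v])
        for v, c in Counter(r[i] for r in rows).items())

def build(rows, attrs):
    """Split recursively; stop when a node holds a single category (gain is zero)."""
    labels = [r[-1] for r in rows]
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]   # leaf node
    name, i = max(attrs.items(), key=lambda kv: gain(rows, kv[1]))
    rest = {k: v for k, v in attrs.items() if k != name}
    return {name: {v: build([r for r in rows if r[i] == v], rest)
                   for v in sorted({r[i] for r in rows})}}

tree = build(DATA, ATTRS)
print(next(iter(tree)))  # weather
```

For the Table 1 data this yields "weather" as the root; the "cloudy" branch is immediately a pure "yes" leaf, while "sunny" and "rainy" split further on humidity and windiness respectively.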
S4. using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree. The specific steps include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree; using a programming language to establish a decision tree-based inference program module based on the above information.
In specific implementations, the programming modules can be written in C, C++ or other languages.
The information of the decision tree is output according to the obtained inference program module. The output can be text in TXT format, and finally the current decision tree is generated.
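A hedged sketch of the node records used by such an inference program module, with the TXT-style dump described above. The field names and the tree fragment are assumptions for illustration, and Python stands in for the C/C++ mentioned in the text:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    number: int                 # discrimination number of the node
    attribute: str              # attribute name asked at this node ("" for leaves)
    value: str                  # attribute value on the incoming branch
    children: list = field(default_factory=list)
    conclusion: str = ""        # non-empty only for leaf nodes

def dump(node, indent=0, lines=None):
    """Write the decision tree as plain text, one node per line (TXT output)."""
    lines = [] if lines is None else lines
    label = node.conclusion if node.conclusion else node.attribute
    lines.append("  " * indent + f"[{node.number}] {node.value or '-'} -> {label}")
    for child in node.children:
        dump(child, indent + 1, lines)
    return lines

# Hypothetical fragment of the inclined-cutting decision tree
leaf = Node(3, "", "yes", conclusion="vertical wedge-shaped cutting")
mid = Node(2, "horizontally jointed?", "no", children=[leaf])
root = Node(1, "tunneled downwards?", "", children=[mid])
print("\n".join(dump(root)))
```

The dump both serves as the module's text output and regenerates the current decision tree for inspection.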
S5. performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module. The specific steps include:
According to the decision tree and inference program module, all the information of the current blasting plan is sequentially operated by the inference program module, and the information at each stage of the decision tree is sequentially output for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.
Automated inference for blasting plans:
In this embodiment, the selection of the specific cutting form for inclined cutting is taken as an example. In an actual project, if it is known that the tunnel is being tunneled horizontally, that the rock mass has horizontal joints, and that the specific cutting form of the inclined cutting needs to be determined, then only this decision tree needs to be selected. For example, in the first attribute "Whether the rock mass is tunneled downwards", select "No" and click Next; then in the second attribute "Whether the rock mass is horizontally jointed", select "Yes" and click Next. The inference conclusion can then be drawn: adopt the form of vertical wedge-shaped cutting. From the decision tree representation diagram below, the attribute selection branches and results of each step can be seen intuitively.
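The stage-by-stage inference described above can be sketched as a simple tree walk. Only the stated path (not tunneled downwards, horizontally jointed, vertical wedge-shaped cutting) comes from the text; the other branch conclusions are invented placeholders, marked illustrative:

```python
# A hypothetical dict-based decision tree for the inclined-cutting example
TREE = {"attribute": "Whether the rock mass is tunneled downwards",
        "branches": {
            "yes": {"conclusion": "single vertical wedge-shaped cutting"},   # illustrative
            "no": {"attribute": "Whether the rock mass is horizontally jointed",
                   "branches": {
                       "yes": {"conclusion": "vertical wedge-shaped cutting"},
                       "no": {"conclusion": "horizontal wedge-shaped cutting"},  # illustrative
                   }}}}

def infer(tree, answers):
    """Walk the tree, outputting each stage, until a conclusion node is reached."""
    node = tree
    while "conclusion" not in node:
        question = node["attribute"]
        answer = answers[question]
        print(f"{question}: {answer}")   # stage output for logical analysis
        node = node["branches"][answer]
    return node["conclusion"]

plan = infer(TREE, {
    "Whether the rock mass is tunneled downwards": "no",
    "Whether the rock mass is horizontally jointed": "yes",
})
print(plan)  # vertical wedge-shaped cutting
```

The same walk applies to any of the twelve decision trees: the inference module asks each node's attribute in turn until a leaf yields the blasting plan parameter.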
The final decision trees are formed, as shown in Table 2 below. A total of 12 decision trees are generated in the program. The left column indicates the functions that each decision tree can achieve, and the right column indicates the attributes that need to be referenced. Respectively, they are:
The first decision tree is "determining tunnel excavation method". Based on the surrounding rock, tunnel cross-sectional area, tunnel length, and surrounding rock grade, the basic method of tunnel excavation can be determined.
The second decision tree is "determining the form of wedge-shaped cut based on cyclic footage". On the premise of determining the wedge-shaped cut, the number of wedge-shaped cuts can be determined based on lithology and cyclic footage.
The third decision tree is "determining the blast hole depth based on the air leg rock drill". On the premise of using the air leg rock drill for excavation, the blast hole depth range can be determined based on the rock solidity coefficient and the excavation cross-sectional area.
The fourth decision tree is “determining the parameters of wedge cut based on the rock solidity coefficient.” On the premise of determining the wedge cut, the number of blast holes in the wedge cut, the angle between the blast holes and the working surface, and the distance between the two rows of blast holes can be determined based on the rock solidity coefficient.
The fifth decision tree is "selecting specific cut forms for inclined cut". The specific cut form can be determined based on the determination of using inclined cut.
The sixth decision tree "determining the distance beyond the contour line of the hole bottom based on the rock solidity coefficient in light explosion" can query the distance beyond the contour line of the hole bottom based on the rock solidity coefficient.
The seventh decision tree "determining the parameters of hollow large-diameter straight hole cutting based on the surrounding rock level" can give specific parameters about the hollow hole based on the surrounding rock level based on the premise of straight hole cutting.
The eighth decision tree "determining the control height of the entrance slope and heading slope based on the surrounding rock grade" can determine the specific control height based on the surrounding rock grade and slope.
The ninth decision tree "determining noise specified limit" can provide a query about the maximum noise limit based on the rock drill type and quality.
The tenth decision tree "determining the filling length" can deduce the filling length of blast mud in the blast hole.
The eleventh decision tree "determining the parameters of the tapered cut hole based on the rock solidity coefficient". On the premise of determining the use of tapered cutouts, the appropriate inclination angle and hole bottom spacing of the blast hole for the cutout hole can be obtained based on the rock solidity coefficient.
The twelfth decision tree "determining the photoblast parameters based on Ma'anshan Mining Research Institute" can determine the specific data of photoblast holes based on the span of the rock mass.
Table 2 Summary of Decision Trees

Function of the decision tree                                  Referenced attributes
determining the tunnel excavation method                       surrounding rock, tunnel cross-sectional area, tunnel length, surrounding rock grade
determining the form of wedge-shaped cutting                   lithology, cyclic footage
determining the blast hole depth based on air leg rock drill   rock solidity coefficient, excavation cross-sectional area
parameters of the wedge-shaped cutting                         rock solidity coefficient
selection of specific cutting forms for inclined cutting       excavation direction, horizontal or vertical rock mass joints
distance beyond the outline of the hole bottom                 rock solidity coefficient
determining the parameters of medium hole and large            surrounding rock grade
diameter straight hole cutting
determining the control height of the entrance slope and       surrounding rock grade and slope
heading slope
rock drill noise regulation limits                             rock drill type, rock drill weight
determining packing length                                     whether it is light blast hole, pre-split blast hole, blast hole depth
determining the parameters of the tapered cut hole             rock solidity coefficient
determining the photoblast parameters based on data from       rock quality, excavation location, excavation span
Ma'anshan mining research institute
Although the embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions and variants can be made to these embodiments without departing from the principles and spirit of the invention.
The scope of the present invention is defined by the appended claims; the claims and their equivalents shall all fall within the protection scope of the present invention.

Claims (6)

Claims
1. A decision tree-based inference method for a tunnel full-section blasting plan, characterized in that it includes specific steps: obtaining a blasting data set and performing preprocessing; calculating the expected information, information gain, and information gain rate of the data set based on a C4.5 algorithm; constructing and verifying a decision tree based on the calculation results of the data set; using a programming language to develop an inference program module of the decision tree based on the node information of the constructed decision tree; performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module.
2. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific method of obtaining the blasting data set and preprocessing is organizing the collected data sets according to different attributes and categories, and outputting the corresponding file format according to actual needs.
3. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of calculating the expected information, information gain, and information gain rate of the data set based on the C4.5 algorithm include: obtaining the proportion p_k (k = 1, 2, 3, ...) of the k-th category of samples in the data set D first, and obtaining the expected information of the data set D as:

Ent(D) = -Σ_{k=1}^{|Y|} p_k log2(p_k)

where Ent(D) is the information entropy, which means that the smaller the information entropy, the higher the purity of the data set; then using the attributes a ∈ {a1, a2, a3, ..., ax} obtained in the data set D to divide, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of the different branches according to the information entropy formula; calculating the proportion weight of branches with a large number of samples according to an information gain formula, which is:

Gain(D, a) = Ent(D) - Σ_{v=1}^{V} (|D^v| / |D|) Ent(D^v)

where D^v represents the set of all samples that take the value a_v on attribute a, V represents the number of branches under attribute a, and v represents the v-th branch under attribute a; and then using the attributes with high information gain obtained from the information gain formula to make a judgment and selection based on the information gain rate; wherein the information gain rate formula is:

Gain_ratio(D, a) = Gain(D, a) / IV(a)

where IV(a) = -Σ_{v=1}^{V} (|D^v| / |D|) log2(|D^v| / |D|).
4. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of constructing and verifying a decision tree based on the calculation results of the data set include: comparing the calculated information gain values of each attribute of the data set first, and using the attribute with the largest information gain value as the root node; then calculating the information gain rate of each data set under the classification according to the attribute value of the root node, and using the attribute with the largest information gain rate as the next-level node; repeating the above steps until the information gain value of the node is zero, and using this node as a leaf node, and completing the construction; finally
verifying the post-pruning processing of the decision tree.
5. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of using a programming language to develop the inference program module of the decision tree based on the node information of the constructed decision tree include: obtaining the attribute value, attribute name, discrimination number, and child node information of each node of the decision tree; using a programming language to establish a decision tree-based inference program module based on the above information; outputting the information of the decision tree according to the obtained inference program module and generating the current decision tree.
6. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of performing logical analysis and automated inference on the blasting plan to obtain a final inference plan based on the constructed decision tree and inference program module include: operating all the information of the current blasting plan sequentially by the inference program module according to the decision tree and inference program module, and outputting the information at each stage of the decision tree sequentially for logical analysis and automated inference until a blasting plan that meets the user's requirements is obtained.
BL-5800 1 LU505928 Ansprüche
1. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt, dadurch gekennzeichnet, dass es bestimmte Schritte umfasst: Erhalten eines Sprengdatensatzes und Durchführen einer Vorverarbeitung; Berechnen der erwarteten Informationen, des Informationsgewinns und der Informationsgewinnrate des Datensatzes basierend auf einem C4.5-Algorithmus; Erstellen und Uberprifen eines Entscheidungsbaums basierend auf den Berechnungsergebnissen des Datensatzes; Verwenden einer Programmiersprache, um ein Inferenzprogrammmodul des Entscheidungsbaums basierend auf den Knoteninformationen des erstellten Entscheidungsbaums zu entwickeln; Durchführen einer logischen Analyse und automatisierten Schlussfolgerungen für den Sprengplan, um einen endgültigen Inferenzplan basierend auf dem erstellten Entscheidungsbaum und dem Inferenzprogrammmodul zu erhalten.
2. Entscheidungsbaumbasiertes Inferenzverfahren für einen Sprengplan für den gesamten Tunnelabschnitt nach Anspruch 1, dadurch gekennzeichnet, dass die spezifische Methode zum Erhalten des Sprengdatensatzes und zur Vorverarbeitung darin besteht, die gesammelten Datensätze nach verschiedenen Attributen und Kategorien zu organisieren und das entsprechende Dateiformat entsprechend den tatsächlichen Anforderungen auszugeben.
3. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of calculating the expected information, the information gain and the information gain ratio of the dataset based on the C4.5 algorithm comprise: determining the proportion p_k (k = 1, 2, 3, …) of the samples of the k-th category in dataset D, and obtaining the expected information of dataset D as

Ent(D) = -Σ(k=1 to |y|) p_k · log2(p_k)

where Ent(D) is the information entropy, meaning that the smaller the information entropy, the higher the purity of dataset D; then using an attribute a ∈ {a1, a2, a3, …, ax} obtained in dataset D for splitting, obtaining x branch nodes, and calculating the information entropy Ent(D^v) of the different branches according to the information entropy formula; calculating the proportional weight of branches with a large number of samples according to the information gain formula, which is

Gain(D, a) = Ent(D) - Σ(v=1 to V) (|D^v| / |D|) · Ent(D^v)

where D^v denotes the set of all samples whose value on attribute a is a^v, V denotes the number of branches under attribute a, and v denotes the v-th branch under attribute a; and then using the attributes with high information gain determined from the information gain formula to make a judgment and selection on the basis of the information gain ratio, the formula for the information gain ratio being

Gain_ratio(D, a) = Gain(D, a) / IV(a), where IV(a) = -Σ(v=1 to V) (|D^v| / |D|) · log2(|D^v| / |D|).
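The entropy, information gain and gain ratio quantities of the C4.5 algorithm referred to above can be sketched in Python as follows; the attribute names and the toy dataset are illustrative only, not taken from the patent's blasting data:

```python
import math
from collections import Counter

def entropy(labels):
    """Ent(D) = -sum_k p_k * log2(p_k) over the class proportions p_k."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Gain(D, a) = Ent(D) - sum_v (|D^v| / |D|) * Ent(D^v)."""
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(y)
    remainder = sum(len(sub) / n * entropy(sub) for sub in by_value.values())
    return entropy(labels) - remainder

def gain_ratio(rows, labels, attr):
    """Gain_ratio(D, a) = Gain(D, a) / IV(a); IV penalises many-valued attributes."""
    n = len(labels)
    counts = Counter(row[attr] for row in rows)
    iv = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return info_gain(rows, labels, attr) / iv if iv > 0 else 0.0

# Toy example with a single hypothetical attribute:
rows = [{"rock": "hard"}, {"rock": "hard"}, {"rock": "soft"}, {"rock": "soft"}]
labels = ["A", "A", "B", "B"]
g = info_gain(rows, labels, "rock")
```

Here "rock" perfectly separates the two classes, so both the information gain and the gain ratio evaluate to 1 bit.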
4. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of constructing and checking a decision tree based on the calculation results of the dataset comprise: first comparing the calculated information gain values of each attribute of the dataset and using the attribute with the largest information gain value as the root node; then calculating the information gain ratio of each dataset under the classification according to the attribute value of the root node and using the attribute with the largest information gain ratio as the node of the next level; repeating the above steps until the information gain value of a node is zero, using that node as a leaf node and completing the construction; and finally checking the decision tree by post-pruning.
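The construction procedure of claim 4 (information gain at the root, gain ratio at deeper levels, leaf when the gain reaches zero) can be sketched as a recursive Python function; the helper functions and the toy dataset are assumptions for illustration, and post-pruning is omitted:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr], []).append(y)
    return entropy(labels) - sum(len(s) / n * entropy(s) for s in by_value.values())

def gain_ratio(rows, labels, attr):
    n = len(labels)
    counts = Counter(row[attr] for row in rows)
    iv = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return info_gain(rows, labels, attr) / iv if iv > 0 else 0.0

def build_tree(rows, labels, attrs, use_ratio=False):
    # Pure node, or no attributes left: emit a leaf with the majority class.
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    score = gain_ratio if use_ratio else info_gain
    best = max(attrs, key=lambda a: score(rows, labels, a))
    # Zero information gain at the best split: stop here with a leaf node.
    if info_gain(rows, labels, best) == 0:
        return Counter(labels).most_common(1)[0][0]
    node = {"attr": best, "children": {}}
    for v in {row[best] for row in rows}:
        sub = [(r, y) for r, y in zip(rows, labels) if r[best] == v]
        srows = [r for r, _ in sub]
        slabels = [y for _, y in sub]
        # Deeper levels select by gain ratio, as in claim 4.
        node["children"][v] = build_tree(srows, slabels,
                                         [a for a in attrs if a != best],
                                         use_ratio=True)
    return node

# Toy dataset (attribute names purely illustrative):
rows = [{"rock": "hard", "water": "dry"},
        {"rock": "hard", "water": "wet"},
        {"rock": "soft", "water": "dry"},
        {"rock": "soft", "water": "wet"}]
labels = ["A", "A", "B", "B"]
tree = build_tree(rows, labels, ["rock", "water"])
```

On this toy data "rock" has the larger information gain and becomes the root, with pure leaves beneath it.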
5. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of using a programming language to develop the inference program module of the decision tree on the basis of the node information of the constructed decision tree comprise: determining the attribute value, the attribute name, the discrimination number and the child-node information of each node of the decision tree; using a programming language to create a decision tree-based inference program module on the basis of the above information; and outputting the information of the decision tree according to the determined inference program module and generating the current decision tree.
6. The decision tree-based inference method for a tunnel full-section blasting plan according to claim 1, characterized in that the specific steps of performing logical analysis and automated inference on the blasting plan to obtain a final inference plan on the basis of the constructed decision tree and the inference program module comprise: processing all information of the current blasting plan sequentially by the inference program module according to the decision tree and the inference program module, and outputting the information at each stage of the decision tree in turn for logical analysis and automated inference until a blasting plan is obtained that meets the user's requirements.
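The stage-by-stage inference of claims 5 and 6 amounts to walking the tree with the current plan's attribute values and recording each decision stage. The sketch below assumes a node representation of dicts with "attr" and "children" keys; that representation, the attribute names and the plan labels are illustrative assumptions, not the patent's data model:

```python
def infer(tree, plan):
    """Traverse a decision tree (dict nodes, string leaves) with the current
    blasting-plan attributes, logging the attribute checked at each stage
    until a leaf (the recommended plan) is reached."""
    node, trace = tree, []
    while isinstance(node, dict):
        attr = node["attr"]
        trace.append((attr, plan[attr]))  # stage output for logical analysis
        node = node["children"][plan[attr]]
    return node, trace

# Hypothetical mini-tree; attribute names and plan labels are made up.
tree = {"attr": "rock",
        "children": {"hard": {"attr": "water",
                              "children": {"dry": "plan1", "wet": "plan2"}},
                     "soft": "plan3"}}
result, trace = infer(tree, {"rock": "hard", "water": "wet"})
```

The returned trace gives the per-stage output described in claim 6, and the leaf value is the final inferred plan.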
LU505928A 2022-04-29 2023-03-15 A decision tree-based inference method for a tunnel full-section blasting plan LU505928B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210467781.XA CN114841349A (zh) 2022-04-29 2022-04-29 一种基于决策树的隧道全断面爆破方案的推理方法

Publications (2)

Publication Number Publication Date
LU505928A1 true LU505928A1 (fr) 2024-01-09
LU505928B1 LU505928B1 (fr) 2024-04-29

Family

ID=82568546

Family Applications (1)

Application Number Title Priority Date Filing Date
LU505928A LU505928B1 (fr) 2022-04-29 2023-03-15 A decision tree-based inference method for a tunnel full-section blasting plan

Country Status (3)

Country Link
CN (1) CN114841349A (fr)
LU (1) LU505928B1 (fr)
WO (1) WO2023207387A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114841349A (zh) * 2022-04-29 2022-08-02 中铁四局集团有限公司 一种基于决策树的隧道全断面爆破方案的推理方法
CN117592163B (zh) * 2023-12-04 2024-04-16 南宁轨道交通建设有限公司 盾构隧道纵向差异沉降治理的辅助决策方法
CN117973044B (zh) * 2024-02-04 2024-06-14 中南大学 一种隧道智能爆破设计方法及激光定位设备

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191771A (zh) * 2021-03-19 2021-07-30 青岛檬豆网络科技有限公司 采购商账期风险预测方法
CN114841349A (zh) * 2022-04-29 2022-08-02 中铁四局集团有限公司 一种基于决策树的隧道全断面爆破方案的推理方法

Also Published As

Publication number Publication date
LU505928B1 (fr) 2024-04-29
WO2023207387A1 (fr) 2023-11-02
CN114841349A (zh) 2022-08-02


Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20240429