CN104506340A - Creation method of decision tree in industrial Ethernet fault diagnosis method - Google Patents

Creation method of decision tree in industrial Ethernet fault diagnosis method Download PDF

Info

Publication number
CN104506340A
CN104506340A CN201410677493.2A CN201410677493A CN 104506340 A
Authority
CN
China
Prior art keywords
decision
decision tree
attribute
node
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410677493.2A
Other languages
Chinese (zh)
Inventor
孟瑾
吴雪芹
王德吉
张乾
蔡保国
孟霞
刘博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Tobacco Henan Industrial Co Ltd
Original Assignee
China Tobacco Henan Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Tobacco Henan Industrial Co Ltd filed Critical China Tobacco Henan Industrial Co Ltd
Priority to CN201410677493.2A priority Critical patent/CN104506340A/en
Publication of CN104506340A publication Critical patent/CN104506340A/en
Pending legal-status Critical Current

Links

Landscapes

  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The invention discloses a method for creating a decision tree in an industrial Ethernet fault diagnosis method. The method comprises the following steps: dividing the decision tree into decision attribute nodes, attribute value branches and leaf nodes, wherein a decision attribute node is the set of decision attributes used for classification, an attribute value branch is the set of attribute values by which the samples are further divided according to a decision attribute, and a leaf node is the set of decision or classification results; constructing the decision tree according to a greedy algorithm; selecting the class label attribute and the decision attribute set of the decision tree according to the practical needs of the user; and verifying and correcting the constructed decision tree. With this method, effective inductive decision-making and data support can be provided for industrial Ethernet fault diagnosis, ensuring accurate implementation and running of industrial Ethernet fault diagnosis.

Description

Method for creating a decision tree in an industrial Ethernet fault diagnosis method
Technical field
The present invention relates to the technical field of industrial Ethernet control systems, and in particular to a method for creating a decision tree used in an industrial Ethernet fault diagnosis method.
Background technology
PROFINET, released by the PROFIBUS international organization (PROFIBUS International, PI), is the new-generation automation bus standard based on industrial Ethernet technology. As a strategic technological innovation, PROFINET provides the automation communication field with a complete networking solution covering current hot topics such as real-time Ethernet, motion control, distributed automation, fail-safe operation and network security. Moreover, as a cross-vendor technology, it is fully compatible with industrial Ethernet and existing fieldbus technologies (such as PROFIBUS), protecting existing investment.
Over the past several years, the scale of industrial networks has grown explosively. Network applications have reached every corner of production and become indispensable infrastructure. With this growing dependence on networks, higher requirements are placed on network reliability: first, a stable, efficient and secure network environment; second, when a network fault occurs, its cause must be detected and repaired in a timely manner. Network fault diagnosis is therefore of great significance for keeping a network healthy. Under the current network environment, however, network fault diagnosis faces unprecedented difficulties, mainly in the following respects. Controller networks have developed enormously, both in scale and in network complexity and service diversity; the fault relationships of a large-scale network are intricate, and the correspondence between fault causes and fault symptoms is fuzzy, which greatly increases the difficulty of fault diagnosis.
The complexity of network equipment also increases the difficulty of fault diagnosis. This complexity has two aspects: first, new network devices are constantly released, with more and more functions and ever-growing complexity; second, equipment suppliers are numerous, and product specifications and standards are not unified.
With the widespread use of PROFINET, controller technology and network communication technology have developed rapidly, and new digital communication networks not only carry multiple types of service traffic but also adopt a variety of converged transmission technologies. The continual adoption of new networks places ever higher requirements on network fault diagnosis. Because of the difficulties described above, traditional fault diagnosis performed manually by network experts can no longer meet the demand. Modern networks call for intelligent fault diagnosis technology that automates network fault diagnosis and frees people from heavy diagnostic work.
Intelligent network fault diagnosis technology faces difficulties in the following four respects:
First, fault discovery: the uncertainty of when network faults occur and the dynamic changes of network hardware and software architecture limit the applicability of expert knowledge.
Second, fault location: a fault in one device can affect many devices or subsystems connected to it and can even paralyze the network; this phenomenon is called fault correlation.
Third, fault detection: conventional fault detection methods require mathematical models, yet the complexity and accuracy of such models make it difficult to meet the real-time requirements of high-speed networks, while simplified models in turn give unsatisfactory results in practice.
Fourth, fault representation: owing to the diversity and continuous evolution of network applications, no clear and definite function can at present represent all application-layer faults.
Summary of the invention
The object of the present invention is to provide a method for creating a decision tree in an industrial Ethernet fault diagnosis method, which can provide effective inductive decision-making and data support for industrial Ethernet fault diagnosis and ensure that industrial Ethernet fault diagnosis is implemented and operated accurately.
The technical solution adopted by the present invention is as follows:
A method for creating a decision tree in an industrial Ethernet fault diagnosis method, wherein the decision tree comprises decision attribute nodes, attribute value branches and leaf nodes; a decision attribute node is the set of decision attributes used for classification, an attribute value branch is the set of attribute values by which the samples are further divided according to a decision attribute, and a leaf node is the set of decision or classification results (a minimal sketch of these node types is given after step C2 below);
The method for creating the decision tree comprises the following steps:
A: first, the class label attribute and the decision attribute set of the decision tree are selected according to the actual needs of the user; the class label attribute is a particular attribute chosen according to the user's actual needs, and the decision attribute set is the set of attributes selected from all attributes other than the class label attribute;
B: the decision tree is constructed according to a greedy algorithm, which works in a top-down, recursive, divide-and-conquer manner; the steps of the greedy algorithm are as follows:
B1: the tree starts as a single node representing the training samples;
B2: if the training samples all belong to the same class, this node becomes a leaf node and is labelled with that class; otherwise, the attribute with the strongest classification capability is selected as the current node of the decision tree;
B3: according to the different values of the current node's attribute, the training sample data set is divided into several subsets, and each value forms a branch;
B4: for each subset obtained in step B3, step B3 is repeated, recursively forming the decision tree on each sample partition;
B5: the recursive partitioning stops if and only if one of the following conditions holds:
(1) all samples of the given node belong to the same class.
(2) no attributes remain that can be used to further divide the samples; in this case, majority voting is used: the given node is converted into a leaf, the class with the largest number of tuples among the samples is used as the class label, and the class distribution of the node's samples may also be stored.
(3) if a branch has no samples, a leaf node is created and labelled with the majority class of the samples.
C: after the decision tree is constructed in step B, the completed decision tree is verified and revised;
C1: data from a new training sample set are used to check the preliminary rules generated by the constructed decision tree;
C2: branches that impair prediction accuracy are pruned, completing the revision.
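The node types described above can be illustrated with a minimal data structure. The following sketch is not taken from the patent; the Python representation and the names DecisionNode and Leaf are assumptions chosen purely for illustration: a decision attribute node stores the attribute it tests and one branch per attribute value, and a leaf node stores the decision or classification result.

# Minimal sketch (illustrative only) of the tree structure described above.
from dataclasses import dataclass, field
from typing import Dict, Union

@dataclass
class Leaf:
    # Leaf node: stores the decision / classification result (e.g. a fault class).
    label: str

@dataclass
class DecisionNode:
    # Decision attribute node: the attribute tested at this node.
    attribute: str
    # Attribute value branches: one subtree per attribute value.
    branches: Dict[str, Union["DecisionNode", Leaf]] = field(default_factory=dict)

# Example: a root node that tests whether the configuration is correct.
root = DecisionNode("configuration_correct", {
    "yes": Leaf("security problem"),
    "no": Leaf("normal"),
})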
In the present invention, the decision tree is composed of decision attribute nodes, attribute value branches and leaf nodes, where a decision attribute node is the set of decision attributes used for classification, an attribute value branch is the set of attribute values by which the samples are further divided according to a decision attribute, and a leaf node is the set of decision or classification results; the class label attribute and the decision attribute set of the decision tree are selected according to the actual needs of the user, the decision tree is constructed according to a greedy algorithm, and the constructed decision tree is then verified and revised. This method provides effective inductive decision-making and data support for industrial Ethernet fault diagnosis and ensures that industrial Ethernet fault diagnosis is implemented and operated accurately.
Brief description of the drawings
Fig. 1 is a flow chart of decision tree creation according to the present invention;
Fig. 2 is a schematic diagram of decision tree construction according to the present invention.
Embodiment
As shown in Figs. 1 and 2, the decision tree of the present invention comprises decision attribute nodes, attribute value branches and leaf nodes; a decision attribute node is the set of decision attributes used for classification, an attribute value branch is the set of attribute values by which the samples are further divided according to a decision attribute, and a leaf node is the set of decision or classification results;
The method for creating the decision tree comprises the following steps:
A: first, the class label attribute and the decision attribute set of the decision tree are selected according to the actual needs of the user; the class label attribute is a particular attribute chosen according to the user's actual needs, and the decision attribute set is the set of attributes selected from all attributes other than the class label attribute;
B: the decision tree is constructed according to a greedy algorithm, which works in a top-down, recursive, divide-and-conquer manner; the steps of the greedy algorithm are as follows:
B1: the tree starts as a single node representing the training samples;
B2: if the training samples all belong to the same class, this node becomes a leaf node and is labelled with that class; otherwise, the attribute with the strongest classification capability is selected as the current node of the decision tree;
B3: according to the different values of the current node's attribute, the training sample data set is divided into several subsets, and each value forms a branch;
B4: for each subset obtained in the previous step, step B3 is repeated, recursively forming the decision tree on each sample partition;
B5: the recursive partitioning stops if and only if one of the following conditions holds:
(1) all samples of the given node belong to the same class.
(2) no attributes remain that can be used to further divide the samples; in this case, majority voting is used: the given node is converted into a leaf, the class with the largest number of tuples among the samples is used as the class label, and the class distribution of the node's samples may also be stored.
(3) if a branch has no samples, a leaf node is created and labelled with the majority class of the samples.
C: after the decision tree is constructed in step B, the completed decision tree is verified and revised;
C1: data from a new training sample set are used to check the preliminary rules generated by the constructed decision tree;
C2: branches that impair prediction accuracy are pruned, completing the revision.
A decision tree is made up of internal decision attribute nodes, attribute value branches and leaf nodes that store the results, so its reasoning process is the process of solving a problem with the knowledge contained inside it. That is, decision tree reasoning starts from the root node and, by repeatedly comparing decision attributes and their values at the internal nodes, determines the branch along which the tree is extended downward (i.e. searched) until a leaf node is finally reached and the desired conclusion is obtained; at that point the reasoning process of the decision tree ends. In fact, decision tree reasoning is simply the process of traversing the decision tree with a depth-first strategy: as soon as a leaf node is reached, the reasoning (or traversal) process ends.
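The depth-first reasoning process just described can be sketched as follows. This is a minimal illustration that reuses the DecisionNode/Leaf structure sketched earlier and assumes a sample is given as a dictionary of attribute values; none of these names come from the patent itself.

# Sketch of decision tree reasoning: starting from the root, compare the decision
# attribute's value at each internal node and follow the matching branch until a
# leaf node ends the traversal.
def classify(node, sample):
    while isinstance(node, DecisionNode):
        value = sample[node.attribute]
        node = node.branches[value]
    return node.label  # the conclusion stored at the leaf

# Example with the small tree above:
# classify(root, {"configuration_correct": "no"})  ->  "normal"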
The purpose of decision tree learning is to summarize, from a large number of examples, knowledge represented in the form of a decision tree; the decision tree learning process can therefore be regarded as a knowledge acquisition process. In this way decision tree learning and knowledge acquisition are connected: by converting knowledge acquisition into a decision tree learning problem, knowledge can be acquired automatically, and the core of decision tree learning is the learning algorithm.
Knowledge representation is a convention for describing expertise that turns human knowledge into a data structure a machine can process. A good knowledge representation form not only improves the validity and operating efficiency of knowledge storage, but also improves the reasoning efficiency of an intelligent system. The internal decision attribute nodes, attribute value branches and leaf nodes of a decision tree form a tree-shaped data structure. Once a decision tree has been generated by a decision tree learning algorithm, it can classify unknown examples or perform decision analysis. A learned decision tree can therefore be regarded as containing certain knowledge; in other words, the decision tree has the ability to express knowledge. The internal nodes of the decision tree are sets of attributes, the branches are sets of attribute values, and the leaf nodes are sets of decision or classification results. The decision tree uses attributes and their values to represent the premise part of a piece of knowledge and uses the leaf nodes to represent the conclusion part, thereby expressing expertise in the form of a decision tree. A single decision tree classification rule is a piece of knowledge that determines a fault classification decision, that is, the type of fault can be judged on the basis of this knowledge. This is precisely the basis for building the knowledge base of a network fault diagnosis expert system with a decision tree.
To illustrate the decision tree knowledge representation method, a group of fault judgement rules is given as an example.
Rule 1: if the host network card works properly and the host configuration is correct and the host CPU utilization > 90, then the fault is a host system fault.
Rule 2: if the host network card works abnormally and the host network card cannot receive packets, then the fault is a host network hardware fault.
In this way, the decision rule set becomes a table in a relational database, the logical relations between the table's condition attributes and decision attributes are embodied by a rule dictionary, and each decision rule becomes a data record.
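As a hedged illustration of this storage scheme, the sketch below records Rules 1 and 2 as rows of a relational table; the table name, column names and the use of SQLite are assumptions made for illustration and are not specified by the patent.

# Illustrative only: each decision rule becomes one data record in a relational table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE rule_table (
    rule_id INTEGER PRIMARY KEY,
    nic_ok TEXT,               -- condition: host network card works properly
    config_ok TEXT,            -- condition: host configuration is correct
    cpu_over_90 TEXT,          -- condition: host CPU utilization > 90
    can_receive_packets TEXT,  -- condition: network card can receive packets
    fault TEXT                 -- decision attribute: diagnosed fault
)""")
# Rule 1: NIC ok, configuration correct, CPU utilization > 90 -> host system fault.
conn.execute("INSERT INTO rule_table VALUES (1, 'yes', 'yes', 'yes', NULL, 'host system fault')")
# Rule 2: NIC abnormal, cannot receive packets -> host network hardware fault.
conn.execute("INSERT INTO rule_table VALUES (2, 'no', NULL, NULL, 'no', 'host network hardware fault')")
conn.commit()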
Therefore, studying a decision-tree-based knowledge acquisition method is in fact studying the decision tree learning algorithm. The following discusses using the learning algorithm to generate a decision tree and thereby acquire knowledge automatically, i.e. machine learning.
Decision tree construction can be carried out in two steps. The first step is decision tree generation: a decision tree is generated from the training sample set. In general, the training sample data set is a data set assembled from historical data according to actual needs, with a certain degree of completeness, intended for data analysis and processing. The second step is decision tree pruning: the decision tree generated in the previous stage is tested, corrected and revised. The purpose of this step is mainly to use the data in a new sample data set (called the test data set) to check the preliminary rules produced in the generation step and to prune the branches that impair prediction accuracy.
In the decision tree generation process, the input is the training sample data set and the output is the decision tree. A decision tree comprises three kinds of elements: nodes, branches and leaves. Each decision node of the tree corresponds to a decision attribute (test attribute) used for classification, each branch corresponds to the values by which that attribute further divides the samples, and each leaf represents a class or a class distribution. First, the class label attribute and the decision attribute set of the decision tree are selected according to the actual needs of the user, where the decision attribute set is the set of attributes selected from the candidate attributes (all attributes other than the class label attribute); then construction of the decision tree begins. The basic algorithm of decision tree induction is a greedy algorithm, i.e. the decision tree is constructed in a top-down, recursive, divide-and-conquer manner. The algorithm is described as follows (a code sketch is given after step 6 below):
Step 1: the tree starts as a single node representing the training samples.
Step 2: if the samples all belong to the same class, this node becomes a leaf and is labelled with that class.
Step 3: otherwise, the algorithm selects the attribute with the strongest classification capability as the current node of the decision tree.
Step 4: according to the different values of the current decision node's attribute, the training sample data set is divided into several subsets; each value forms a branch, so as many branches are formed as there are values.
Step 5: for each subset obtained in the previous step, the previous steps are repeated, recursively forming the decision tree on each sample partition. Once an attribute has appeared at a node, it need not be considered again in that node's descendants.
Step 6: the recursive partitioning stops if and only if one of the following conditions holds:
(1) all samples of the given node belong to the same class.
(2) no attributes remain that can be used to further divide the samples. In this case, majority voting is used: the given node is converted into a leaf, the class with the largest number of tuples among the samples is used as the class label, and the class distribution of the node's samples may also be stored.
(3) if a branch has no samples, a leaf is created and labelled with the majority class of the samples.
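The greedy, top-down, recursive construction of Steps 1 to 6 can be sketched as follows. The patent only states that the attribute with the strongest classification capability is chosen; the sketch assumes an ID3-style information-gain criterion for that choice, reuses the DecisionNode/Leaf structure sketched earlier, and represents each sample as a dictionary of attribute values. All of these are illustrative assumptions, not the patent's own definitions.

# Sketch of greedy decision tree construction (ID3-style attribute selection assumed).
import math
from collections import Counter

def entropy(samples, class_attr):
    counts = Counter(s[class_attr] for s in samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def info_gain(samples, attr, class_attr):
    total = len(samples)
    remainder = 0.0
    for value in {s[attr] for s in samples}:
        subset = [s for s in samples if s[attr] == value]
        remainder += len(subset) / total * entropy(subset, class_attr)
    return entropy(samples, class_attr) - remainder

def build_tree(samples, attributes, class_attr):
    classes = [s[class_attr] for s in samples]
    majority = Counter(classes).most_common(1)[0][0]
    if len(set(classes)) == 1:          # stop condition (1): only one class remains
        return Leaf(classes[0])
    if not attributes:                  # stop condition (2): no attributes left, majority vote
        return Leaf(majority)
    # Step 3: choose the attribute with the strongest classification capability.
    best = max(attributes, key=lambda a: info_gain(samples, a, class_attr))
    node = DecisionNode(best)
    remaining = [a for a in attributes if a != best]   # Step 5: an attribute is used once per path
    for value in {s[best] for s in samples}:           # Step 4: one branch per attribute value
        subset = [s for s in samples if s[best] == value]
        # Stop condition (3): an empty branch would become a majority-class leaf.
        node.branches[value] = build_tree(subset, remaining, class_attr) if subset else Leaf(majority)
    return node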
Classification rules then need to be extracted from the decision tree, which generally takes two steps: first simple rules are obtained, and then the attributes of the generated rules are simplified.
1. Obtaining simple rules
Classification rules can easily be extracted from the generated decision tree and represented in if-then form. One rule is created for each path from the root to a leaf: the conjunction of the attribute-value pairs along the given path forms the rule antecedent (if part), and the leaf node, which contains the class prediction, forms the rule consequent (then part). If-then rules are easy to understand, especially when the given tree is very large.
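The path-to-rule extraction just described can be sketched as follows, again reusing the DecisionNode/Leaf structure assumed earlier; the function name and the textual if-then formatting are illustrative choices rather than anything defined by the patent.

# Sketch: one if-then rule per root-to-leaf path. The conjunction of the
# attribute-value pairs on the path forms the antecedent (if part) and the
# leaf's class prediction forms the consequent (then part).
def extract_rules(node, conditions=()):
    if isinstance(node, Leaf):
        antecedent = " and ".join(f"{a} = {v}" for a, v in conditions) or "true"
        return [f"if {antecedent} then CLASS {node.label}"]
    rules = []
    for value, child in node.branches.items():
        rules.extend(extract_rules(child, conditions + ((node.attribute, value),)))
    return rules

# Example: extract_rules(tree) yields one readable rule per leaf of the tree.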
2. Simplifying rule attributes
The simple rules obtained directly from the decision tree generally contain many irrelevant attributes; as long as the rules' prediction performance is not affected, those unnecessary conditions should be deleted as far as possible.
Suppose the rule has the form W:
if C then CLASS D
and the simplified rule has the form R:
if C' then CLASS D
where C' is the form obtained by deleting condition Q from C. The examples covered by rule R can then be divided into the following four parts: those satisfying condition C and belonging to class D; those satisfying condition C and belonging to other classes; those satisfying condition C' but not condition Q and belonging to class D; and those satisfying condition C' but not condition Q and belonging to other classes. These four kinds of examples are denoted Y1, F1, Y2 and F2 respectively. Rule W covers Y1+F1 examples, of which F1 are misjudged; rule R covers Y1+F1+Y2+F2 examples, of which F1+F2 are misjudged. The misjudgement probability of rule W is therefore estimated as U_CF(F1, Y1+F1), and that of rule R as U_CF(F1+F2, Y1+F1+Y2+F2). If U_CF(F1, Y1+F1) >= U_CF(F1+F2, Y1+F1+Y2+F2), condition Q can be deleted from condition C.
Obtaining the optimal set of rule antecedents is an important problem. One greedy search method deletes, at each step, the condition whose removal affects prediction performance least from the condition set; if the misjudgement probability decreases after the condition is deleted, the process continues. If the misjudgement probability increases after the deletion, the condition is not deleted and the whole simplification process terminates.
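A sketch of this greedy simplification follows. The patent does not define U_CF; the sketch assumes it is the upper limit of a binomial confidence interval for the true error rate (as in C4.5-style pessimistic pruning), found here by bisection, and it assumes a rule is a list of (attribute, value) conditions with a class label, evaluated against the training samples. All of these are illustrative assumptions.

# Sketch of greedy rule simplification using a pessimistic error estimate U_CF.
from math import comb

def ucf(errors, covered, cf=0.25):
    # Assumed U_CF: the error rate p at which observing <= errors mistakes among
    # covered examples has probability cf (upper binomial confidence limit).
    if covered == 0:
        return 1.0
    lo, hi = 0.0, 1.0
    for _ in range(60):   # bisection on p
        p = (lo + hi) / 2
        tail = sum(comb(covered, k) * p**k * (1 - p)**(covered - k)
                   for k in range(errors + 1))
        lo, hi = (p, hi) if tail > cf else (lo, p)
    return (lo + hi) / 2

def rule_error(conditions, label, samples, class_attr):
    covered = [s for s in samples if all(s[a] == v for a, v in conditions)]
    errors = sum(1 for s in covered if s[class_attr] != label)
    return ucf(errors, len(covered))

def simplify_rule(conditions, label, samples, class_attr):
    conditions = list(conditions)
    while conditions:
        current = rule_error(conditions, label, samples, class_attr)
        # Try deleting the condition Q whose removal affects prediction least.
        best_q = min(conditions, key=lambda q: rule_error(
            [c for c in conditions if c != q], label, samples, class_attr))
        best_err = rule_error([c for c in conditions if c != best_q], label, samples, class_attr)
        if best_err <= current:   # estimate does not get worse: delete Q and continue
            conditions.remove(best_q)
        else:                     # estimate gets worse: keep Q and stop simplifying
            break
    return conditions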
After the decision tree has been converted into rules, the rules, being understandable, can form the basis of an expert system. Rule pruning can provide a higher accuracy rate than tree pruning algorithms, because pruning a rule is equivalent to cutting only a single leaf node, which cannot be done when pruning the tree itself.
To verify the effectiveness of constructing a decision tree for acquiring system knowledge, 7 attributes from the network device information are chosen to form the fault identification parameter set A, where A1 denotes the network card status, A2 denotes whether the configuration is correct, A3 denotes the CPU utilization, A4 denotes the disk utilization, A5 denotes the network card packet-receiving error rate, A6 denotes whether the device has restarted, and A7 denotes whether the network card can capture packets. A total of 80 sample instances are used to build the fault decision tree, of which 20 are chosen as test data. The chosen sample data are shown in Table 4.1.
Table 4.1 Sample data
A1 A2 A3 A4 A5 A6 A7 Fault
1 0 90 29 70 Y N Normal
0 0 80 80 20 N N Version update
1 0 70 67 10 Y Y Configuration error
0 0 70 67 10 Y N Configuration error
Finally, the attribute "whether the configuration is correct" becomes the root of the decision tree, and its values divide the samples into two parts: the left subtree and the right subtree. On each of the two branches, the same feature selection algorithm is then used to recursively construct the child nodes and finally the leaf nodes; the decision tree structure obtained here is fairly simple. When the data sample becomes very large and the fault categories are also very rich, a more complex decision tree is formed, and the rules drawn from it are better suited to fault identification.
Traversing the whole decision tree from the root, the 7 classification rules obtained are as follows:
If the configuration is correct and the network card cannot capture packets,
then hardware fault;
If the configuration is correct and the network card can capture packets,
then security problem;
If the configuration is incorrect, the network card cannot capture packets, and the CPU utilization <= 50,
then normal;
If the configuration is incorrect, the network card cannot capture packets, the CPU utilization > 50, and the device has not restarted,
then version update;
If the configuration is incorrect, the network card cannot capture packets, the CPU utilization > 50, and the device has restarted,
then hardware fault;
If the configuration is incorrect, the network card can capture packets, and the network card packet-receiving error rate <= 20,
then configuration error;
If the configuration is incorrect, the network card can capture packets, and the network card packet-receiving error rate > 20,
then normal;
These rules embody the characteristics of the faults. Finally, the rules are stored in the knowledge base, which is used to provide a decision-making basis for fault classification and to give the fault cause.
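To make these rules concrete, the sketch below transcribes the 7 extracted classification rules into a single Python function; the parameter names (config_ok, can_capture, cpu_util, recv_error_rate, restarted) are assumed for illustration, while the thresholds and conclusions follow the rules listed above.

# Illustrative transcription of the 7 extracted classification rules.
def diagnose(config_ok, can_capture, cpu_util, recv_error_rate, restarted):
    if config_ok:
        # Rules 1-2: configuration correct.
        return "hardware fault" if not can_capture else "security problem"
    if not can_capture:
        # Rules 3-5: configuration incorrect, network card cannot capture packets.
        if cpu_util <= 50:
            return "normal"
        return "version update" if not restarted else "hardware fault"
    # Rules 6-7: configuration incorrect, network card can capture packets.
    return "configuration error" if recv_error_rate <= 20 else "normal"

# Example:
# diagnose(config_ok=False, can_capture=False, cpu_util=40, recv_error_rate=0, restarted=False)
# -> "normal"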

Claims (1)

1. A method for creating a decision tree in an industrial Ethernet fault diagnosis method, characterized in that: the decision tree comprises decision attribute nodes, attribute value branches and leaf nodes; a decision attribute node is the set of decision attributes used for classification, an attribute value branch is the set of attribute values by which the samples are further divided according to a decision attribute, and a leaf node is the set of decision or classification results;
the method for creating the decision tree comprises the following steps:
A: first, the class label attribute and the decision attribute set of the decision tree are selected according to the actual needs of the user; the class label attribute is a particular attribute chosen according to the user's actual needs, and the decision attribute set is the set of attributes selected from all attributes other than the class label attribute;
B: the decision tree is constructed according to a greedy algorithm, which works in a top-down, recursive, divide-and-conquer manner; the steps of the greedy algorithm are as follows:
B1: the tree starts as a single node representing the training samples;
B2: if the training samples all belong to the same class, this node becomes a leaf node and is labelled with that class; otherwise, the attribute with the strongest classification capability is selected as the current node of the decision tree;
B3: according to the different values of the current node's attribute, the training sample data set is divided into several subsets, and each value forms a branch;
B4: for each subset obtained in step B3, step B3 is repeated, recursively forming the decision tree on each sample partition;
B5: the recursive partitioning stops if and only if one of the following conditions holds:
(1) all samples of the given node belong to the same class;
(2) no attributes remain that can be used to further divide the samples; in this case, majority voting is used: the given node is converted into a leaf, the class with the largest number of tuples among the samples is used as the class label, and the class distribution of the node's samples may also be stored;
(3) if a branch has no samples, a leaf node is created and labelled with the majority class of the samples;
C: after the decision tree is constructed in step B, the completed decision tree is verified and revised;
C1: data from a new training sample set are used to check the preliminary rules generated by the constructed decision tree;
C2: branches that impair prediction accuracy are pruned, completing the revision.
CN201410677493.2A 2014-11-21 2014-11-21 Creation method of decision tree in industrial Ethernet fault diagnosis method Pending CN104506340A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410677493.2A CN104506340A (en) 2014-11-21 2014-11-21 Creation method of decision tree in industrial Ethernet fault diagnosis method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410677493.2A CN104506340A (en) 2014-11-21 2014-11-21 Creation method of decision tree in industrial Ethernet fault diagnosis method

Publications (1)

Publication Number Publication Date
CN104506340A true CN104506340A (en) 2015-04-08

Family

ID=52948055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410677493.2A Pending CN104506340A (en) 2014-11-21 2014-11-21 Creation method of decision tree in industrial Ethernet fault diagnosis method

Country Status (1)

Country Link
CN (1) CN104506340A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101609986A (en) * 2008-06-20 2009-12-23 上海申瑞电力科技股份有限公司 Multilevel joint coordination automatic voltage control method based on decision tree
CN101615789A (en) * 2008-06-23 2009-12-30 上海申瑞电力科技股份有限公司 Method for estimating tracking state of wide area measurement system
CN101752866A (en) * 2008-12-10 2010-06-23 上海申瑞电力科技股份有限公司 Automatic heavy-load equipment early warning implementation method based on decision tree

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
吕俊: "Research on a Decision Support System for Power Equipment Fault Diagnosis and Maintenance", China Master's Theses Full-text Database, Engineering Science and Technology II *
石金彦 et al.: "Application of a Decision-Tree-Based Data Mining Method in Fault Diagnosis", Water Conservancy & Electric Power Machinery *
陶维, 王海涛: "An Optimized Algorithm Based on the ID3 Decision Tree", Techniques of Automation and Applications *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106600163A (en) * 2016-12-26 2017-04-26 中电长城(长沙)信息技术有限公司 Financial self-service terminal fault diagnosis method based on decision tree learning algorithm and system thereof
CN106682421A (en) * 2016-12-28 2017-05-17 湖南坤宇网络科技有限公司 Boiler central-air-pipe clogging early-warning method based on decision-making tree system
CN106710163A (en) * 2016-12-28 2017-05-24 湖南坤宇网络科技有限公司 Boiler air pressure early warning method based on decision-making tree system
CN108989075A (en) * 2017-06-05 2018-12-11 中国移动通信集团广东有限公司 A kind of network failure locating method and system
CN107483267A (en) * 2017-09-19 2017-12-15 中国人民解放军防空兵学院 A kind of EIGRP routing failures recognition methods
CN107483267B (en) * 2017-09-19 2021-01-15 中国人民解放军防空兵学院 EIGRP route fault identification method
CN108805295A (en) * 2018-03-26 2018-11-13 海南电网有限责任公司电力科学研究院 A kind of method for diagnosing faults based on decision Tree algorithms
CN110716820A (en) * 2019-10-10 2020-01-21 厦门钛尚人工智能科技有限公司 Fault diagnosis method based on decision tree algorithm
CN110855480A (en) * 2019-11-01 2020-02-28 中盈优创资讯科技有限公司 Network fault cause analysis method and device
CN110855480B (en) * 2019-11-01 2023-01-13 中盈优创资讯科技有限公司 Network fault fixed factor analysis method and device
CN111242164A (en) * 2019-12-27 2020-06-05 天津幸福生命科技有限公司 Decision result determination method, device and equipment
CN113034264A (en) * 2020-09-04 2021-06-25 深圳大学 Method and device for establishing customer loss early warning model, terminal equipment and medium

Similar Documents

Publication Publication Date Title
CN104506340A (en) Creation method of decision tree in industrial Ethernet fault diagnosis method
CN104506338A (en) Fault diagnosis expert system based on decision tree for industrial Ethernet network
CN106326585B (en) Prediction analysis method and device based on Bayesian Network Inference
CN106662854B (en) The method and system of the configuration of device for control system
CN110569867A (en) Decision tree algorithm-based power transmission line fault reason distinguishing method, medium and equipment
CN105335752A (en) Principal component analysis multivariable decision-making tree-based connection manner identification method
CN109785180A Scene perception system and method for a digital twin workshop
CN107944705B (en) Full-end reliability calculation method for dividing communication communities based on modularity
CN112217674B (en) Alarm root cause identification method based on causal network mining and graph attention network
CN107729939B (en) CIM (common information model) expansion method and device for newly added power grid resources
CN114153980A (en) Knowledge graph construction method and device, inspection method and storage medium
Maier Identification of timed behavior models for diagnosis in production systems.
CN114281590A (en) Automatic generation method of fault tree
CN112416369A (en) Intelligent deployment method oriented to heterogeneous mixed environment
Xie et al. Logm: Log analysis for multiple components of hadoop platform
Mo et al. Network simplification and k-terminal reliability evaluation of sensor-cloud systems
CN107766943A (en) A kind of Knowledge Component automation exchange method under CPS environment
CN105718591B Rule-based and constraint-satisfaction qualitative spatial relation reasoning method
Lu et al. Zen-CC: An automated and incremental conformance checking solution to support interactive product configuration
CN117036060A (en) Vehicle insurance fraud recognition method, device and storage medium
CN109190204B (en) Complex mechanical product module division method based on complex network
CN108711074B (en) Service classification method, device, server and readable storage medium
CN106933844A Construction method of a reachability search index for large-scale RDF data
Lin et al. A method of satellite network fault synthetic diagnosis based on C4.5 algorithm and expert knowledge database
Karegar et al. Data-mining by probability-based patterns

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 16 Yulin South Road, Zhengdong New District, Zhengzhou, Henan, 450000

Applicant after: China Tobacco Henan Industrial Co., Ltd.

Address before: No. 29 Agricultural Road, Zhengzhou, Henan, 450000

Applicant before: China Tobacco Henan Industrial Co., Ltd.

COR Change of bibliographic data
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20150408