CN103886030B - Cost-sensitive decision-making tree based physical information fusion system data classification method - Google Patents

Cost-sensitive decision-making tree based physical information fusion system data classification method

Info

Publication number
CN103886030B
CN103886030B CN201410077291.4A CN201410077291A CN103886030B CN 103886030 B CN103886030 B CN 103886030B CN 201410077291 A CN201410077291 A CN 201410077291A CN 103886030 B CN103886030 B CN 103886030B
Authority
CN
China
Prior art keywords
decision tree
classification
data
decision
test sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410077291.4A
Other languages
Chinese (zh)
Other versions
CN103886030A (en
Inventor
陈志
朱超
岳文静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Hengtong Mdt InfoTech Ltd
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University filed Critical Nanjing Post and Telecommunication University
Priority to CN201410077291.4A priority Critical patent/CN103886030B/en
Publication of CN103886030A publication Critical patent/CN103886030A/en
Application granted granted Critical
Publication of CN103886030B publication Critical patent/CN103886030B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35Clustering; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • G06F16/285Clustering or classification

Abstract

Disclosed is a data classification method for an information-physical fusion system (cyber-physical system) based on a cost-sensitive decision tree. The method first uses physical units to acquire data, then transmits the acquired data to an information unit, and finally analyzes and processes the data in the information unit. The information unit operates as follows: 1) for each physical unit, it builds a decision tree from the sample data stored for that unit; 2) it tests each decision tree, records the ratio of the number of misclassified test samples to the total number of test samples, and deletes decision trees whose ratio is too large; 3) it classifies data with the remaining decision trees, combining them by means of the previously recorded misclassification ratios. A tree structure can be built conveniently and quickly to classify the data efficiently; the resulting trees are easy for a user to understand, and the user does not need extensive background knowledge.

Description

Information-physical fusion system data classification method based on a cost-sensitive decision tree
Technical field
The present invention relates to a decision-tree classification method based on cost-sensitive learning. It is mainly used to classify data efficiently in an information-physical fusion system (cyber-physical system, CPS) while keeping the loss, or total cost, produced by classification to a minimum, and belongs to the cross-disciplinary application of data mining to information-physical fusion system data.
Background technology
The information-physical fusion system is regarded as the third wave of world information technology, after the computer and the Internet. It can be understood as a highly efficient, networked, intelligent information system built on embedded devices: it possesses autonomous perception, autonomous judgment, self-regulation and self-governance, and can interconnect the virtual world with the real physical world collaboratively, making it a next-generation intelligent system. Functionally, an information-physical fusion system is mainly concerned with performance optimization and integrates the "3C" technologies of computation, communication and control. Information-physical fusion system technology has already attracted great attention from international industry and from many large international corporations; it is developing extremely rapidly and has been applied to traffic, medical care, energy and other important fields, with broad application prospects.
Data mining is an iterative process that searches large amounts of data for valuable, previously unknown information; it is the joint result of human and computer effort. It seeks a balance between the human expert's description of the problem and goal and the computer's search capability, in order to obtain the best result. Data mining is one of the fastest-growing fields in the computer industry. It was once merely a topic within computer science and statistics; today it has rapidly developed into an independent field. One of its greatest strengths is that its methods and techniques can be applied to a large variety of problem sets. Data mining is carried out on large data sets, so its main target market is the whole data warehouse, data mart and decision-support industry, including retail, manufacturing, telecommunications, medical care, insurance, transportation and other industries.
Classification is an important form of data analysis: it extracts models, called classifiers, that characterize significant data classes and predict class labels. Classification generally proceeds in two steps: first, a classification model is built from the given data; second, the accuracy of the model is assessed, and if it is acceptable, the model is used to classify new data. Most classification algorithms are memory-resident and assume a small amount of data. With the continuous development of modern technology, data mining research has built on this foundation to develop scalable classification and prediction techniques that can handle large, disk-resident data. Classification has wide applications, including fraud detection, target marketing, performance prediction, manufacturing and medical diagnosis.
A decision tree is a flowchart-like tree structure and a typical classification technique. It first processes the data, generating readable rules with an inductive algorithm and building the tree, and then uses the resulting decisions to analyze new data. In essence, a decision tree classifies data through a series of rules. In the late 1970s and early 1980s, the machine learning researcher J. Ross Quinlan developed the decision-tree algorithm known as the Iterative Dichotomiser 3 (ID3). Quinlan later proposed C4.5, a successor to ID3, which became a performance benchmark for new supervised learning algorithms. In 1984, a group of statisticians published "Classification and Regression Trees" (CART), which describes the generation of binary decision trees. Traditional decision-tree algorithms are mostly greedy, constructing the tree with a top-down, recursive divide-and-conquer method.
The prototype of the cost-sensitive learning (CSL) classification problem is medical diagnosis. In this problem, a doctor must weigh, for each patient, the probabilities of various diagnoses, the cost of each medical test and the expected effect of the test. Cost-sensitive learning mainly considers how to train a classifier when different classification errors lead to different penalties. In medicine, for example, the cost of misdiagnosing a sick person as healthy differs from the cost of misdiagnosing a healthy person as sick; in credit-card fraud detection, the cost of treating a fraudulent transaction as normal differs from the cost of treating a normal transaction as fraudulent. It is easy to see that although the probability of a misjudgment may be very small, failing to catch a single potential misjudgment can lead to a series of losses, so taking the final loss as the optimization target is more realistic.
Since cost-sensitive learning was first proposed, it has attracted the attention of many experts and many novel methods have been put forward. On the whole there are two basic lines of thought. One keeps the classical classification method unchanged and only applies some processing to the original data set, or wraps an extra layer of algorithm around the classical classification algorithm, so as to achieve the intended cost sensitivity. The other adds further factors to the classical classification algorithm itself in order to achieve cost-sensitive learning.
Content of the invention
Technical problem: the object of the present invention is to provide an information-physical fusion system data classification method based on a cost-sensitive decision tree. The method adds cost-sensitive considerations to the construction of the decision tree, so as to classify the data in an information-physical fusion system efficiently while reducing the loss, or total cost, incurred during classification to a minimum.
Technical scheme: in the information-physical fusion system data classification method based on a cost-sensitive decision tree of the present invention, the user first builds a number of decision trees in the information-physical fusion system and then combines these decision trees into a combination decision tree. Taking cost sensitivity into account, data is classified by computing the error rate of the combination decision tree.
The information-physical fusion system of the present invention consists of several physical units and one information unit. The physical units collect data; the information unit receives and stores the data transmitted by the physical units, and analyzes and processes it.
The information-physical fusion system data classification method based on a cost-sensitive decision tree comprises the following steps:
Step 1) The user divides the component units of the information-physical fusion system into multiple physical units and one information unit; the physical units collect data, and the information unit analyzes and processes the data;
Step 2) The user places training samples and test samples in the system in advance; each sample in the training set and test set includes a sample number, attribute names, the corresponding attribute values and a class label;
Step 3) The user starts each physical unit to collect training sample data;
Step 4) Each physical unit sends the collected training samples to the information unit, and the information unit stores the training sample data sent by the different physical units separately;
Step 5) From the separately stored sample data, the information unit builds a corresponding decision tree for each physical unit; the decision tree is a tree-shaped classifier, one of the classification algorithms of machine learning, and a classifier is a computer program whose function is to assign data automatically to known classes;
Step 6) The information unit places the resulting decision trees in a list, forming a decision tree list; the list is a set of data items arranged in linear order in a data structure, and the basic operations supported by this data structure include looking up, inserting and deleting elements;
Step 7) The information unit tests each decision tree. The specific steps are as follows:
Step 7.1) A group of prepared test samples is input into each of the established decision trees;
Step 7.2) Each test sample obtains a corresponding classification result through the decisions of the decision tree;
Step 7.3) The obtained classification result is compared with the class of the test sample itself; if they are the same the classification is correct, otherwise it is a classification error; the number of misclassified samples is recorded;
Step 7.4) If the ratio of the number of misclassified test samples to the total number of test samples exceeds 10%, the decision tree is deleted from the decision tree list; otherwise it is retained in the list; the ratio of the number of misclassified test samples to the total number of test samples is denoted w_i and recorded in the information unit, where i denotes the i-th decision tree in the decision tree list;
Step 8) After the above steps, if the decision tree list is empty, the decision tree with the smallest w_i among all the trees built is chosen as the final decision tree;
Step 9) If the decision tree list contains only one decision tree, that decision tree is the final decision tree;
Step 10) If the decision tree list contains two or more decision trees, the decision trees in the list are taken as a combination decision tree; the combination decision tree is a classifier composed of multiple decision trees, and each decision tree has a corresponding weight, namely the ratio of the number of misclassified test samples to the total number of test samples;
Step 11) The misclassification rate h of the combination decision tree is computed, and the class j that minimizes h is returned as the classification result of the data; h is computed as h = argmin_j ( (1/n) Σ_i w_i c_{i,j} ), where i denotes the i-th decision tree; j denotes a class; c_{i,j} indicates whether the classification result of the i-th decision tree is class j (c_{i,j} = 1 if it is, c_{i,j} = 0 otherwise); w_i is the ratio, recorded during the earlier testing, of the number of misclassified test samples to the total number of test samples for the i-th decision tree; n is the number of decision trees in the combination decision tree whose classification result is class j; and argmin selects the minimum value of (1/n) Σ_i w_i c_{i,j} as the final value of h. A minimal code sketch of this selection rule is given below.
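The following is a minimal Python sketch of the selection rule of step 11, assuming the per-tree predictions and the weights w_i come from steps 7 and 10; the function and argument names are choices made for this sketch, not names used in the patent.

```python
def classify_with_combination(predictions, weights):
    """Return the class j minimising h = (1/n) * sum_i(w_i * c_ij), where n is
    the number of trees whose prediction is j and c_ij is 1 only for those trees.

    predictions -- list of class labels, one per retained decision tree
    weights     -- list of w_i values (misclassified / total on the test set)
    """
    best_class, best_h = None, None
    for j in set(predictions):
        # the trees with c_ij = 1 are exactly those that predicted class j
        contributing = [w for w, p in zip(weights, predictions) if p == j]
        n = len(contributing)          # trees whose result is class j
        h = sum(contributing) / n      # (1/n) * sum_i w_i * c_ij
        if best_h is None or h < best_h:
            best_class, best_h = j, h
    return best_class, best_h
```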
Beneficial effects: when classifying information-physical fusion system data, the present invention uses the decision-tree algorithm, a classification algorithm from data mining, and introduces cost-sensitive processing, so as to classify the data in an information-physical fusion system efficiently while reducing the loss, or total cost, incurred during classification to a minimum. Specifically, the information-physical fusion system data classification method based on a cost-sensitive decision tree of the present invention has the following beneficial effects:
(1) Decision trees are easy to understand and implement. The user does not need much background knowledge during learning; after a brief explanation, the meaning expressed by a decision tree can be understood at once.
(2) Decision trees directly reflect the characteristics of the data, and the tree is easy for the user to observe and understand.
(3) For decision trees, data preparation is usually simple; discrete and continuous data can be handled at the same time, and feasible, effective results can be produced on large data sources within a relatively short time.
(4) Decision trees are easy to evaluate through testing, so the credibility of the model can be assessed. Given the observed data, the corresponding classification rules are easy to derive from the produced decision tree.
(5) The introduced cost-sensitive learning uses the training data set to train a diagnostic learning system with minimal misclassification cost; its ultimate aim is to minimize the misclassification cost and reduce the probability of misclassification. It is therefore no longer classical classification and prediction, but an extension of classical classification.
Brief description of the drawings
Fig. 1 is a schematic diagram of the decision tree building process;
Fig. 2 is a flow chart of the information-physical fusion system data classification method based on a cost-sensitive decision tree.
Specific embodiment
The present invention uses a decision-tree algorithm combined with cost sensitivity to optimize the classification of information-physical fusion system data and to reduce the loss caused by misclassification. The present invention is described in more detail below with reference to the drawings and embodiments.
The present invention is described concretely using weather sample data. Each weather sample includes a sample number, attribute names, the corresponding attribute values and a class label. The attributes are outlook (possible values sunny, overcast, rainy), temperature (possible values hot, mild, cool), humidity (possible values high, normal) and windy (possible values true, false); the class is play, i.e. whether to go out and play (possible classes yes, no). A sample can be represented, for example, as shown below.
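A minimal sketch of the sample format, assuming one Python dictionary per sample; the dictionary keys and the two rows shown are illustrative choices drawn from the attribute domains listed above, not data given in the patent.

```python
# Illustrative weather training samples (hypothetical values, one dict per sample).
weather_training_samples = [
    {"id": 1, "outlook": "sunny",    "temperature": "hot",  "humidity": "high",   "windy": "false", "play": "no"},
    {"id": 2, "outlook": "overcast", "temperature": "mild", "humidity": "normal", "windy": "true",  "play": "yes"},
]

weather_attributes = ["outlook", "temperature", "humidity", "windy"]
```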
1. The user divides the component units of the information-physical fusion system into multiple physical units and one information unit. The physical units collect weather data; the information unit analyzes and processes the data.
2. The user places weather training samples and weather test samples in the system in advance. Each sample in the weather training set and weather test set includes a sample number, attribute names, the corresponding attribute values and a class label.
3. The user starts each physical unit to collect weather training sample data.
4. Each physical unit sends the collected weather training samples to the information unit. The information unit stores the weather training sample data sent by the different physical units separately.
5. From the separately stored weather sample data, the information unit builds a corresponding decision tree for each physical unit. The decision tree is a tree-shaped classifier, one of the classification algorithms of machine learning: the root node contains all the input sample data, the leaf nodes are the classification results, and the remaining nodes represent tests on particular attributes. A classifier is a computer program whose function is to assign data automatically to known classes. Each decision tree is built as follows (a minimal code sketch is given after step 5.5):
5.1 Create a node as the root node of the decision tree.
5.2 Check whether the training set contains only one class. If so, mark the node as a leaf node and end the construction of the decision tree; otherwise, check whether the samples still have attributes.
5.3 If the samples have no attributes left, label the leaf node with the class that occurs most often in the training set and end the construction; otherwise, start looking for the best splitting rule.
5.4 Compute the information gain of each attribute, choose the attribute with the largest gain as the splitting attribute to create branches, and delete this attribute from the sample data. Information gain is a criterion that measures how much information an attribute brings to the classification (information here means the influence on the final classification result); the more information it brings (the bigger the influence), the more important the attribute. The information gain is computed from the entropy Info(D) = -Σ_i p_i log2(p_i), where D is the training set and p_i is the nonzero probability of belonging to class i in the training set; the gain of an attribute is the reduction in this entropy after splitting on that attribute. The splitting attribute is the attribute selected as the branching condition of the decision tree: if its values are discrete, the node usually has one branch per possible attribute value; if it is numeric, a binary split is usually produced by testing whether the value is greater than, or less than or equal to, some predefined constant.
5.5 Apply the operation of step 5.4 to each branch, repeating the checks of steps 5.2 and 5.3, until every branch ends in a leaf node.
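The code sketch announced in step 5 follows: a minimal Python rendering of the recursive construction in steps 5.1–5.5, assuming the sample format shown earlier (one dictionary per sample with a "play" class field). The function names are illustrative, and the splitting criterion is the information gain computed from the entropy Info(D) defined in step 5.4.

```python
import math
from collections import Counter

def entropy(samples):
    """Info(D) = -sum(p_i * log2(p_i)) over the classes present in D."""
    counts = Counter(s["play"] for s in samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(samples, attribute):
    """Reduction in entropy obtained by splitting the samples on one attribute."""
    total = len(samples)
    remainder = 0.0
    for value in set(s[attribute] for s in samples):
        subset = [s for s in samples if s[attribute] == value]
        remainder += (len(subset) / total) * entropy(subset)
    return entropy(samples) - remainder

def build_tree(samples, attributes):
    """Steps 5.1-5.5: recursive construction driven by information gain."""
    classes = [s["play"] for s in samples]
    # Step 5.2: only one class left -> leaf node
    if len(set(classes)) == 1:
        return classes[0]
    # Step 5.3: no attributes left -> leaf labelled with the majority class
    if not attributes:
        return Counter(classes).most_common(1)[0][0]
    # Step 5.4: choose the attribute with the largest information gain and split on it
    best = max(attributes, key=lambda a: information_gain(samples, a))
    remaining = [a for a in attributes if a != best]
    node = {"attribute": best, "branches": {}}
    # Step 5.5: build one branch per observed attribute value and recurse
    for value in set(s[best] for s in samples):
        subset = [s for s in samples if s[best] == value]
        node["branches"][value] = build_tree(subset, remaining)
    return node
```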
6. The information unit places the resulting decision trees in a list, forming a decision tree list. The list is a set of data items arranged in linear order in a data structure; the basic operations supported by this data structure include looking up, inserting and deleting elements.
7. The information unit tests each decision tree. The specific steps are as follows:
7.1 A group of prepared weather test samples is input into each of the established decision trees;
7.2 Each weather test sample obtains a corresponding classification result through the decisions of the decision tree;
7.3 The obtained classification result is compared with the class of the weather test sample itself; if they are the same the classification is correct, otherwise it is a classification error; the number of misclassified samples is recorded;
7.4 If the ratio of the number of misclassified weather test samples to the total number of weather test samples exceeds 10%, the decision tree is deleted from the decision tree list; otherwise it is retained in the list. The ratio of the number of misclassified test samples to the total number of test samples is denoted w_i and recorded in the information unit, where i denotes the i-th decision tree in the decision tree list. A code sketch of this test-and-prune step is given below.
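A minimal Python sketch of the testing and pruning of steps 7.1–7.4, together with the fallback of step 8, assuming trees produced by the build_tree sketch above; the helper names and the threshold parameter are choices made for this sketch (the 10% default follows step 7.4).

```python
def classify_one(tree, sample):
    """Follow the decisions of a single tree (as built by build_tree) for one sample."""
    while isinstance(tree, dict):
        tree = tree["branches"].get(sample[tree["attribute"]])
    return tree  # a class label such as "yes"/"no", or None for an unseen value

def error_ratio(tree, test_samples):
    """Steps 7.1-7.3: w_i = misclassified test samples / total test samples."""
    errors = sum(1 for s in test_samples if classify_one(tree, s) != s["play"])
    return errors / len(test_samples)

def prune_tree_list(trees, test_samples, threshold=0.10):
    """Step 7.4: keep only trees whose error ratio w_i does not exceed the threshold."""
    weights = [error_ratio(t, test_samples) for t in trees]
    kept = [(t, w) for t, w in zip(trees, weights) if w <= threshold]
    # Step 8: if every tree was pruned, fall back to the tree with the smallest w_i
    if not kept:
        kept = [min(zip(trees, weights), key=lambda tw: tw[1])]
    return kept  # (tree, w_i) pairs forming the combination decision tree
```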
8. Check whether the decision tree list is empty; if it is, choose the decision tree with the smallest w_i among all the trees built as the final decision tree.
9. If the decision tree list is not empty, check whether it contains only one decision tree; if so, that decision tree is the final decision tree. Otherwise, take the decision trees in the decision tree list as a combination decision tree. The combination decision tree is a classifier composed of multiple decision trees, each with a corresponding weight (here, the ratio of the number of misclassified test samples to the total number of test samples); the class of a data item is obtained by combining the classification results of all the decision trees.
10. Compute the misclassification rate h of the combination decision tree and return the class j that minimizes h as the classification result of the data. h is computed as h = argmin_j ( (1/n) Σ_i w_i c_{i,j} ), where i denotes the i-th decision tree; j denotes a class; c_{i,j} indicates whether the classification result of the i-th decision tree is class j (c_{i,j} = 1 if it is, c_{i,j} = 0 otherwise); w_i is the ratio, recorded during the earlier testing, of the number of misclassified test samples to the total number of test samples for the i-th decision tree; n is the number of decision trees in the combination decision tree whose classification result is class j; and argmin selects the minimum value of (1/n) Σ_i w_i c_{i,j} as the final value of h. A small numeric illustration is given below.
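As a worked numeric illustration of step 10, suppose (hypothetically) that three retained trees have test-error ratios w = 0.05, 0.08 and 0.02 and predict the classes yes, no and yes for a new weather sample. Then h(yes) = (0.05 + 0.02)/2 = 0.035 and h(no) = 0.08/1 = 0.08, so the class yes is returned. The same computation in Python:

```python
# Hypothetical weights and per-tree predictions for one new sample.
weights = [0.05, 0.08, 0.02]
predictions = ["yes", "no", "yes"]

for j in ("yes", "no"):
    contributing = [w for w, p in zip(weights, predictions) if p == j]
    h = sum(contributing) / len(contributing)   # (1/n) * sum_i w_i * c_ij
    print(j, round(h, 3))   # yes 0.035, no 0.08 -> class "yes" is returned
```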

Claims (1)

1. An information-physical fusion system data classification method based on a cost-sensitive decision tree, characterised in that the method comprises the following steps:
Step 1) the user divides the component units of the information-physical fusion system into multiple physical units and one information unit; the physical units collect data, and the information unit analyzes and processes the data;
Step 2) the user places training samples and test samples in the system in advance; each sample in the training set and test set includes a sample number, attribute names, the corresponding attribute values and a class label;
Step 3) the user starts each physical unit to collect training sample data;
Step 4) each physical unit sends the collected training samples to the information unit, and the information unit stores the training sample data sent by the different physical units separately;
Step 5) from the separately stored sample data, the information unit builds a corresponding decision tree for each physical unit; the decision tree is a tree-shaped classifier, one of the classification algorithms of machine learning, and a classifier is a computer program whose function is to assign data automatically to known classes;
Step 6) the information unit places the resulting decision trees in a list, forming a decision tree list; the list is a set of data items arranged in linear order in a data structure, and the basic operations supported by this data structure include looking up, inserting and deleting elements;
Step 7) the information unit tests each decision tree; the specific steps are as follows:
Step 7.1) a group of prepared test samples is input into each of the established decision trees;
Step 7.2) each test sample obtains a corresponding classification result through the decisions of the decision tree;
Step 7.3) the obtained classification result is compared with the class of the test sample itself; if they are the same the classification is correct, otherwise it is a classification error; the number of misclassified samples is recorded;
Step 7.4) if the ratio of the number of misclassified test samples to the total number of test samples exceeds 10%, the decision tree is deleted from the decision tree list; otherwise it is retained in the list; the ratio of the number of misclassified test samples to the total number of test samples is denoted w_i and recorded in the information unit, where i denotes the i-th decision tree in the decision tree list;
Step 8) after the above steps, if the decision tree list is empty, the decision tree with the smallest w_i is chosen as the final decision tree;
Step 9) if the decision tree list contains only one decision tree, that decision tree is the final decision tree;
Step 10) if the decision tree list contains two or more decision trees, the decision trees in the list are taken as a combination decision tree; the combination decision tree is a classifier composed of multiple decision trees, and each decision tree has a corresponding weight, namely the ratio of the number of misclassified test samples to the total number of test samples;
Step 11) the misclassification rate h of the combination decision tree is computed, and the class j that minimizes h is returned as the classification result of the data; h is computed as h = argmin_j ( (1/n) Σ_i w_i c_{i,j} ), where i denotes the i-th decision tree; j denotes a class; c_{i,j} indicates whether the classification result of the i-th decision tree is class j (c_{i,j} = 1 if it is, c_{i,j} = 0 otherwise); w_i is the ratio, recorded during testing, of the number of misclassified test samples to the total number of test samples for the i-th decision tree; n is the number of decision trees in the combination decision tree whose classification result is class j; and argmin selects the minimum value of (1/n) Σ_i w_i c_{i,j} as the final value of h.
CN201410077291.4A 2014-03-05 2014-03-05 Cost-sensitive decision-making tree based physical information fusion system data classification method Active CN103886030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410077291.4A CN103886030B (en) 2014-03-05 2014-03-05 Cost-sensitive decision-making tree based physical information fusion system data classification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410077291.4A CN103886030B (en) 2014-03-05 2014-03-05 Cost-sensitive decision-making tree based physical information fusion system data classification method

Publications (2)

Publication Number Publication Date
CN103886030A CN103886030A (en) 2014-06-25
CN103886030B true CN103886030B (en) 2017-01-25

Family

ID=50954922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410077291.4A Active CN103886030B (en) 2014-03-05 2014-03-05 Cost-sensitive decision-making tree based physical information fusion system data classification method

Country Status (1)

Country Link
CN (1) CN103886030B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657610B (en) * 2015-02-13 2017-11-17 南京邮电大学 A kind of information physical emerging system sequential logic robustness appraisal procedure
CN104778250B (en) * 2015-04-14 2017-09-29 南京邮电大学 Information physical emerging system data classification method based on genetic planning decision tree
CN104915608B (en) * 2015-05-08 2018-07-27 南京邮电大学 A kind of information physical emerging system secret protection type data classification method
CN106611181A (en) * 2016-05-30 2017-05-03 四川用联信息技术有限公司 Method for constructing cost-sensitive two-dimensional decision tree
CN108763221B (en) * 2018-06-20 2022-05-17 科大讯飞股份有限公司 Attribute name representation method and device
CN109358608A (en) * 2018-11-28 2019-02-19 广东电网有限责任公司 A kind of transformer state methods of risk assessment and device based on integrated study
CN110389952B (en) * 2019-06-06 2021-09-07 口碑(上海)信息技术有限公司 Dish data processing method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012108889A (en) * 2010-10-22 2012-06-07 Intec Inc Distributed database system and distributed data processing system
CN102999582A (en) * 2012-11-15 2013-03-27 南京邮电大学 Lightweight rule-based WoT (Web of Things) monitoring system
CN103092918A (en) * 2012-12-21 2013-05-08 东莞中国科学院云计算产业技术创新与育成中心 Method for construction and query of knowledge base of emergency aid decision-making system based on location information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001325276A (en) * 2000-05-12 2001-11-22 Internatl Business Mach Corp <Ibm> Information processing system, information processing method, computer program, storage medium and program transmitter

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012108889A (en) * 2010-10-22 2012-06-07 Intec Inc Distributed database system and distributed data processing system
CN102999582A (en) * 2012-11-15 2013-03-27 南京邮电大学 Lightweight rule-based WoT (Web of Things) monitoring system
CN103092918A (en) * 2012-12-21 2013-05-08 东莞中国科学院云计算产业技术创新与育成中心 Method for construction and query of knowledge base of emergency aid decision-making system based on location information

Also Published As

Publication number Publication date
CN103886030A (en) 2014-06-25

Similar Documents

Publication Publication Date Title
CN103886030B (en) Cost-sensitive decision-making tree based physical information fusion system data classification method
CN110597735B (en) Software defect prediction method for open-source software defect feature deep learning
CN104992184B (en) A kind of multiclass image classification method based on semi-supervised extreme learning machine
CN104573669A (en) Image object detection method
CN110188209A (en) Cross-module state Hash model building method, searching method and device based on level label
CN103412888B (en) A kind of point of interest recognition methods and device
CN108549954A (en) Risk model training method, risk identification method, device, equipment and medium
CN107506793A (en) Clothes recognition methods and system based on weak mark image
CN106897821A (en) A kind of transient state assesses feature selection approach and device
CN111008337B (en) Deep attention rumor identification method and device based on ternary characteristics
CN106874963B (en) A kind of Fault Diagnosis Method for Distribution Networks and system based on big data technology
CN111754345A (en) Bit currency address classification method based on improved random forest
CN107392241A (en) A kind of image object sorting technique that sampling XGBoost is arranged based on weighting
CN105654196A (en) Adaptive load prediction selection method based on electric power big data
CN111914162B (en) Method for guiding personalized learning scheme based on knowledge graph
CN111292195A (en) Risk account identification method and device
CN107918657A (en) The matching process and device of a kind of data source
CN112732921B (en) False user comment detection method and system
CN106649266A (en) Logical inference method for ontology knowledge
CN106407482B (en) A kind of network academic report category method based on multi-feature fusion
CN107977670A (en) Accident classification stage division, the apparatus and system of decision tree and bayesian algorithm
CN108762503A (en) A kind of man-machine interactive system based on multi-modal data acquisition
CN112756759A (en) Spot welding robot workstation fault judgment method
CN104850862B (en) A kind of sorting technique based on unit cost income sensitivity decision tree
CN107423697A (en) Activity recognition method based on non-linear fusion depth 3D convolution description

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20180705

Address after: 210003 A, block 1502, Nanjing University of Finances and Economics science and Technology Park, 128 North Railway Street, Nanjing, Jiangsu.

Patentee after: Nanjing Hengtong Mdt InfoTech Ltd

Address before: 210023 9 Wen Yuan Road, Ya Dong new town, Nanjing, Jiangsu.

Patentee before: Nanjing Post & Telecommunication Univ.