CN111709225A - Event cause and effect relationship judging method and device and computer readable storage medium - Google Patents
- Publication number
- CN111709225A (application number CN202010385693.6A)
- Authority
- CN
- China
- Prior art keywords
- event
- vector representation
- vector
- cause
- output
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F40/216 — Handling natural language data; natural language analysis; parsing using statistical methods
- G06F16/35 — Information retrieval of unstructured textual data; clustering; classification
- G06F18/241 — Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06N3/045 — Computing arrangements based on biological models; neural networks; combinations of networks
Abstract
The embodiments of the present application disclose a method and an apparatus for determining causal relationships between events, and a computer-readable storage medium. The method comprises: acquiring an original event corpus and preprocessing the event corpus, the preprocessing comprising event extraction and event labeling; obtaining an event vector representation from the events after event labeling; taking the event vector representation as input data and inputting it into a preset deep neural network model; and calculating the output result of the deep neural network model with a preset calculation function to obtain a prediction of the causal relationship among a plurality of events. The scheme of the embodiments facilitates the classification of causal events and improves the accuracy of the classification results.
Description
Technical Field
The present disclosure relates to information processing technologies, and more particularly, to a method, an apparatus, and a computer-readable storage medium for event cause and effect relationship determination.
Background
With the rapid development of artificial intelligence, the field of natural language processing has made great progress and advanced results keep emerging, yet using natural language processing to judge causal relationships between events, and thus to model how events influence one another, still needs improvement. Causal event determination can be applied in many fields. In finance, for example, the causal transmission of events can be used to evaluate how a financial risk event affects the business state of downstream companies, stock fluctuations, and the like, providing a valuable reference for analyzing a company's future development, effectively assisting decision-making, and reducing financial operating risk.
At present, causal relationships between events are extracted from text information mainly by computing word frequencies and constructing feature vectors from frequency statistics such as TF-IDF (term frequency–inverse document frequency), which are then used for classification and identification.
Such prior-art methods use only frequency information about how words within an event co-occur in the causal information. Frequency alone captures the sufficiency and necessity relations between words in a single dimension and is strongly affected by the word distribution inside the text data. Meanwhile, some words in a sentence contribute little to the causality computation and amount to noise; using every word in the sentence introduces redundant information and can even reduce classification accuracy.
Disclosure of Invention
The embodiment of the application provides a method and a device for judging causal relationship of events and a computer readable storage medium, which can help to classify causal events and improve the accuracy of classification results.
The embodiment of the application provides a method for judging a causal relationship of events, which can comprise the following steps:
acquiring an original event corpus and preprocessing the event corpus; the preprocessing comprises the following steps: event extraction and event labeling;
acquiring event vector representation according to the event subjected to event labeling;
taking the event vector representation as input data, and inputting the input data into a preset deep neural network model;
and calculating the output result of the deep neural network model through a preset calculation function so as to obtain the prediction of the causal relationship among a plurality of events.
The embodiment of the present application further provides an event cause and effect relationship determination device, which may include a processor and a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed by the processor, the method for determining event cause and effect relationship according to any one of the above-mentioned items is implemented.
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements any one of the above methods for determining cause and effect relationship of events.
Compared with the related art, the method acquires an original event corpus and preprocesses it, the preprocessing comprising event extraction and event labeling; obtains an event vector representation from the events after event labeling; takes the event vector representation as input data and inputs it into a preset deep neural network model; and calculates the output result of the deep neural network model with a preset calculation function to obtain a prediction of the causal relationship among a plurality of events. The scheme of the embodiments facilitates the classification of causal events and improves the accuracy of the classification results.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. Other advantages of the present application may be realized and attained by the instrumentalities and combinations particularly pointed out in the specification and the drawings.
Drawings
The accompanying drawings are included to provide an understanding of the present disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the examples serve to explain the principles of the disclosure and not to limit the disclosure.
FIG. 1 is a flowchart illustrating a method for determining cause and effect relationship of events according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a deep neural network model according to an embodiment of the present application;
fig. 3 is a block diagram illustrating an event cause and effect determination apparatus according to an embodiment of the present application.
Detailed Description
The present application describes embodiments, but the description is illustrative rather than limiting and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the embodiments described herein. Although many possible combinations of features are shown in the drawings and discussed in the detailed description, many other combinations of the disclosed features are possible. Any feature or element of any embodiment may be used in combination with or instead of any other feature or element in any other embodiment, unless expressly limited otherwise.
The present application includes and contemplates combinations of features and elements known to those of ordinary skill in the art. The embodiments, features and elements disclosed in this application may also be combined with any conventional features or elements to form a unique inventive concept as defined by the claims. Any feature or element of any embodiment may also be combined with features or elements from other inventive aspects to form yet another unique inventive aspect, as defined by the claims. Thus, it should be understood that any of the features shown and/or discussed in this application may be implemented alone or in any suitable combination. Accordingly, the embodiments are not limited except as by the appended claims and their equivalents. Furthermore, various modifications and changes may be made within the scope of the appended claims.
Further, in describing representative embodiments, the specification may have presented the method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. Other orders of steps are possible as will be understood by those of ordinary skill in the art. Therefore, the particular order of the steps set forth in the specification should not be construed as limitations on the claims. Further, the claims directed to the method and/or process should not be limited to the performance of their steps in the order written, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the embodiments of the present application.
The embodiment of the application provides a method for discriminating cause-and-effect relationship of events, as shown in fig. 1, the method may include S101-S104:
S101, obtaining an original event corpus and preprocessing the event corpus; the preprocessing comprises the following steps: event extraction and event annotation.
In an exemplary embodiment of the present application, a data preparation work, i.e., data acquisition, may be performed first to acquire the original event corpus.
In an exemplary embodiment of the present application, obtaining the original event corpus may include: a crawler technology is utilized to crawl a large amount of event data from network resources.
In an exemplary embodiment of the present application, a large amount of news corpora, narrative articles and stories describing events, and the like may be crawled from network resources such as news websites using existing crawler technology and used as the event corpus.
In the exemplary embodiment of the present application, after the event corpus is obtained, event extraction, event tagging (which may also be referred to as corpus tagging) and other preprocessing may be further performed on the event corpus.
In an exemplary embodiment of the present application, when the preprocessing is event extraction, the preprocessing the event corpus may include:
and extracting a core predicate of the event from the sentence of the event corpus by adopting a preset event extraction tool, forming a binary group by the core predicate and a subject corresponding to the core predicate, and respectively representing an event by each binary group so as to realize event extraction.
In an exemplary embodiment of the present application, event-representation extraction may be performed on the obtained corpus data containing event relationships with a preset event extraction tool, for example an NLP (natural language processing) tool. Specifically, the core representation word of an event (typically a predicate) — such as eating, fighting, running, exploding, or dying — can be extracted from a sentence, and an event is represented by a two-tuple consisting of the core predicate and its subject. For example, the news item "an explosion event at the xx chemical plant in Suzhou" can be extracted into the two-tuple (explosion, chemical plant) to represent the event. Another news item, "energy stocks rose sharply on xx month xx day," can be extracted into the two-tuple (rising, energy stocks).
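The two-tuple extraction described above can be sketched in a few lines. This is a minimal illustration, not the patent's tool: it assumes a dependency parse is already available as (token, relation, head-index) triples, as a real NLP toolkit would produce; the helper name `extract_event` and the toy parse are illustrative.

```python
# A minimal sketch of (core predicate, subject) two-tuple extraction,
# assuming a dependency parse given as (token, deprel, head_index) triples
# with head_index == -1 marking the root. A real system would obtain the
# parse from an NLP toolkit.

def extract_event(parse):
    """Return the (core_predicate, subject) two-tuple for one sentence."""
    # The core predicate is the root of the dependency tree (typically a verb).
    root_idx = next(i for i, (_, rel, head) in enumerate(parse) if head == -1)
    predicate = parse[root_idx][0]
    # The subject is the token attached to the root by an nsubj relation.
    subject = next((tok for tok, rel, head in parse
                    if head == root_idx and rel == "nsubj"), None)
    return (predicate, subject)

# Toy parse of "chemical plant explodes": a root verb and its subject.
parse = [("chemical plant", "nsubj", 1), ("explodes", "root", -1)]
print(extract_event(parse))  # → ('explodes', 'chemical plant')
```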
In an exemplary embodiment of the present application, when the preprocessing is event tagging, the preprocessing the event corpus may include:
labeling every two two-tuples to indicate the causal relationship between the first event and the second event to which they correspond; the causal relationship comprises: the first event is the cause of the second event, the first event is the result of the second event, or there is no causal relationship between the first event and the second event.
In an exemplary embodiment of the present application, since a supervised machine-learning algorithm requires the data to be labeled in advance, at least part of the event extraction results may be labeled. For example, it is annotated whether event A (which may be the first event described above) is the cause of event B (which may be the second event described above), whether event B is the cause of event A, or whether there is no causal relationship between them. This can be expressed with one-hot coding: for example, [1, 0, 0] indicates that event A is the cause of event B, [0, 1, 0] indicates that event A is the result of event B, and [0, 0, 1] indicates that event A and event B have no causal relationship.
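The one-hot labeling scheme above can be sketched directly; the patent fixes only the three classes and their codes, so the label names here are illustrative.

```python
# A small sketch of the three-class one-hot labeling scheme. The label
# names are illustrative stand-ins for the three relations in the text.
import numpy as np

LABELS = ["A_causes_B", "A_result_of_B", "no_causal_relation"]

def one_hot(label):
    vec = np.zeros(len(LABELS))
    vec[LABELS.index(label)] = 1.0
    return vec

print(one_hot("A_causes_B"))          # [1. 0. 0.]
print(one_hot("no_causal_relation"))  # [0. 0. 1.]
```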
And S102, acquiring event vector representation according to the event subjected to event annotation.
In an exemplary embodiment of the present application, after the labeled two-tuples are obtained, vector-representation mapping of the events may be performed on the two-tuples.
In an exemplary embodiment of the present application, obtaining the event vector representation from the events after event annotation may include:

obtaining vector representations p_i and s_i of the two terms in each two-tuple by random vector initialization, where i is a natural number, i = 1, 2, 3, …, and m denotes the dimension of p_i and s_i;

splicing the vector representations p_i and s_i to obtain the event vector representation e_i = [p_i, s_i], where p_i is the sentence-vector representation corresponding to the core predicate and s_i is the sentence-vector representation corresponding to the subject.
In an exemplary embodiment of the present application, the terms of the two-tuple obtained above may be processed by random vector initialization to obtain the corresponding vectors p_i and s_i, and the two vectors may then be spliced directly (i.e., s_i is appended after the last column of p_i) to obtain the event vector representation e_i = [p_i, s_i].
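The splicing step above amounts to one concatenation; a minimal NumPy sketch, with the dimension m chosen arbitrarily for illustration:

```python
# The event vector e_i is the direct concatenation of the predicate vector
# p_i and the subject vector s_i, both obtained by random initialization.
import numpy as np

rng = np.random.default_rng(0)
m = 4                      # dimension of each representation (illustrative)
p_i = rng.normal(size=m)   # randomly initialized predicate vector
s_i = rng.normal(size=m)   # randomly initialized subject vector

e_i = np.concatenate([p_i, s_i])  # e_i = [p_i, s_i]
print(e_i.shape)  # (8,)
```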
In an exemplary embodiment of the present application, the method may further include:

dividing the sentences of the event corpus into single Chinese characters, and obtaining a character-vector representation w_j for each Chinese character by random vector initialization; and computing the subject's sentence-vector representation s̄_i from the character vectors w_j with a preset vector calculation.

The preset vector calculation includes: s̄_i = (1/n) · Σ_{j=1}^{n} w_j, where n is the number of character vectors in the sentence and j is a natural number, j = 1, 2, 3, …, n.
In an exemplary embodiment of the present application, the sentences of an event are split into single Chinese characters, whose vector representations are likewise obtained by random initialization, where w_j is the character vector corresponding to the j-th Chinese character; s_i = [w_1, w_2, …, w_n] denotes a sentence vector composed of one or more character vectors, and s̄_i is the average of the character vectors in s_i. The subject representation may be a word vector or a sentence vector.
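The character-level averaging described above can be sketched as follows; the lookup table and dimension are illustrative, and the example characters 化工厂 ("chemical plant") merely show a multi-character subject.

```python
# Split the subject into single characters, look up (here: randomly
# initialize) a vector w_j for each, and average them:
# s̄_i = (1/n) * sum_{j=1}^{n} w_j.
import numpy as np

rng = np.random.default_rng(1)
dim = 4  # character-vector dimension (illustrative)

def sentence_vector(chars, table):
    # Randomly initialize a vector for each unseen character, then average.
    for c in chars:
        table.setdefault(c, rng.normal(size=dim))
    vecs = np.stack([table[c] for c in chars])  # s_i = [w_1, ..., w_n]
    return vecs.mean(axis=0)                    # average over the n characters

table = {}
s_bar = sentence_vector(list("化工厂"), table)  # "chemical plant", 3 characters
print(s_bar.shape)  # (4,)
```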
S103, taking the event vector representation as input data and inputting it into a preset deep neural network model.
In an exemplary embodiment of the present application, after the event vector representations are obtained, causal-relationship calculation can be carried out on the resulting representations e_i1 and e_i2 by a pre-constructed model based on a supervised machine-learning algorithm (for example, the deep neural network model).
In an exemplary embodiment of the present application, as shown in fig. 2, the deep neural network model may sequentially include: an input layer, a hidden layer and an output layer;
the input layer may include at least a first input unit and a second input end element, the first input unit being configured to input an event vector representation e corresponding to a first event to be predictedi1 and corresponding statement vector representation of the first subjectThe second input unit is used for inputting an event vector representation e corresponding to a second event to be predictedi2 and corresponding statement vector representation of the second subject
wherein ,giFor the output of the hidden layer, WgFor preset hidden network parameter weights, bgFor a predetermined bias parameter, f represents a non-linear variation function,is represented by the event vector representation eiStatement vector representation corresponding to the subjectVector representation of the composition;
the output layer may include the following models: oi=f(Wo·gi+bo);
Wherein the output of the hidden layer is the input of the output layer, oiIs the output of the output layer, oiIs at least 3; wo、boThe network parameters of a preset output layer are set;
the output of the deep neural network model is oi1-oi2=[x1,x2,x3];
wherein ,oi1 is a first output result corresponding to the first event to be predicted, oi2 is a second output result corresponding to the second event to be predicted; x is the number of1、x2、x3Are numerical values respectively indicating causal relationships between the first event to be predicted and the second event to be predicted.
In an exemplary embodiment of the present application, the composite vector representations ê_i1 (composed of e_i1 and s̄_i1) and ê_i2 (composed of e_i2 and s̄_i2) serve as inputs to the input layer; the input layer can feed ê_i1 and ê_i2 directly into the hidden layer as its inputs; the output g_i of the hidden layer serves as the input of the output layer; and the outputs o_i1 and o_i2 of the output layer can be subtracted, with o_i1 − o_i2 taken as the output result of the deep neural network model.
In an exemplary embodiment of the present application, the dimension of o_i is preset to at least 3, so the output is o_i1 − o_i2 = [x_1, x_2, x_3], where x_1, x_2, and x_3 respectively indicate that the first event to be predicted is the cause of the second event to be predicted, that the second event to be predicted is the cause of the first event to be predicted, or that the two events have no causal relationship.
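The forward pass of the three-layer model above can be sketched end to end. This is a toy illustration: the weights are random stand-ins (a real system would learn W_g, b_g, W_o, b_o), the dimensions are arbitrary, and tanh is one possible choice for the nonlinear function f.

```python
# Each event's composite vector ê passes through one hidden layer, and the
# difference of the two output-layer activations is the network's raw output.
import numpy as np

rng = np.random.default_rng(2)
in_dim, hid_dim, out_dim = 12, 8, 3   # out_dim >= 3 as required

W_g = rng.normal(size=(hid_dim, in_dim)); b_g = np.zeros(hid_dim)
W_o = rng.normal(size=(out_dim, hid_dim)); b_o = np.zeros(out_dim)

f = np.tanh  # nonlinear activation function (illustrative choice)

def forward(e_hat):
    g = f(W_g @ e_hat + b_g)   # hidden layer: g_i = f(W_g · ê_i + b_g)
    return f(W_o @ g + b_o)    # output layer: o_i = f(W_o · g_i + b_o)

e1, e2 = rng.normal(size=in_dim), rng.normal(size=in_dim)
diff = forward(e1) - forward(e2)  # o_i1 − o_i2 = [x1, x2, x3]
print(diff.shape)  # (3,)
```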
S104, calculating the output result of the deep neural network model through a preset calculation function to obtain the prediction of the causal relationship among a plurality of events.
In an exemplary embodiment of the present application, the calculating the output result of the deep neural network model through a preset calculation function to obtain the prediction of the causal relationship between the events may include:
calculating, by means of the calculation function, the value with the highest occurrence probability in the output result of the deep neural network model, and taking the calculation result of the calculation function as the prediction result of the causal relationship among the plurality of events.
In an exemplary embodiment of the present application, the calculation function may include: softmax function.
In an exemplary embodiment of the present application, the output result may be further processed by a softmax function to obtain the final predicted value.
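This final step can be sketched as follows: apply softmax to the output [x_1, x_2, x_3] and take the class with the highest probability. The class names and example values are illustrative.

```python
# Softmax over the network output, then argmax for the predicted relation.
import numpy as np

def softmax(x):
    z = np.exp(x - x.max())   # subtract the max for numerical stability
    return z / z.sum()

CLASSES = ["event1_causes_event2", "event2_causes_event1", "no_causal_relation"]

x = np.array([2.0, 0.5, -1.0])   # illustrative network output [x1, x2, x3]
probs = softmax(x)
print(CLASSES[int(np.argmax(probs))])  # → event1_causes_event2
```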
In an exemplary embodiment, the method for determining causal relationships between events effectively computes the causal relationships among events and provides useful predictions for decision analysis in different fields. It also remedies shortcomings of prior methods, such as relying only on frequency information and having insufficient expressive power. The embodiments of the present application do not merely use the frequency of words in causal events: they further construct a deep neural network model and capture semantic information effectively with a word2vec-style method, which favors the classification of causal events and improves the accuracy of the classification results.
The embodiment of the present application further provides an event cause and effect relationship determination apparatus 1, as shown in fig. 3, which may include a processor 11 and a computer-readable storage medium 12, where the computer-readable storage medium 12 stores instructions, and when the instructions are executed by the processor 11, the method for determining an event cause and effect relationship as any one of the above is implemented.
In the exemplary embodiment of the present application, any of the above-described method embodiments is applicable to the apparatus embodiment, and is not described in detail herein.
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements any one of the above methods for determining cause and effect relationship of events.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, and functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media, as is known to those skilled in the art.
Claims (10)
1. A method for discriminating cause and effect relationship of an event, the method comprising:
acquiring an original event corpus and preprocessing the event corpus; the preprocessing comprises the following steps: event extraction and event labeling;
acquiring event vector representation according to the event subjected to event labeling;
taking the event vector representation as input data, and inputting the input data into a preset deep neural network model;
and calculating the output result of the deep neural network model through a preset calculation function so as to obtain the prediction of the causal relationship among a plurality of events.
2. The method for discriminating event cause-and-effect relationship according to claim 1, wherein when the preprocessing is event extraction, the preprocessing the event corpus comprises:
and extracting a core predicate of the event from the sentence of the event corpus by adopting a preset event extraction tool, forming a binary group by the core predicate and a subject corresponding to the core predicate, and respectively representing an event by each binary group so as to realize event extraction.
3. The method for discriminating event cause-and-effect relationship according to claim 2, wherein when the preprocessing is event labeling, the preprocessing the event corpus comprises:
labeling every two two-tuples to indicate the causal relationship between a first event and a second event corresponding to the two two-tuples; the causal relationship comprising: the first event is the cause of the second event, the first event is the result of the second event, or the first event has no causal relationship with the second event.
4. The method for discriminating event cause-and-effect relationship according to claim 3, wherein the obtaining the event vector representation according to the event labeled by the event labeling comprises:
obtaining vector representations p_i and s_i of the two terms in each two-tuple by random vector initialization, where i is a natural number, i = 1, 2, 3, …, and m denotes the dimension of p_i and s_i; and splicing p_i and s_i to obtain the event vector representation e_i = [p_i, s_i], where p_i is the sentence-vector representation corresponding to the core predicate and s_i is the sentence-vector representation corresponding to the subject.
5. The method for discriminating event cause and effect relationship according to claim 4, further comprising:
dividing the sentences of the event corpus into single Chinese characters, and obtaining a character-vector representation w_j for each Chinese character by random vector initialization; and computing the subject's sentence-vector representation s̄_i from the character vectors w_j with a preset vector calculation: s̄_i = (1/n) · Σ_{j=1}^{n} w_j, where n is the number of character vectors in the sentence and j is a natural number, j = 1, 2, 3, …, n.
6. The method for discriminating event cause and effect relationship according to claim 4, wherein the deep neural network model comprises in sequence: an input layer, a hidden layer and an output layer;
the input layer comprises at least a first input unit and a second input unit, wherein the first input unit is configured to input the event vector representation e_i1 corresponding to a first event to be predicted and the sentence-vector representation s̄_i1 of the corresponding first subject, and the second input unit is configured to input the event vector representation e_i2 corresponding to a second event to be predicted and the sentence-vector representation s̄_i2 of the corresponding second subject;

the hidden layer comprises the following model: g_i = f(W_g · ê_i + b_g);

wherein g_i is the output of the hidden layer, W_g is a preset hidden-layer network parameter weight, b_g is a preset bias parameter, f denotes a nonlinear activation function, and ê_i is the vector composed of the event vector representation e_i and the subject's sentence-vector representation s̄_i;

the output layer comprises the following model: o_i = f(W_o · g_i + b_o);

wherein the output of the hidden layer is the input of the output layer, o_i is the output of the output layer, the dimension of o_i is at least 3, and W_o, b_o are preset output-layer network parameters;

the output of the deep neural network model is o_i1 − o_i2 = [x_1, x_2, x_3];

wherein o_i1 is a first output result corresponding to the first event to be predicted, o_i2 is a second output result corresponding to the second event to be predicted, and x_1, x_2, x_3 are numerical values that respectively indicate the causal relationship between the first event to be predicted and the second event to be predicted.
7. The method for discriminating event causal relationship according to any one of claims 1 to 6, wherein the calculating the output result of the deep neural network model by a preset calculation function to obtain the prediction of the causal relationship between a plurality of events comprises:
calculating, by means of the calculation function, the value with the highest occurrence probability in the output result of the deep neural network model; and taking the calculation result of the calculation function as the prediction result of the causal relationship among the plurality of events.
8. The method for event causality discrimination according to claim 7, wherein the calculation function includes: softmax function.
9. An event causal relationship discriminating apparatus comprising a processor and a computer-readable storage medium having instructions stored thereon, wherein the instructions, when executed by the processor, implement the event causal relationship discriminating method according to any one of claims 1 to 8.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, carries out the event causal relationship discriminating method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010385693.6A CN111709225B (en) | 2020-05-09 | 2020-05-09 | Event causal relationship discriminating method, device and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111709225A true CN111709225A (en) | 2020-09-25 |
CN111709225B CN111709225B (en) | 2023-05-09 |
Family ID: 72536796
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010385693.6A Active CN111709225B (en) | 2020-05-09 | 2020-05-09 | Event causal relationship discriminating method, device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111709225B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116227598A (en) * | 2023-05-08 | 2023-06-06 | 山东财经大学 | Event prediction method, device and medium based on dual-stage attention mechanism |
WO2023217127A1 (en) * | 2022-05-13 | 2023-11-16 | 华为技术有限公司 | Causation determination method and related device |
US11922129B2 (en) | 2021-06-22 | 2024-03-05 | International Business Machines Corporation | Causal knowledge identification and extraction |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016206914A (en) * | 2015-04-22 | 2016-12-08 | 株式会社日立製作所 | Decision-making assistance system and decision-making assistance method |
CN110704890A (en) * | 2019-08-12 | 2020-01-17 | 上海大学 | Automatic text causal relationship extraction method fusing convolutional neural network and cyclic neural network |
CN110781369A (en) * | 2018-07-11 | 2020-02-11 | 天津大学 | Emotional cause mining method based on dependency syntax and generalized causal network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110377759B (en) | Method and device for constructing event relation graph | |
US11321671B2 (en) | Job skill taxonomy | |
CN112070138B (en) | Construction method of multi-label mixed classification model, news classification method and system | |
US10853697B2 (en) | System and method for monitoring online retail platform using artificial intelligence and fixing malfunction | |
CN112434535B (en) | Element extraction method, device, equipment and storage medium based on multiple models | |
CN111709225A (en) | Event cause and effect relationship judging method and device and computer readable storage medium | |
CN112711953A (en) | Text multi-label classification method and system based on attention mechanism and GCN | |
CN114896388A (en) | Hierarchical multi-label text classification method based on mixed attention | |
CN110347791B (en) | Topic recommendation method based on multi-label classification convolutional neural network | |
CN111339260A (en) | BERT and QA thought-based fine-grained emotion analysis method | |
CN113392209A (en) | Text clustering method based on artificial intelligence, related equipment and storage medium | |
Sharp et al. | Toward Semi-autonomous Information Extraction for Unstructured Maintenance Data in Root Cause Analysis | |
CN112685539A (en) | Text classification model training method and device based on multi-task fusion | |
CN111984792A (en) | Website classification method and device, computer equipment and storage medium | |
Jagdish et al. | Identification of end-user economical relationship graph using lightweight blockchain-based BERT model | |
CN114691525A (en) | Test case selection method and device | |
CN111930944B (en) | File label classification method and device | |
CN109543038B (en) | Emotion analysis method applied to text data | |
CN107729509B (en) | Discourse similarity determination method based on recessive high-dimensional distributed feature representation | |
CN112948561B (en) | Method and device for automatically expanding question-answer knowledge base | |
CN114881173A (en) | Resume classification method and device based on self-attention mechanism | |
CN112035607B (en) | Method, device and storage medium for matching citation difference based on MG-LSTM | |
CN115269833A (en) | Event information extraction method and system based on deep semantics and multitask learning | |
CN114969253A (en) | Market subject and policy matching method and device, computing device and medium | |
Roelands et al. | Classifying businesses by economic activity using web-based text mining | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||