CN111709225A - Event cause and effect relationship judging method and device and computer readable storage medium


Info

Publication number
CN111709225A
Authority
CN
China
Prior art keywords
event
vector representation
vector
cause
output
Prior art date
Legal status
Granted
Application number
CN202010385693.6A
Other languages
Chinese (zh)
Other versions
CN111709225B (en)
Inventor
袁杰
于皓
张�杰
陈秀坤
高古明
Current Assignee
Beijing Mininglamp Software System Co ltd
Original Assignee
Beijing Mininglamp Software System Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Mininglamp Software System Co ltd
Priority to CN202010385693.6A
Publication of CN111709225A
Application granted
Publication of CN111709225B
Active legal status (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/205 Parsing
    • G06F 40/216 Parsing using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/35 Clustering; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks

Abstract

The embodiments of the present application disclose an event causal relationship determination method, an apparatus, and a computer-readable storage medium. The method includes: acquiring an original event corpus and preprocessing the event corpus, where the preprocessing may include event extraction and event labeling; acquiring an event vector representation according to the events after event labeling; taking the event vector representation as input data and feeding it into a preset deep neural network model; and calculating the output result of the deep neural network model with a preset calculation function to obtain a prediction of the causal relationship among a plurality of events. The scheme of the embodiments facilitates the classification of causal events and improves the accuracy of the classification results.

Description

Event cause and effect relationship judging method and device and computer readable storage medium
Technical Field
The present disclosure relates to information processing technologies, and more particularly, to a method, an apparatus, and a computer-readable storage medium for event cause and effect relationship determination.
Background
With the rapid development of artificial intelligence, the field of natural language processing has made considerable progress and advanced results continue to emerge; nevertheless, using natural language processing to judge the causal relationships between events, and thereby model how events influence one another, still needs improvement. Determining causal relationships between events can be applied in many fields. In finance, for example, the causal transmission between events can be used to evaluate how a financial risk event affects the business condition of downstream companies, stock fluctuations, and so on, providing a valuable reference analysis for the future development of companies, effectively assisting decision-making, and reducing financial operating risk.
At present, event causal relationships are extracted from text information mainly by computing word frequencies and constructing feature vectors from word-frequency information such as TF-IDF (term frequency-inverse document frequency), and then performing classification and identification.
The prior-art method uses only the frequency information of the words that co-occur within events as causal information; representing the mutual sufficiency and necessity relations between words with frequency alone is one-dimensional and is strongly affected by the distribution of words inside the text data. Meanwhile, some words in a sentence contribute little to the causal calculation and effectively constitute noise, so using every word in the sentence introduces redundant information and may even reduce classification accuracy.
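For concreteness, the conventional TF-IDF feature approach referred to above can be sketched as follows; the toy corpus, the labels, and the choice of classifier are illustrative assumptions rather than part of any particular prior system.

```python
# Minimal sketch of the conventional TF-IDF feature approach; the toy corpus,
# labels, and logistic-regression classifier are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

corpus = [
    "explosion at chemical plant, downstream suppliers halt production",
    "energy stocks rose sharply after the holiday",
]
labels = [1, 0]  # toy labels, e.g. 1 = causal pair present, 0 = not

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(corpus)       # word-frequency based feature vectors
clf = LogisticRegression().fit(X, labels)  # classification over TF-IDF features
print(clf.predict(vectorizer.transform(["explosion halts downstream production"])))
```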
Disclosure of Invention
The embodiments of the present application provide an event causal relationship determination method and apparatus, and a computer-readable storage medium, which help classify causal events and improve the accuracy of the classification results.
The embodiment of the application provides a method for judging a causal relationship of events, which can comprise the following steps:
acquiring an original event corpus and preprocessing the event corpus; the preprocessing comprises: event extraction and event labeling;
acquiring event vector representation according to the event subjected to event labeling;
taking the event vector representation as input data, and inputting the input data into a preset deep neural network model;
and calculating the output result of the deep neural network model through a preset calculation function so as to obtain the prediction of the causal relationship among a plurality of events.
The embodiment of the present application further provides an event cause and effect relationship determination device, which may include a processor and a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed by the processor, the method for determining event cause and effect relationship according to any one of the above-mentioned items is implemented.
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements any one of the above methods for determining cause and effect relationship of events.
Compared with the related art, the embodiments acquire an original event corpus and preprocess the event corpus, where the preprocessing may include event extraction and event labeling; acquire an event vector representation according to the events after event labeling; take the event vector representation as input data and feed it into a preset deep neural network model; and calculate the output result of the deep neural network model with a preset calculation function to obtain a prediction of the causal relationship among a plurality of events. This scheme facilitates the classification of causal events and improves the accuracy of the classification results.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. Other advantages of the present application may be realized and attained by the instrumentalities and combinations particularly pointed out in the specification and the drawings.
Drawings
The accompanying drawings are included to provide an understanding of the present disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the examples serve to explain the principles of the disclosure and not to limit the disclosure.
FIG. 1 is a flowchart illustrating a method for determining cause and effect relationship of events according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a deep neural network model according to an embodiment of the present application;
fig. 3 is a block diagram illustrating an event cause and effect determination apparatus according to an embodiment of the present application.
Detailed Description
The present application describes embodiments, but the description is illustrative rather than limiting and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the embodiments described herein. Although many possible combinations of features are shown in the drawings and discussed in the detailed description, many other combinations of the disclosed features are possible. Any feature or element of any embodiment may be used in combination with or instead of any other feature or element in any other embodiment, unless expressly limited otherwise.
The present application includes and contemplates combinations of features and elements known to those of ordinary skill in the art. The embodiments, features and elements disclosed in this application may also be combined with any conventional features or elements to form a unique inventive concept as defined by the claims. Any feature or element of any embodiment may also be combined with features or elements from other inventive aspects to form yet another unique inventive aspect, as defined by the claims. Thus, it should be understood that any of the features shown and/or discussed in this application may be implemented alone or in any suitable combination. Accordingly, the embodiments are not limited except as by the appended claims and their equivalents. Furthermore, various modifications and changes may be made within the scope of the appended claims.
Further, in describing representative embodiments, the specification may have presented the method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. Other orders of steps are possible as will be understood by those of ordinary skill in the art. Therefore, the particular order of the steps set forth in the specification should not be construed as limitations on the claims. Further, the claims directed to the method and/or process should not be limited to the performance of their steps in the order written, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the embodiments of the present application.
The embodiment of the application provides a method for discriminating cause-and-effect relationship of events, as shown in fig. 1, the method may include S101-S104:
S101, acquiring an original event corpus and preprocessing the event corpus; the preprocessing includes: event extraction and event labeling.
In an exemplary embodiment of the present application, data preparation, i.e., data acquisition, may be performed first to acquire the original event corpus.
In an exemplary embodiment of the present application, obtaining the original event corpus may include: crawling a large amount of event data from network resources using crawler technology.
In an exemplary embodiment of the present application, a large amount of news corpora, narrative articles describing events, and the like may be crawled from network resources such as news websites using existing crawler technology and used as the event corpus.
In the exemplary embodiment of the present application, after the event corpus is obtained, event extraction, event tagging (which may also be referred to as corpus tagging) and other preprocessing may be further performed on the event corpus.
In an exemplary embodiment of the present application, when the preprocessing is event extraction, preprocessing the event corpus may include:
extracting the core predicate of an event from a sentence of the event corpus with a preset event extraction tool, forming a binary tuple from the core predicate and the subject corresponding to the core predicate, and representing each event by one such tuple, so as to realize event extraction.
In an exemplary embodiment of the present application, event representations may be extracted from the obtained corpus data containing event relationships by a preset event extraction tool, for example an NLP (natural language processing) tool. Specifically, the core representative word of an event, typically a predicate such as eating, fighting, running, exploding, or dying, can be extracted from a sentence, and the event is then represented as a binary tuple consisting of the core predicate and the subject of that predicate. For example, the news item "explosion at a Suzhou xx chemical plant" can be extracted as the tuple (explosion, chemical plant) to represent the event, and the news item "energy stocks rose sharply on xx month xx day" can be extracted as the tuple (rising, energy stocks).
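As an illustrative sketch only (not part of the original disclosure), such (core predicate, subject) tuples can be approximated with a dependency parser; the spaCy Chinese pipeline and the subject-of-verb rule below are assumptions standing in for the preset event extraction tool.

```python
# Minimal sketch of predicate-subject tuple extraction; the spaCy Chinese
# pipeline and the nsubj-based rule are illustrative assumptions, not the
# patent's preset event extraction tool.
import spacy
from typing import List, Tuple

nlp = spacy.load("zh_core_web_sm")  # assumed Chinese pipeline with a dependency parser

def extract_events(sentence: str) -> List[Tuple[str, str]]:
    """Return (core_predicate, subject) tuples found in one sentence."""
    doc = nlp(sentence)
    events = []
    for token in doc:
        # Treat a verb with a nominal-subject child as a core predicate.
        if token.pos_ == "VERB":
            subjects = [c for c in token.children if c.dep_.startswith("nsubj")]
            for subj in subjects:
                events.append((token.text, subj.text))
    return events

print(extract_events("化工厂发生爆炸"))  # expected to yield something like [("发生", "化工厂")], depending on the parser
```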
In an exemplary embodiment of the present application, when the preprocessing is event labeling, preprocessing the event corpus may include:
performing event labeling on each pair of binary tuples to indicate the causal relationship between the first event and the second event corresponding to the two tuples; the causal relationship comprises: the first event is the cause of the second event, the first event is the result of the second event, or the first event and the second event are non-causal.
In an exemplary embodiment of the present application, since a supervised machine learning algorithm requires labeled data, at least part of the event extraction results may be labeled. For example, it is noted whether event A (which may be the first event described above) is the cause of event B (which may be the second event described above), whether event B is the cause of event A, or whether the two have no causal relationship. This can be expressed with one-hot coding, for example using [1,0,0] to indicate that event A is the cause of event B, [0,1,0] to indicate that event A is the result of event B, and [0,0,1] to indicate that event A and event B have no causal relationship.
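A minimal sketch of this one-hot labeling scheme follows; the label names and dictionary layout are illustrative assumptions, not part of the original disclosure.

```python
# Minimal sketch of the one-hot labeling scheme described above; the helper
# name and record layout are illustrative assumptions.
LABELS = {
    "A_causes_B": [1, 0, 0],     # event A is the cause of event B
    "A_caused_by_B": [0, 1, 0],  # event A is the result of event B
    "no_causality": [0, 0, 1],   # A and B have no causal relationship
}

def label_pair(event_a, event_b, relation: str):
    """Attach a one-hot causal label to a pair of (predicate, subject) tuples."""
    return {"event_a": event_a, "event_b": event_b, "label": LABELS[relation]}

sample = label_pair(("explosion", "chemical plant"), ("falling", "energy stocks"), "A_causes_B")
print(sample["label"])  # [1, 0, 0]
```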
S102, acquiring an event vector representation according to the events after event labeling.
In an exemplary embodiment of the present application, after the labeled binary tuples are obtained, the tuples may be mapped to event vector representations.
In an exemplary embodiment of the present application, acquiring an event vector representation according to the events after event labeling may include:
obtaining, by randomly initializing vectors, the vector representations p_i and s̄_i of the two terms in each binary tuple, where i is a natural number, i = 1, 2, 3, …, m, and m denotes the dimension of the vector representations p_i and s̄_i;
concatenating the vectors p_i and s̄_i to obtain the event vector representation e_i = [p_i, s̄_i];
wherein p_i is the statement vector representation corresponding to the core predicate, and s̄_i is the statement vector representation corresponding to the subject.
In an exemplary embodiment of the present application, the words of the binary tuples obtained above may be processed by randomly initializing vectors to obtain the corresponding word vectors p_i and s̄_i, and the two vectors may then be spliced directly (that is, the columns of s̄_i are arranged in sequence after the last column of p_i) to obtain the event vector representation e_i = [p_i, s̄_i].
In an exemplary embodiment of the present application, the method may further include:
dividing the sentences of the event corpus into single Chinese characters, and acquiring a character vector representation w_j corresponding to each Chinese character by random vector initialization; and calculating, by a preset vector calculation on the character vectors w_j, the statement vector representation s̄_i corresponding to the subject;
wherein the preset vector calculation includes:
s̄_i = (1/n) Σ_{j=1}^{n} w_j
where n denotes the dimension of the sentence vector representation, that is, the number of character vectors, j is a natural number, and j = 1, 2, 3, …, n.
In an exemplary embodiment of the present application, the sentences of the event are segmented into individual Chinese characters whose vector representations are likewise obtained by randomly initializing vectors, giving the sentence vector s_i = [w_1, w_2, …, w_n], where w_j is the character vector corresponding to each Chinese character and s_i is a sentence vector consisting of one or more character vectors; s̄_i is the average value of the sentence vector s_i, and may correspond to a single word vector or to a sentence vector.
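The following sketch illustrates the vector construction described above, assuming a 128-dimensional random initialization and character-averaged vectors for both the predicate and the subject; the helper names and the averaging of the predicate characters are assumptions, not part of the original disclosure.

```python
# Minimal sketch of the vector construction described above; the embedding
# dimension and helper names are illustrative assumptions.
import numpy as np

DIM = 128          # assumed embedding dimension
rng = np.random.default_rng(0)
char_vectors = {}  # randomly initialized character vectors, reused across sentences

def char_vec(ch: str) -> np.ndarray:
    """Return the randomly initialized vector w_j for one Chinese character."""
    if ch not in char_vectors:
        char_vectors[ch] = rng.normal(size=DIM)
    return char_vectors[ch]

def subject_vec(subject: str) -> np.ndarray:
    """s̄_i: average of the character vectors of the subject string."""
    return np.mean([char_vec(c) for c in subject], axis=0)

def event_vec(predicate: str, subject: str) -> np.ndarray:
    """e_i = [p_i, s̄_i]: predicate vector concatenated with the subject vector."""
    # The predicate vector is averaged over its characters here for simplicity;
    # the patent initializes p_i directly, so this is an assumption.
    p_i = np.mean([char_vec(c) for c in predicate], axis=0)
    return np.concatenate([p_i, subject_vec(subject)])

e = event_vec("爆炸", "化工厂")
print(e.shape)  # (256,) under the assumed dimension
```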
S103, taking the event vector representation as input data and feeding it into a preset deep neural network model.
In an exemplary embodiment of the present application, after the event vector representation e_i is obtained, causal relationship calculation can be carried out, based on the resulting event vector representation e_i and the corresponding subject statement vector representation s̄_i, with a pre-constructed model (such as the deep neural network model) based on a supervised machine learning algorithm.
In an exemplary embodiment of the present application, as shown in fig. 2, the deep neural network model may sequentially include: an input layer, a hidden layer and an output layer;
the input layer may include at least a first input unit and a second input unit, the first input unit being configured to input the event vector representation e_i1 corresponding to a first event to be predicted and the corresponding statement vector representation s̄_i1 of the first subject, and the second input unit being configured to input the event vector representation e_i2 corresponding to a second event to be predicted and the corresponding statement vector representation s̄_i2 of the second subject;
the hidden layer may include the following model:
g_i = f(W_g · [e_i, s̄_i] + b_g)
wherein g_i is the output of the hidden layer, W_g is a preset hidden-layer network parameter weight, b_g is a predetermined bias parameter, f denotes a non-linear transformation function, and [e_i, s̄_i] is the vector representation composed of the event vector representation e_i and the statement vector representation s̄_i corresponding to the subject;
the output layer may include the following model: o_i = f(W_o · g_i + b_o);
wherein the output of the hidden layer is the input of the output layer, o_i is the output of the output layer, the dimension of o_i is at least 3, and W_o, b_o are preset output-layer network parameters;
the output of the deep neural network model is o_i1 - o_i2 = [x_1, x_2, x_3];
wherein o_i1 is the first output result corresponding to the first event to be predicted, o_i2 is the second output result corresponding to the second event to be predicted, and x_1, x_2, x_3 are numerical values respectively indicating the causal relationships between the first event to be predicted and the second event to be predicted.
In an exemplary embodiment of the present application, the vector representations e_i and s̄_i may serve as the input of the input layer, and the vector representations may include at least two groups, such as the vector representations e_i1 and s̄_i1 and the vector representations e_i2 and s̄_i2. The input layer may pass the vector representations e_i1 and s̄_i1 and the vector representations e_i2 and s̄_i2 directly to the hidden layer as its input; the output g_i of the hidden layer may serve as the input of the output layer; and the outputs o_i of the output layer may be used to compute o_i1 - o_i2, which is output as the result of the deep neural network model.
In an exemplary embodiment of the present application, the dimension of o_i is preset to be at least 3, so the output is o_i1 - o_i2 = [x_1, x_2, x_3], where x_1, x_2 and x_3 may respectively indicate that the first event to be predicted is the cause of the second event to be predicted, that the second event to be predicted is the cause of the first event to be predicted, or that the first event to be predicted and the second event to be predicted have no causal relationship.
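A minimal sketch of this two-branch forward computation follows; the layer sizes, the use of tanh as the non-linear function f, the shared weights between the two branches, and PyTorch itself are assumptions made for illustration rather than details fixed by the disclosure.

```python
# Minimal sketch of the two-branch forward pass sketched above; layer sizes,
# tanh as the non-linear function f, shared branch weights, and PyTorch are assumptions.
import torch
import torch.nn as nn

class CausalNet(nn.Module):
    def __init__(self, event_dim: int, subj_dim: int, hidden_dim: int = 64, out_dim: int = 3):
        super().__init__()
        self.hidden = nn.Linear(event_dim + subj_dim, hidden_dim)  # W_g, b_g
        self.output = nn.Linear(hidden_dim, out_dim)               # W_o, b_o
        self.f = torch.tanh                                        # non-linear function f

    def branch(self, e_i: torch.Tensor, s_i: torch.Tensor) -> torch.Tensor:
        g_i = self.f(self.hidden(torch.cat([e_i, s_i], dim=-1)))   # g_i = f(W_g·[e_i, s̄_i] + b_g)
        return self.f(self.output(g_i))                            # o_i = f(W_o·g_i + b_o)

    def forward(self, e1, s1, e2, s2) -> torch.Tensor:
        # Model output: o_i1 - o_i2 = [x_1, x_2, x_3]
        return self.branch(e1, s1) - self.branch(e2, s2)

model = CausalNet(event_dim=256, subj_dim=128)
e1, s1, e2, s2 = (torch.randn(1, 256), torch.randn(1, 128),
                  torch.randn(1, 256), torch.randn(1, 128))
logits = model(e1, s1, e2, s2)
print(logits.shape)  # torch.Size([1, 3])
```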
S104, calculating the output result of the deep neural network model through a preset calculation function to obtain the prediction of the causal relationship among a plurality of events.
In an exemplary embodiment of the present application, the calculating the output result of the deep neural network model through a preset calculation function to obtain the prediction of the causal relationship between the events may include:
calculating a numerical value with the maximum occurrence probability in an output result of the deep neural network model through the calculation function; and using the calculation result of the calculation function as a prediction result of causal relation among a plurality of events.
In an exemplary embodiment of the present application, the calculation function may include: softmax function.
In an exemplary embodiment of the present application, the output result may be further processed by a softmax function to obtain the final predicted value.
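For illustration, this final prediction step can be sketched as follows, assuming the three output values map to the three causal classes described above; the class names and their order are illustrative assumptions.

```python
# Minimal sketch of turning the model output [x_1, x_2, x_3] into a causal
# prediction with softmax; the class names are illustrative assumptions.
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    z = np.exp(x - np.max(x))
    return z / z.sum()

CLASSES = ["A causes B", "A is caused by B", "no causal relationship"]

output = np.array([2.1, 0.3, -0.5])        # example output o_i1 - o_i2
probs = softmax(output)
prediction = CLASSES[int(np.argmax(probs))]
print(probs.round(3), prediction)           # the highest-probability class is the prediction
```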
In an exemplary embodiment of the present application, a method for determining and classifying causal relationships between events is provided, so that causal relationships among events are calculated effectively, providing useful prediction and support for decision analysis in different fields. The method can also make up for the shortcomings of existing methods, which use only frequency information and have insufficient expressive power. The embodiments of the present application do not rely only on the frequency information of words in causal events: a deep neural network model is constructed and semantic information is captured effectively using a word2vec-style vector representation, which is more conducive to classifying causal events and improves the accuracy of the classification results.
The embodiment of the present application further provides an event cause and effect relationship determination apparatus 1, as shown in fig. 3, which may include a processor 11 and a computer-readable storage medium 12, where the computer-readable storage medium 12 stores instructions, and when the instructions are executed by the processor 11, any one of the event cause and effect relationship determination methods described above is implemented.
In the exemplary embodiment of the present application, any of the above-described method embodiments is applicable to the apparatus embodiment, and is not described in detail herein.
The embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements any one of the above methods for determining cause and effect relationship of events.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.

Claims (10)

1. A method for discriminating cause and effect relationship of an event, the method comprising:
acquiring an original event corpus and preprocessing the event corpus; the preprocessing comprises: event extraction and event labeling;
acquiring event vector representation according to the event subjected to event labeling;
taking the event vector representation as input data, and inputting the input data into a preset deep neural network model;
and calculating the output result of the deep neural network model through a preset calculation function so as to obtain the prediction of the causal relationship among a plurality of events.
2. The method for discriminating event cause-and-effect relationship according to claim 1, wherein, when the preprocessing is event extraction, preprocessing the event corpus comprises:
extracting the core predicate of an event from a sentence of the event corpus with a preset event extraction tool, forming a binary tuple from the core predicate and the subject corresponding to the core predicate, and representing each event by one such tuple, so as to realize event extraction.
3. The method for discriminating event cause-and-effect relationship according to claim 2, wherein, when the preprocessing is event labeling, preprocessing the event corpus comprises:
performing event labeling on each pair of binary tuples to indicate the causal relationship between a first event and a second event corresponding to the two tuples; the causal relationship comprises: the first event is the cause of the second event, the first event is the result of the second event, or the first event and the second event are non-causal.
4. The method for discriminating event cause-and-effect relationship according to claim 3, wherein the acquiring the event vector representation according to the events after event labeling comprises:
obtaining, by randomly initializing vectors, the vector representations p_i and s̄_i of the two terms in each binary tuple, where i is a natural number, i = 1, 2, 3, …, m, and m denotes the dimension of the vector representations p_i and s̄_i;
concatenating the vectors p_i and s̄_i to obtain the event vector representation e_i = [p_i, s̄_i];
wherein p_i is the statement vector representation corresponding to the core predicate, and s̄_i is the statement vector representation corresponding to the subject.
5. The method for discriminating event cause and effect relationship according to claim 4, further comprising:
dividing the sentences of the event corpus into single Chinese characters, and acquiring a character vector representation w_j corresponding to each Chinese character by random vector initialization; and calculating, by a preset vector calculation on the character vectors w_j, the statement vector representation s̄_i corresponding to the subject;
wherein the preset vector calculation comprises:
s̄_i = (1/n) Σ_{j=1}^{n} w_j
where n denotes the dimension of the sentence vector representation, j is a natural number, and j = 1, 2, 3, …, n.
6. The method for discriminating event cause and effect relationship according to claim 4, wherein the deep neural network model comprises, in sequence: an input layer, a hidden layer and an output layer;
the input layer comprises at least a first input unit and a second input unit, wherein the first input unit is used for inputting the event vector representation e_i1 corresponding to a first event to be predicted and the corresponding statement vector representation s̄_i1 of the first subject, and the second input unit is used for inputting the event vector representation e_i2 corresponding to a second event to be predicted and the corresponding statement vector representation s̄_i2 of the second subject;
the hidden layer comprises the following model:
g_i = f(W_g · [e_i, s̄_i] + b_g)
wherein g_i is the output of the hidden layer, W_g is a preset hidden-layer network parameter weight, b_g is a predetermined bias parameter, f denotes a non-linear transformation function, and [e_i, s̄_i] is the vector representation composed of the event vector representation e_i and the statement vector representation s̄_i corresponding to the subject;
the output layer comprises the following model: o_i = f(W_o · g_i + b_o);
wherein the output of the hidden layer is the input of the output layer, o_i is the output of the output layer, the dimension of o_i is at least 3, and W_o, b_o are preset output-layer network parameters;
the output of the deep neural network model is o_i1 - o_i2 = [x_1, x_2, x_3];
wherein o_i1 is the first output result corresponding to the first event to be predicted, o_i2 is the second output result corresponding to the second event to be predicted, and x_1, x_2, x_3 are numerical values respectively indicating the causal relationships between the first event to be predicted and the second event to be predicted.
7. The method for discriminating event causal relationship according to any one of claims 1 to 6, wherein the calculating the output result of the deep neural network model by a preset calculation function to obtain the prediction of the causal relationship between a plurality of events comprises:
calculating a numerical value with the maximum occurrence probability in an output result of the deep neural network model through the calculation function; and using the calculation result of the calculation function as a prediction result of causal relation among a plurality of events.
8. The method for event causality discrimination according to claim 7, wherein the calculation function includes: softmax function.
9. An event cause and effect discrimination apparatus comprising a processor and a computer readable storage medium having instructions stored thereon, wherein the instructions, when executed by the processor, implement the event cause and effect discrimination method according to any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method for event cause and effect discrimination according to any one of claims 1 to 8.
CN202010385693.6A 2020-05-09 2020-05-09 Event causal relationship discriminating method, device and computer readable storage medium Active CN111709225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010385693.6A CN111709225B (en) 2020-05-09 2020-05-09 Event causal relationship discriminating method, device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010385693.6A CN111709225B (en) 2020-05-09 2020-05-09 Event causal relationship discriminating method, device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111709225A true CN111709225A (en) 2020-09-25
CN111709225B CN111709225B (en) 2023-05-09

Family

ID=72536796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010385693.6A Active CN111709225B (en) 2020-05-09 2020-05-09 Event causal relationship discriminating method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111709225B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016206914A (en) * 2015-04-22 2016-12-08 株式会社日立製作所 Decision-making assistance system and decision-making assistance method
CN110781369A (en) * 2018-07-11 2020-02-11 天津大学 Emotional cause mining method based on dependency syntax and generalized causal network
CN110704890A (en) * 2019-08-12 2020-01-17 上海大学 Automatic text causal relationship extraction method fusing convolutional neural network and cyclic neural network

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11922129B2 (en) 2021-06-22 2024-03-05 International Business Machines Corporation Causal knowledge identification and extraction
WO2023217127A1 (en) * 2022-05-13 2023-11-16 华为技术有限公司 Causation determination method and related device
CN116227598A (en) * 2023-05-08 2023-06-06 山东财经大学 Event prediction method, device and medium based on dual-stage attention mechanism

Also Published As

Publication number Publication date
CN111709225B (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN110377759B (en) Method and device for constructing event relation graph
US11321671B2 (en) Job skill taxonomy
CN112070138B (en) Construction method of multi-label mixed classification model, news classification method and system
US10853697B2 (en) System and method for monitoring online retail platform using artificial intelligence and fixing malfunction
CN112434535B (en) Element extraction method, device, equipment and storage medium based on multiple models
CN111709225A (en) Event cause and effect relationship judging method and device and computer readable storage medium
CN112711953A (en) Text multi-label classification method and system based on attention mechanism and GCN
CN114896388A (en) Hierarchical multi-label text classification method based on mixed attention
CN110347791B (en) Topic recommendation method based on multi-label classification convolutional neural network
CN111339260A (en) BERT and QA thought-based fine-grained emotion analysis method
CN113392209A (en) Text clustering method based on artificial intelligence, related equipment and storage medium
Sharp et al. Toward Semi-autonomous Information: Extraction for Unstructured Maintenance Data in Root Cause Analysis
CN112685539A (en) Text classification model training method and device based on multi-task fusion
CN111984792A (en) Website classification method and device, computer equipment and storage medium
Jagdish et al. Identification of end-user economical relationship graph using lightweight blockchain-based BERT model
CN114691525A (en) Test case selection method and device
CN111930944B (en) File label classification method and device
CN109543038B (en) Emotion analysis method applied to text data
CN107729509B (en) Discourse similarity determination method based on recessive high-dimensional distributed feature representation
CN112948561B (en) Method and device for automatically expanding question-answer knowledge base
CN114881173A (en) Resume classification method and device based on self-attention mechanism
CN112035607B (en) Method, device and storage medium for matching citation difference based on MG-LSTM
CN115269833A (en) Event information extraction method and system based on deep semantics and multitask learning
CN114969253A (en) Market subject and policy matching method and device, computing device and medium
Roelands et al. Classifying businesses by economic activity using web-based text mining

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant