CN111353306A - Entity relationship and dependency Tree-LSTM-based combined event extraction method - Google Patents

Entity relationship and dependency Tree-LSTM-based combined event extraction method

Info

Publication number: CN111353306A
Application number: CN202010109601.1A
Authority: CN (China)
Prior art keywords: vector, sentence, entity, event, word
Other languages: Chinese (zh)
Other versions: CN111353306B
Inventors: 张旻 (Zhang Min), 曹祥彪 (Cao Xiangbiao), 汤景凡 (Tang Jingfan), 姜明 (Jiang Ming), 李鹏飞 (Li Pengfei)
Current and original assignee: Hangzhou Dianzi University
Application filed by Hangzhou Dianzi University, priority to CN202010109601.1A
Legal status: Granted; currently Active

Classifications

    • G06F16/951: Information retrieval; retrieval from the web; indexing and web-crawling techniques
    • G06F18/24: Pattern recognition; analysing; classification techniques
    • G06N3/044: Neural networks; architecture; recurrent networks, e.g. Hopfield networks
    • G06N3/045: Neural networks; architecture; combinations of networks
    • G06N3/08: Neural networks; learning methods


Abstract

The invention discloses a combined event extraction method based on entity relationships and a dependency Tree-LSTM. The method comprises the following steps: 1. Encode the original text and the text annotation information. 2. Input the result of step 1 into a bidirectional LSTM to obtain forward and backward hidden state vectors with timing information. 3. Parse the input sentence into a dependency tree structure, then input the result of step 1 into the dependency Tree-LSTM constructed on that structure to obtain the hidden state vector of the tree root node and the hidden state vector at each time step. 4. Obtain and save the entity relation sentence feature vector; at the same time, connect the forward and backward hidden state vectors of the bidirectional LSTM at time t with the hidden state vector of the dependency Tree-LSTM at time t into a new hidden state vector $H_t$. 5. Perform trigger word recognition and classification. 6. Perform event argument recognition and classification.

Description

Entity relationship and dependency Tree-LSTM-based combined event extraction method
Technical Field
The invention relates to an event extraction method, in particular to a combined event extraction method based on entity relationships and a dependency Tree-LSTM, and belongs to the field of natural language processing.
Background
Event Extraction (EE) is an important component of the Information Extraction (IE) task. Event extraction comprises two subtasks: trigger word recognition and classification (Event Detection, ED) and event argument recognition and classification (Argument Identification, AI). The ED task finds the words that trigger events in the text and correctly judges the event type of each trigger word. The AI task first determines whether a sentence is an event sentence (i.e., contains a trigger word), then judges whether each entity mention appearing in the sentence is an event argument, and assigns each entity mention its correct event argument role. With the emergence of massive text information and the rapid development of deep learning, event extraction has become a hot research topic. Event extraction techniques have been applied to news classification, social public opinion management, and other areas.
Disclosure of Invention
The invention provides a combined event extraction method based on entity relationships and a dependency Tree-LSTM, mainly addressing two problems: the dependency path between an event trigger word and an event argument is often too long, and the output features of existing models lack entity relationship information.
The method for extracting the combined event based on the entity relationship and the dependency Tree-LSTM comprises the following steps:
step 1, encoding an original text and text labeling information;
step 2, inputting the result of step 1 into a bidirectional LSTM, and obtaining the forward hidden state vector with timing information $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$;
Step 3, firstly parsing the input sentence into a dependency tree structure with the Stanford CoreNLP tool, then inputting the encoding result of step 1 into the dependency Tree-LSTM constructed from the dependency tree structure, and obtaining the tree root node hidden state vector $h_{root}^{tree}$ and the hidden state vector $h_t^{tree}$ at each time t;
Step 4, connecting the entity relation vector $R_k$ with the tree root node hidden state vector $h_{root}^{tree}$, obtaining and saving the entity relation sentence vector $F = [h_{root}^{tree}, R_k]$; simultaneously connecting the forward hidden state vector $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$ of the bidirectional LSTM at time t with the hidden state vector $h_t^{tree}$ of the dependency Tree-LSTM at time t, solving the new hidden state vector $H_t$, which thereby both preserves the information of the child nodes and captures local context information with timing information;
step 5, connecting the hidden state vector $H_t$ at time t from step 4 with the sentence vector $F$, and performing trigger word recognition and classification;
step 6, sequentially connecting the hidden state vector $H_t$ of the t-th word identified as a trigger word in step 5, the hidden state vector $H_{head_i}$ of the i-th event argument candidate (i.e., the i-th entity mention), the sentence vector $F$ containing the entity relationship, and the entity relation argument role vector $r_{ij}$ of the i-th event argument candidate in the entity relation vector $R_k$, and performing event argument recognition and classification.
further, the step 1 is specifically realized as follows:
1-1, acquiring the unprocessed original text and the text annotation information from the source files; the annotation information comprises entity mentions, entity types, event trigger words, event arguments, event argument roles, entity relations, and entity relation argument roles, with 7 entity types, 39 event trigger word types, 20 entity relation types, and 16 entity relation argument roles. Then performing sentence splitting and word segmentation on the original text with Stanford CoreNLP, and acquiring the part of speech of each word and the dependency tree structure of each sentence, where each word serves as a node of the tree structure. Respectively creating a part-of-speech vector table, an entity type vector table, an entity relation argument role vector table, a trigger word type vector table, and an event argument role vector table, where each vector table also contains an initialization vector for the 'other' type. An entity mention may consist of multiple words; for convenience, each entity mention is represented by its head word (mostly the last word of the entity mention), and the subscript at which the head word appears in the sentence serves as the subscript of the entity mention. The subscripts of the entity mentions are thus denoted $head_1, head_2, head_3, \ldots, head_{k-1}, head_k$ (where k is the number of entity mentions; k may be zero), and $\{head_1, \ldots, head_k\}$ represents the entity mentions appearing in the sentence. Every vector in all vector tables is randomly initialized and updated during training;
1-2, querying the pre-trained GloVe word vector matrix to obtain the word vector $w_i$ of each word in the sentence, querying the part-of-speech vector table to obtain the part-of-speech vector $w_{pos}$, and querying the entity type vector table to obtain the entity type vector $w_e$, yielding each word representation $x_i = \{w_i, w_{pos}, w_e\}$. The sentence vector matrix is thus denoted $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$, where n is the length of the sentence;
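For illustration, a minimal PyTorch-style sketch of this encoding, with placeholder vocabulary sizes and dimensions; the identifiers below are illustrative, not from the patent:

```python
import torch
import torch.nn as nn

# Assumed sizes: 300-d GloVe word vectors, 50-d part-of-speech and
# entity-type vectors; vocabulary sizes are placeholders.
word_emb = nn.Embedding(20000, 300)  # rows would be loaded from pre-trained GloVe
pos_emb = nn.Embedding(45, 50)       # part-of-speech vector table
ent_emb = nn.Embedding(7 + 1, 50)    # 7 entity types plus an "other" type

def encode_tokens(word_ids, pos_ids, ent_ids):
    """Build x_i = {w_i, w_pos, w_e} for each word by concatenating lookups."""
    return torch.cat(
        [word_emb(word_ids), pos_emb(pos_ids), ent_emb(ent_ids)], dim=-1
    )

# Sentence vector matrix W = {x_1, ..., x_n}: one 400-d vector per word.
W = encode_tokens(
    torch.tensor([3, 17, 42]), torch.tensor([5, 12, 8]), torch.tensor([0, 2, 0])
)  # shape (3, 400)
```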
further, step 2 is specifically implemented as follows:
inputting the sentence vector matrix $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ into a bidirectional LSTM, and respectively acquiring the forward hidden state matrix $\overrightarrow{H} = \{\overrightarrow{h_1}, \ldots, \overrightarrow{h_n}\}$ and the backward hidden state matrix $\overleftarrow{H} = \{\overleftarrow{h_1}, \ldots, \overleftarrow{h_n}\}$ of the sentence, where $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ represent the forward and backward hidden state vectors at time t, respectively, $t \in [1, n]$. The bidirectional LSTM is a time-series-sensitive model; therefore $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ respectively preserve the preceding and following context with timing information;
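A minimal sketch of this step, assuming PyTorch, the 400-d word representations of step 1, and a 150-d hidden size per direction (the sizes are assumptions; the patent does not fix them):

```python
import torch
import torch.nn as nn

bilstm = nn.LSTM(input_size=400, hidden_size=150,
                 batch_first=True, bidirectional=True)

W = torch.randn(1, 3, 400)      # sentence matrix W = {x_1, x_2, x_3}, batch of 1
out, _ = bilstm(W)              # shape (1, 3, 2 * 150)
h_fwd = out[..., :150]          # forward hidden states, one per time step t
h_bwd = out[..., 150:]          # backward hidden states, one per time step t
```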
further, step 3 is specifically implemented as follows:
parsing each sentence into a tree structure with the Stanford CoreNLP tool, where each word in the sentence forms a node of the tree structure and words that hold a dependency relation become parent and child nodes of one another. Input $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ into the dependency Tree-LSTM constructed on this tree structure, and obtain the hidden state vector $h_t^{tree}$ of each node in the parsed tree structure and the hidden state vector $h_{root}^{tree}$ of the root node. The dependency Tree-LSTM of a sentence thus outputs the hidden state matrix $H^{tree} = \{h_1^{tree}, \ldots, h_{root}^{tree}, \ldots, h_n^{tree}\}$ of the sentence, where $t, root \in [1, n]$ and n is the length of the sentence;
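The patent does not fix a particular Tree-LSTM variant beyond summing child states (see step 4-2); the child-sum Tree-LSTM of Tai et al. (2015) matches that description, and the sketch below assumes it (class name and shapes are illustrative, not the patent's reference implementation):

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    """One node update of a child-sum Tree-LSTM (Tai et al., 2015)."""
    def __init__(self, in_dim, h_dim):
        super().__init__()
        self.iou = nn.Linear(in_dim + h_dim, 3 * h_dim)  # input/output/update gates
        self.f_x = nn.Linear(in_dim, h_dim)              # forget gate, input part
        self.f_h = nn.Linear(h_dim, h_dim)               # forget gate, child part

    def forward(self, x, child_h, child_c):
        # x: (in_dim,) word representation; child_h, child_c: (k, h_dim) states
        # of the k children (pass tensors of shape (0, h_dim) at leaves).
        h_tilde = child_h.sum(dim=0)                     # sum of child hidden states
        i, o, u = torch.chunk(self.iou(torch.cat([x, h_tilde])), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.f_x(x) + self.f_h(child_h))  # one forget gate per child
        c = i * u + (f * child_c).sum(dim=0)             # memory cell of this node
        h = o * torch.tanh(c)                            # hidden state h_t^tree
        return h, c
```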
further, step 4 is specifically implemented as follows:
4-1, obtaining the entity relation vector $R_k$, representing the k-th entity relation in the sentence, by querying the randomly initialized entity relation table from step 1; if no entity relation exists, $R_k$ points to the 'other' entity relation vector, and the vectors are adjusted during training;
4-2, the memory cell vector c and the hidden state vector h of each node in the dependency Tree-LSTM are obtained by summing over the hidden state vectors of the node's child nodes; the root node of the semantic dependency tree structure therefore contains the information of the whole sentence. To make the sentence-level vector contain entity relationship information, the root node hidden vector $h_{root}^{tree}$ generated in step 3 is connected with the entity relation vector $R_k$, obtaining the sentence vector $F = [h_{root}^{tree}, R_k]$ containing entity relation information;
4-3, combining the hidden vectors at each time from step 2 and step 3; to reduce the dimensionality of the combined hidden vector, the hidden state vector at time t is obtained by averaging:

$$H_t = \mathrm{avg}(\overrightarrow{h_t}, \overleftarrow{h_t}, h_t^{tree})$$

and the hidden state matrix of the whole sentence is $H = \{H_1, H_2, \ldots, H_{n-1}, H_n\}$, where $t \in [1, n]$ and n is the length of the sentence;
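A minimal sketch of steps 4-2 and 4-3, assuming the two BiLSTM directions and the Tree-LSTM share one hidden size so the three per-token vectors can be averaged elementwise (the function name `combine` is illustrative):

```python
import torch

def combine(h_fwd_t, h_bwd_t, h_tree_t, h_tree_root, R_k):
    # Sentence vector F = [h_root^tree, R_k]: the root state, which summarizes
    # the whole sentence, joined with the entity relation vector.
    F = torch.cat([h_tree_root, R_k], dim=-1)
    # H_t: averaging instead of concatenating keeps the dimensionality at
    # h_dim rather than 3 * h_dim, as step 4-3 requires.
    H_t = (h_fwd_t + h_bwd_t + h_tree_t) / 3.0
    return H_t, F
```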
further, step 5 is specifically implemented as follows:
5-1, only verbs and nouns are specified as trigger word candidates, with 39 subtypes in total, including the 'other' type; judge the part of speech of each word in the sentence, and if it is a verb or a noun, connect the hidden state vector $H_t$ at the current time t with the sentence vector $F$ and input the result into the trigger word multi-classification formulas:

$$P_t^{tri} = \mathrm{softmax}_{tri}(W_T[H_t, F] + b_T)$$

$$y_t^{tri} = \arg\max(P_t^{tri})$$

where $W_T$ and $b_T$ are respectively the weight matrix and bias term of the trigger word multi-classification; $P_t^{tri}$ represents the probability that the trigger word candidate at the t-th word (each word being one time step) triggers each event type, and $y_t^{tri}$ represents the event type triggered at the t-th time;
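A sketch of this trigger word multi-classifier, folding $W_T$ and $b_T$ into a single linear layer; the 150-d and 350-d inputs reuse the assumed dimensions of the earlier sketches:

```python
import torch
import torch.nn as nn

trigger_clf = nn.Linear(150 + 350, 39)   # 39 subtypes, including "other"

def classify_trigger(H_t, F):
    logits = trigger_clf(torch.cat([H_t, F], dim=-1))  # W_T [H_t, F] + b_T
    P_tri = torch.softmax(logits, dim=-1)              # P_t^tri
    return P_tri, int(P_tri.argmax())                  # y_t^tri = argmax(P_t^tri)
```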
further, step 6 is specifically implemented as follows:
6-1, there are 20 entity relation argument roles; a randomly initialized entity relation argument role vector table is established, the vector table is looked up by entity relation argument role, and the vectors are adjusted during training; $r_{ij}$ is used to denote that the i-th entity mention plays the j-th entity relation argument role in the entity relation vector $R_k$;
6-2, the entity mentions in the sentence serve as event argument candidates; sequentially connect the hidden state vector $H_{head_i}$ of the i-th event argument candidate (i.e., the i-th entity mention), the hidden state vector $H_t$ of the t-th word identified as a trigger word in step 5-1, the sentence vector $F$ containing the entity relationship, and the entity relation argument role $r_{ij}$ of the i-th event argument candidate in the entity relation $R_k$; input the connected vector into the event argument multi-classification formulas:

$$P_{it}^{arg} = \mathrm{softmax}_{arg}(W_A[H_{head_i}, H_t, F, r_{ij}] + b_A)$$

$$y_{it}^{arg} = \arg\max(P_{it}^{arg})$$
where $W_A$ and $b_A$ are respectively the weight matrix and bias term of the event argument multi-classification; $P_{it}^{arg}$ denotes the probabilities of the event argument roles played by the i-th event argument candidate in the event type $y_t^{tri}$; and $y_{it}^{arg}$ denotes the event argument role played by the i-th event argument candidate in the event type $y_t^{tri}$.
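A sketch of the event argument multi-classifier under the same assumed dimensions, plus a 50-d role vector $r_{ij}$; the number of event argument role labels (36) is a placeholder, since the patent does not state it:

```python
import torch
import torch.nn as nn

argument_clf = nn.Linear(150 + 150 + 350 + 50, 36)   # W_A and b_A in one layer

def classify_argument(H_head_i, H_t, F, r_ij):
    logits = argument_clf(torch.cat([H_head_i, H_t, F, r_ij], dim=-1))
    P_arg = torch.softmax(logits, dim=-1)    # P_it^arg
    return P_arg, int(P_arg.argmax())        # y_it^arg = argmax(P_it^arg)
```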
the invention has the following beneficial effects:
aiming at the defects of the prior art, a method for extracting the combined event based on the entity relationship and the dependency Tree-LSTM is provided. And obtaining the hidden state vector of each moment by using a dependency Tree-LSTM and a bidirectional LSTM, and respectively combining the entity relationship vector and the entity relationship argument role vector with the hidden state vectors to perform multi-classification on the trigger word candidate words and the argument candidate words. The model not only can reduce the influence of wrong trigger word types on argument identification, but also can fully utilize entity relationship and entity relationship argument role information, thereby improving the accuracy of the event extraction model.
Drawings
FIG. 1 is a flow chart of the overall implementation of the present invention.
FIG. 2 is a detailed flow diagram of trigger word recognition and classification and event argument recognition and classification according to the invention.
FIG. 3 is a network architecture diagram of the model of the invention.
Detailed Description
The accompanying drawings disclose, in a non-limiting way, a flow chart of a preferred embodiment of the invention; the technical solution of the invention is described in detail below with reference to the accompanying drawings.
Event extraction is an important component of information extraction research and a common technical basis for news hotspot extraction and social public opinion analysis. Event extraction finds event mentions, which are composed of event trigger words and event arguments, in large amounts of text; it therefore mainly comprises the two tasks of trigger word recognition and event argument role classification. Some research divides the task into two stages: the first stage acquires the event type of the trigger word, and the second judges the roles of the event argument candidates in the sentence according to the category of the trigger word. The shortcoming of this pipeline is that trigger word misclassification in the first stage harms event argument role classification, which motivated joint learning models of trigger word identification and event argument classification. However, such models do not take full advantage of entity relations and the entity relation argument roles played by entity mentions. Therefore, a combined event extraction method based on entity relationships and a dependency Tree-LSTM is proposed.
As shown in FIGS. 1-3, a method for extracting combined events based on entity relationships and a dependency Tree-LSTM comprises the following steps:
step 1, encoding the original text and the text label information.
Step 2 inputs the result of step 1 into the bidirectional LSTM, obtaining the forward hidden state vector with timing information $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$.
Step 3 first parses the input sentence into a dependency tree structure with the Stanford CoreNLP tool, then inputs the result of step 1 into the dependency Tree-LSTM constructed from the dependency tree structure, obtaining the tree root node hidden state vector $h_{root}^{tree}$ and the hidden state vector $h_t^{tree}$ at each time.
Step 4 connects the entity relation vector $R_k$ with $h_{root}^{tree}$, obtaining and saving the entity relation sentence feature vector $F = [h_{root}^{tree}, R_k]$; at the same time, it connects the forward hidden state vector $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$ of the bidirectional LSTM at time t with the hidden state vector $h_t^{tree}$ of the dependency Tree-LSTM at time t, so that the resulting $H_t$ both preserves the information of the child nodes and captures local context information with timing information.
Step 5 connects the hidden state vector $H_t$ at time t from step 4 with the sentence vector $F$, then performs trigger word recognition and classification;
step 6 sequentially connects the hidden state vector $H_t$ of the t-th word identified as a trigger word in step 5, the hidden state vector $H_{head_i}$ of the i-th event argument candidate (i.e., the i-th entity mention), the sentence vector $F$ containing the entity relationship, and the entity relation argument role vector $r_{ij}$ of the i-th event argument candidate in the entity relation vector $R_k$, then performs event argument recognition and classification.
Further, the step 1 is specifically realized as follows:
the method comprises the steps of obtaining unprocessed original texts and labeling information from a source file, wherein the labeling information comprises entity words, entity types, event trigger words, event arguments, event argument roles, entity relationships and entity relationship argument roles, and the labeling information comprises 7 entity types, 39 event trigger word types, 20 entity relationship types and 16 entity relationship argument roles. And then, sentence and word segmentation are carried out on the original text by using the Stanford CoreNLP. And acquiring a dependency tree structure of parts of speech and sentences, wherein each word is used as a node of the tree structure. And respectively creating a part-of-speech vector table, an entity type vector table, an entity relation argument role vector table, a trigger part-of-speech type vector table and an event argument role vector table, wherein each vector table has other corresponding initialization vectors. These vectors are initialized randomly and updated at the time of training.
The pre-trained GloVe word vector matrix is queried to obtain the word vector $w_i$ of each word in the sentence, the part-of-speech vector table is queried to obtain $w_{pos}$, and the entity type vector table is queried to obtain $w_e$. Each word is represented as $x_i = \{w_i, w_{pos}, w_e\}$, and the sentence vector matrix is denoted $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$, where n is the length of the sentence.
The sentence vector matrix $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ is input into a bidirectional LSTM, and the forward hidden state matrix $\overrightarrow{H} = \{\overrightarrow{h_1}, \ldots, \overrightarrow{h_n}\}$ and the backward hidden state matrix $\overleftarrow{H} = \{\overleftarrow{h_1}, \ldots, \overleftarrow{h_n}\}$ of the sentence are respectively acquired, where $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ represent the forward and backward hidden state vectors at time t, respectively, $t \in [1, n]$. The bidirectional LSTM is a time-series-sensitive model; therefore $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ respectively preserve the preceding and following context with timing information.
The Stanford CoreNLP tool parses each sentence into a tree structure, where each word in the sentence constitutes a node of the tree structure and words that hold a dependency relation become parent and child nodes of one another. $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ is input into the dependency Tree-LSTM constructed on this tree structure, and the hidden state vector $h_t^{tree}$ of each node in the parsed tree structure and the hidden state vector $h_{root}^{tree}$ of the root node are obtained. The dependency Tree-LSTM of a sentence thus outputs the hidden state matrix $H^{tree} = \{h_1^{tree}, \ldots, h_{root}^{tree}, \ldots, h_n^{tree}\}$, where $t, root \in [1, n]$ and n is the length of the sentence.
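For completeness, a sketch of driving a child-sum Tree-LSTM cell (such as the one sketched after step 3) bottom-up over the parsed tree; `Node`, `encode_tree`, and `H_DIM` are hypothetical scaffolding, not identifiers from the patent:

```python
from dataclasses import dataclass, field
from typing import List
import torch

H_DIM = 150  # assumed hidden size, matching the earlier sketches

@dataclass
class Node:                      # hypothetical parse-tree node
    index: int                   # word position in the sentence
    children: List["Node"] = field(default_factory=list)

def encode_tree(cell, node, x_by_index):
    """Post-order traversal: children are encoded before their parent, so the
    root ends up summarizing the whole sentence (used for F in step 4)."""
    states = [encode_tree(cell, ch, x_by_index) for ch in node.children]
    child_h = torch.stack([h for h, _ in states]) if states else torch.zeros(0, H_DIM)
    child_c = torch.stack([c for _, c in states]) if states else torch.zeros(0, H_DIM)
    return cell(x_by_index[node.index], child_h, child_c)
```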
In event extraction, the recognition of some trigger words may be ambiguous, for example: "Elop plans to leave Nokia." Most event extraction models will more readily identify "leave" as an event of type Transport, but if the Membership relation that holds between the entity Elop and the entity Nokia in the sentence is utilized, the model can more easily identify the End-Position event triggered by "leave". Therefore, by querying the randomly initialized entity relation table from step 1, the entity relation vector $R_k$ (representing the k-th entity relation) in the sentence is obtained; if no entity relation exists, $R_k$ points to the 'other' entity relation vector, and the vectors are adjusted during training.
The memory cell vector c and the hidden state vector h of each node in the dependency Tree-LSTM are obtained by summing over the hidden state vectors of the node's child nodes; the root node of the semantic dependency tree structure therefore contains the information of the whole sentence. To make the sentence-level vector contain entity relationship information, the root node hidden vector $h_{root}^{tree}$ is connected with the entity relation vector $R_k$, obtaining the sentence vector $F = [h_{root}^{tree}, R_k]$ containing entity relation information.
The dependency Tree-LSTM is not a time-series-sensitive model, and the hidden state vector it outputs at each time lacks timing information; therefore the hidden vectors at each time from steps 2 and 3 are combined. To reduce the dimensionality of the combined hidden vector, the hidden state vector at time t is obtained by averaging:

$$H_t = \mathrm{avg}(\overrightarrow{h_t}, \overleftarrow{h_t}, h_t^{tree})$$

and the hidden state matrix of the whole sentence is $H = \{H_1, H_2, \ldots, H_{n-1}, H_n\}$, where $t \in [1, n]$ and n is the length of the sentence.
Only verbs and nouns are specified as trigger word candidates, with 39 subtypes in total, including the 'other' type. First the part of speech of each word in the sentence is judged; if it is a verb or a noun, the hidden state vector $H_t$ at the current time t is connected with the sentence vector $F$ and input into the trigger word multi-classification formulas:

$$P_t^{tri} = \mathrm{softmax}_{tri}(W_T[H_t, F] + b_T)$$

$$y_t^{tri} = \arg\max(P_t^{tri})$$

where $P_t^{tri}$ is the probability that the trigger word candidate at the t-th word triggers each event type, and $y_t^{tri}$ indicates the event type triggered by the t-th word.
To judge the event argument role played in an event type by an event argument candidate (entity mention) in the sentence, it is desirable to use the entity relation argument role that the entity mention plays in the entity relation. In the example sentence from 4-1, if the model learns that the two entity mentions Elop and Nokia act as the Employee-Member and Org roles, respectively, in the entity relation Membership, the model will more easily assign the event argument roles Person and Entity to the two event arguments Elop and Nokia in the End-Position event type. There are 20 entity relation argument roles in total; a randomly initialized entity relation argument role vector table is established, the table is looked up by entity relation argument role, and the vectors are adjusted during training. $r_{ij}$ denotes that the i-th entity mention plays the j-th entity relation argument role in the entity relation $R_k$.
The entity mentions in the sentence serve as event argument candidates. The hidden state vector $H_{head_i}$ of the i-th event argument candidate is sequentially connected with the hidden state vector $H_t$ of the t-th word identified as a trigger word in 5-1, the sentence vector $F$ containing the entity relationship, and the entity relation argument role $r_{ij}$ of the i-th event argument candidate in the relation $R_k$. The connected vector is input into the event argument multi-classification formulas:

$$P_{it}^{arg} = \mathrm{softmax}_{arg}(W_A[H_{head_i}, H_t, F, r_{ij}] + b_A)$$

$$y_{it}^{arg} = \arg\max(P_{it}^{arg})$$

where $P_{it}^{arg}$ denotes the probabilities of the event argument roles played by the i-th event argument candidate in the event type $y_t^{tri}$, and $y_{it}^{arg}$ denotes the event argument role played by the i-th event argument candidate in the event type $y_t^{tri}$.

Claims (7)

1. A method for extracting combined events based on entity relationships and a dependency Tree-LSTM, characterized by comprising the following steps:
step 1, encoding an original text and text labeling information;
step 2, inputting the result of step 1 into a bidirectional LSTM, and obtaining the forward hidden state vector with timing information $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$;
step 3, firstly parsing the input sentence into a dependency tree structure with the Stanford CoreNLP tool, then inputting the encoding result of step 1 into the dependency Tree-LSTM constructed from the dependency tree structure, and obtaining the tree root node hidden state vector $h_{root}^{tree}$ and the hidden state vector $h_t^{tree}$ at each time t;
step 4, connecting the entity relation vector $R_k$ with the tree root node hidden state vector $h_{root}^{tree}$, obtaining and saving the entity relation sentence vector $F = [h_{root}^{tree}, R_k]$; simultaneously connecting the forward hidden state vector $\overrightarrow{h_t}$ and the backward hidden state vector $\overleftarrow{h_t}$ of the bidirectional LSTM at time t with the hidden state vector $h_t^{tree}$ of the dependency Tree-LSTM at time t, solving the new hidden state vector $H_t$, which thereby both preserves the information of the child nodes and captures local context information with timing information;
step 5, connecting the hidden state vector $H_t$ at time t from step 4 with the sentence vector $F$, and performing trigger word recognition and classification;
step 6, sequentially connecting the hidden state vector $H_t$ of the t-th word identified as a trigger word in step 5, the hidden state vector $H_{head_i}$ of the i-th event argument candidate (i.e., the i-th entity mention), the sentence vector $F$ containing the entity relationship, and the entity relation argument role vector $r_{ij}$ of the i-th event argument candidate in the entity relation vector $R_k$, and performing event argument recognition and classification.
2. The method for extracting combined event based on entity relationship and dependency Tree-LSTM as claimed in claim 1, wherein step 1 is implemented as follows:
1-1, acquiring the unprocessed original text and the text annotation information from the source files, wherein the annotation information comprises entity mentions, entity types, event trigger words, event arguments, event argument roles, entity relations and entity relation argument roles, with 7 entity types, 39 event trigger word types, 20 entity relation types and 16 entity relation argument roles; then performing sentence splitting and word segmentation on the original text with Stanford CoreNLP; acquiring the part of speech of each word and the dependency tree structure of each sentence, wherein each word serves as a node of the tree structure; respectively creating a part-of-speech vector table, an entity type vector table, an entity relation argument role vector table, a trigger word type vector table and an event argument role vector table, wherein each vector table also contains an initialization vector for the 'other' type;
1-2, querying the pre-trained GloVe word vector matrix to obtain the word vector $w_i$ of each word in the sentence, querying the part-of-speech vector table to obtain the part-of-speech vector $w_{pos}$, and querying the entity type vector table to obtain the entity type vector $w_e$, obtaining each word representation $x_i = \{w_i, w_{pos}, w_e\}$; the sentence vector matrix is thus denoted $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$, where n is the length of the sentence.
3. The method for extracting combined event based on entity relationship and dependency Tree-LSTM as claimed in claim 1 or 2, wherein step 2 is implemented as follows:
inputting the sentence vector matrix $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ into a bidirectional LSTM, and respectively acquiring the forward hidden state matrix $\overrightarrow{H} = \{\overrightarrow{h_1}, \ldots, \overrightarrow{h_n}\}$ and the backward hidden state matrix $\overleftarrow{H} = \{\overleftarrow{h_1}, \ldots, \overleftarrow{h_n}\}$ of the sentence, where $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ represent the forward and backward hidden state vectors at time t, respectively, $t \in [1, n]$; the bidirectional LSTM is a time-series-sensitive model, and therefore $\overrightarrow{h_t}$ and $\overleftarrow{h_t}$ respectively preserve the preceding and following context with timing information.
4. The method for extracting combined events based on entity relationship and dependency Tree-LSTM as claimed in claim 3, wherein step 3 is implemented as follows:
parsing each sentence into a tree structure with the Stanford CoreNLP tool, wherein each word in the sentence forms a node of the tree structure and words that hold a dependency relation become parent and child nodes of one another; inputting $W = \{x_1, x_2, \ldots, x_{n-1}, x_n\}$ into the dependency Tree-LSTM constructed on this tree structure, and obtaining the hidden state vector $h_t^{tree}$ of each node in the parsed tree structure and the hidden state vector $h_{root}^{tree}$ of the root node; the dependency Tree-LSTM of a sentence thus outputs the hidden state matrix $H^{tree} = \{h_1^{tree}, \ldots, h_{root}^{tree}, \ldots, h_n^{tree}\}$, where $t, root \in [1, n]$ and n is the length of the sentence.
5. The method for extracting combined events based on entity relationship and dependency Tree-LSTM as claimed in claim 4, wherein step 4 is implemented as follows:
4-1, obtaining the entity relation vector $R_k$, representing the k-th entity relation in the sentence, by querying the randomly initialized entity relation table from step 1; if no entity relation exists, $R_k$ points to the 'other' entity relation vector, and the vectors are adjusted during training;
4-2, the memory cell vector c and the hidden state vector h of each node in the dependency Tree-LSTM are obtained by summing over the hidden state vectors of the node's child nodes; the root node of the semantic dependency tree structure therefore contains the information of the whole sentence, and the root node hidden vector $h_{root}^{tree}$ is connected with the entity relation vector $R_k$, obtaining the sentence vector $F = [h_{root}^{tree}, R_k]$ containing entity relation information;
4-3, combining the hidden vectors at each time from step 2 and step 3; to reduce the dimensionality of the combined hidden vector, the hidden state vector at time t is obtained by averaging: $H_t = \mathrm{avg}(\overrightarrow{h_t}, \overleftarrow{h_t}, h_t^{tree})$; the hidden state matrix of the whole sentence is then $H = \{H_1, H_2, \ldots, H_{n-1}, H_n\}$, where $t \in [1, n]$ and n is the length of the sentence.
6. The method for extracting combined events based on entity relationship and dependency Tree-LSTM as claimed in claim 5, wherein step 5 is implemented as follows:
5-1, only verbs and nouns are specified as trigger word candidates, with 39 subtypes in total, including the 'other' type; judging the part of speech of each word in the sentence, and if it is a verb or a noun, connecting the hidden state vector $H_t$ at the current time t with the sentence vector $F$ and inputting the result into the trigger word multi-classification formulas:

$$P_t^{tri} = \mathrm{softmax}_{tri}(W_T[H_t, F] + b_T)$$

$$y_t^{tri} = \arg\max(P_t^{tri})$$

wherein $W_T$ and $b_T$ are respectively the weight matrix and bias term of the trigger word multi-classification; $P_t^{tri}$ represents the probability that the trigger word candidate at the t-th word (each word being one time step) triggers each event type, and $y_t^{tri}$ indicates the event type triggered at the t-th time.
7. The method for extracting combined event based on entity relationship and dependency Tree-LSTM as claimed in claim 6, wherein step 6 is implemented as follows:
6-1, there are 20 entity relation argument roles; a randomly initialized entity relation argument role vector table is established, the vector table is looked up by entity relation argument role, and the vectors are adjusted during training; $r_{ij}$ is used to denote that the i-th entity mention plays the j-th entity relation argument role in the entity relation vector $R_k$;
6-2, the entity mentions in the sentence serve as event argument candidates; sequentially connecting the hidden state vector $H_{head_i}$ of the i-th event argument candidate (i.e., the i-th entity mention), the hidden state vector $H_t$ of the t-th word identified as a trigger word in step 5-1, the sentence vector $F$ containing the entity relationship, and the entity relation argument role $r_{ij}$ of the i-th event argument candidate in the entity relation $R_k$; inputting the connected vector into the event argument multi-classification formulas:

$$P_{it}^{arg} = \mathrm{softmax}_{arg}(W_A[H_{head_i}, H_t, F, r_{ij}] + b_A)$$

$$y_{it}^{arg} = \arg\max(P_{it}^{arg})$$

wherein $W_A$ and $b_A$ are respectively the weight matrix and bias term of the event argument multi-classification; $P_{it}^{arg}$ denotes the probabilities of the event argument roles played by the i-th event argument candidate in the event type $y_t^{tri}$; and $y_{it}^{arg}$ denotes the event argument role played by the i-th event argument candidate in the event type $y_t^{tri}$.
Application CN202010109601.1A (filed 2020-02-22, priority 2020-02-22): Entity relationship and dependency Tree-LSTM-based combined event extraction method. Active, granted as CN111353306B.

Priority Applications (1)

Application CN202010109601.1A (priority date 2020-02-22, filing date 2020-02-22): Entity relationship and dependency Tree-LSTM-based combined event extraction method

Publications (2)

CN111353306A (published 2020-06-30)
CN111353306B (published 2020-10-16)

Family

ID: 71195780

Family Applications (1)

Application CN202010109601.1A: Entity relationship and dependency Tree-LSTM-based combined event extraction method (Active, granted)

Country Status (1)

CN: CN111353306B (en)



Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693219A (en) * 2012-06-05 2012-09-26 苏州大学 Method and system for extracting Chinese event
CN104809105A (en) * 2015-05-11 2015-07-29 苏州大学 Method and system for identifying event argument and argument role based on maximum entropy
CN105260361A (en) * 2015-10-28 2016-01-20 南京邮电大学 Trigger word tagging system and method for biomedical events
US20170357625A1 (en) * 2016-06-14 2017-12-14 Northrop Grumman Systems Corporation Event extraction from documents
EP3407209A1 (en) * 2017-05-22 2018-11-28 Fujitsu Limited Apparatus and method for extracting and storing events from a plurality of heterogeneous sources
CN110651288A (en) * 2017-06-02 2020-01-03 苹果公司 Event extraction system and method
CN107992476A (en) * 2017-11-28 2018-05-04 苏州大学 Towards the language material library generating method and system of Sentence-level biological contexts network abstraction
CN108875809A (en) * 2018-06-01 2018-11-23 大连理工大学 The biomedical entity relationship classification method of joint attention mechanism and neural network
CN109635280A (en) * 2018-11-22 2019-04-16 园宝科技(武汉)有限公司 A kind of event extraction method based on mark
CN109657041A (en) * 2018-12-04 2019-04-19 南京理工大学 The problem of based on deep learning automatic generation method
CN110598001A (en) * 2019-08-05 2019-12-20 平安科技(深圳)有限公司 Method, device and storage medium for extracting association entity relationship

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DIYA LI: "Biomedical Event Extraction Based on Knowledge-driven Tree-LSTM", 《HTTPS://BLENDER.CS.ILLINOIS.EDU/PAPER/BIOEVENT2019.PDF》 *
WENTAO YU: "Make It Directly: Event Extraction Based on Tree-LSTM and Bi-GRU", 《IEEE ACCESS》 *
周晶晶: "基于依存树的越南语新闻事件元素抽取技术研究", 《中国优秀硕士学位论文全文数据库 信息科技辑》 *
魏小梅: "生物事件抽取联合模型研究", 《中国博士学位论文全文数据库 信息科技辑》 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112417878A (en) * 2020-11-24 2021-02-26 北京明略昭辉科技有限公司 Entity relationship extraction method, system, electronic equipment and storage medium
CN112541364A (en) * 2020-12-03 2021-03-23 昆明理工大学 Chinese-transcendental neural machine translation method fusing multilevel language feature knowledge
CN112507077A (en) * 2020-12-15 2021-03-16 杭州电子科技大学 Event time sequence relation identification method based on relational graph attention neural network
CN112559713A (en) * 2020-12-24 2021-03-26 北京百度网讯科技有限公司 Text relevance judgment method and device, model, electronic equipment and readable medium
CN112559713B (en) * 2020-12-24 2023-12-01 北京百度网讯科技有限公司 Text relevance judging method and device, model, electronic equipment and readable medium
CN112784576A (en) * 2021-01-13 2021-05-11 哈尔滨工程大学 Text dependency syntax analysis method
CN112784576B (en) * 2021-01-13 2022-07-29 哈尔滨工程大学 Text dependency syntactic analysis method
CN113158667A (en) * 2021-04-09 2021-07-23 杭州电子科技大学 Event detection method based on entity relationship level attention mechanism
CN115794444A (en) * 2023-02-02 2023-03-14 广州钛动科技股份有限公司 Event communication method, device, computer equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN111353306B (en) 2020-10-16

Similar Documents

Publication Publication Date Title
CN111353306B (en) Entity relationship and dependency Tree-LSTM-based combined event extraction method
CN114020768B (en) Method for constructing SQL sentence generation model of Chinese natural language and application
CN106599032B (en) Text event extraction method combining sparse coding and structure sensing machine
US9672205B2 (en) Methods and systems related to information extraction
Demir et al. Improving named entity recognition for morphologically rich languages using word embeddings
CN105988990B (en) Chinese zero-reference resolution device and method, model training method and storage medium
Viola et al. Learning to extract information from semi-structured text using a discriminative context free grammar
CN109145260B (en) Automatic text information extraction method
CN111241294A (en) Graph convolution network relation extraction method based on dependency analysis and key words
Fonseca et al. A two-step convolutional neural network approach for semantic role labeling
Mori et al. A machine learning approach to recipe text processing
CN111274829B (en) Sequence labeling method utilizing cross-language information
CN107818141A (en) Incorporate the biomedical event extraction method of structuring key element identification
CN116628173B (en) Intelligent customer service information generation system and method based on keyword extraction
CN113821605B (en) Event extraction method
CN111091009B (en) Document association auditing method based on semantic analysis
US7627567B2 (en) Segmentation of strings into structured records
CN107526721B (en) Ambiguity elimination method and device for comment vocabularies of e-commerce products
CN111160027A (en) Cyclic neural network event time sequence relation identification method based on semantic attention
CN110245349A (en) A kind of syntax dependency parsing method, apparatus and a kind of electronic equipment
Stoeckel et al. Voting for POS tagging of Latin texts: Using the flair of FLAIR to better ensemble classifiers by example of Latin
KR101072460B1 (en) Method for korean morphological analysis
CN113158667B (en) Event detection method based on entity relationship level attention mechanism
Khan et al. A clustering framework for lexical normalization of Roman Urdu
CN114330349A (en) Specific field named entity recognition method

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant
EE01: Entry into force of recordation of patent licensing contract
    Application publication date: 2020-06-30
    Assignee: Hangzhou Yuanchuan New Technology Co., Ltd.
    Assignor: Hangzhou Dianzi University
    Contract record no.: X2021330000781
    Denomination of invention: Joint event extraction method based on entity relationship and dependent tree LSTM
    Granted publication date: 2020-10-16
    License type: Common License
    Record date: 2021-12-06