CN109710919A - Neural network event extraction method fusing an attention mechanism - Google Patents
- Publication number
- CN109710919A (application number CN201811428287.2A)
- Authority
- CN
- China
- Prior art keywords
- event
- text
- vector
- word
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Machine Translation (AREA)
Abstract
The invention discloses a neural network event extraction method that fuses an attention mechanism. The method comprises the following steps: step (1) pre-processes the training samples and the event text to be extracted; step (2) trains a bidirectional GRU network coupled with a context attention mechanism on the pre-processed training samples; step (3) feeds the event text to be extracted into the trained neural network and outputs the extracted event trigger words and the predicted event types. By analyzing text with a bidirectional GRU network coupled with a context attention mechanism, the invention improves the recognition of polysemous words during event trigger word identification and classifies events more accurately.
Description
Technical field
The invention belongs to the field of natural language processing and relates to event extraction and event detection. It is specifically used to extract events from unstructured text and to classify those events.
Background technique
Event extraction aims to pull events of interest to the user out of unstructured information and present them to the user in structured form. Current mainstream approaches fall into two broad classes: pattern matching and machine learning. Pattern matching can achieve high performance in a specific domain, but it ports poorly to other domains. Machine learning, by contrast, is domain-independent, needs little guidance from domain experts, and transfers well between systems. With the ongoing construction of related corpora and the ever-richer text resources on the internet, corpus acquisition is no longer the bottleneck holding machine learning back. Machine learning has therefore become the mainstream research method for event extraction.
Summary of the invention
To address the problem in event extraction that polysemous words serving as event trigger words cause inaccurate event classification, the invention discloses an event extraction method based on a bidirectional GRU network coupled with a context attention mechanism.
To achieve the above technical purpose, the invention adopts the following technical solution:
Step (1): pre-process the training samples and the event text to be extracted, and output the pre-processed text and the corresponding annotation sequence;
Step (2): train the bidirectional GRU network coupled with the context attention mechanism;
Step (3): feed the pre-processed event text to be extracted into the trained bidirectional GRU network, and output the event trigger words identified in the text and the predicted corresponding event types.
The pre-processing described in step (1) is as follows:
1-1. Separate the annotation information in the training samples from the text content;
1-2. Convert the text content into word vectors;
1-3. Output the pre-processed text and annotation sequence.
Further, step 1-1 specifically:
The XML annotations are processed into a label for each word. Concretely, the original XML annotation file marks each event trigger word by its starting character position and its character offset. The event text is first segmented with a word segmentation tool; then, according to the XML annotations, every word in the event text is encoded: each word receives a label from 0 to 38 according to whether it is an event trigger word (38 event types are predefined; 0 marks a non-trigger word).
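The per-word labeling described above can be sketched as follows. This is a minimal sketch under an assumed simplified input format (triggers given as character offset, length and type id), since the exact XML schema is not reproduced in the patent:

```python
# Minimal sketch of the per-word labeling in step 1-1. The input format is a
# simplification (the patent does not give the exact XML schema): triggers are
# given as (start_char_offset, length, event_type_id), type ids 1..38; words
# that are not triggers receive label 0.

def encode_labels(words, triggers):
    """words: segmented tokens; triggers: list of (start, length, type_id)."""
    labels, pos = [], 0
    for w in words:
        label = 0
        for start, length, type_id in triggers:
            if pos == start and len(w) == length:
                label = type_id  # word is an annotated event trigger
        labels.append(label)
        pos += len(w)  # recover each word's starting character offset
    return labels

words = ["他", "出生", "于", "北京"]      # segmented event text
triggers = [(1, 2, 12)]                  # "出生" starts at char 1, type id 12
print(encode_labels(words, triggers))    # -> [0, 12, 0, 0]
```

A word is labeled only when both its character offset and its length match an annotation, which mirrors the offset-based alignment the patent describes.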
Further, step 1-2 specifically:
A word2vec model is first trained on a large corpus of news texts. The trained word2vec model then converts the words of the training data from step 1-1 and of the event text to be extracted into word vectors, which are handed to the subsequent neural network.
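The word-vector lookup of step 1-2 can be sketched as below; the trained word2vec model (the patent trains one on news text) is stood in for by a plain word-to-vector table, and the 100-dimensional vectors and the zero vector for out-of-vocabulary words are illustrative assumptions:

```python
import numpy as np

# Sketch of the lookup in step 1-2. A trained word2vec model (e.g. gensim's
# Word2Vec) is represented here by a plain {word: vector} table with random
# vectors; dimension 100 and zero-vector OOV handling are assumptions.
rng = np.random.default_rng(0)
DIM = 100
embeddings = {w: rng.standard_normal(DIM) for w in ["他", "出生", "于", "北京"]}
UNK = np.zeros(DIM)  # out-of-vocabulary words map to a zero vector

def to_vectors(words):
    return np.stack([embeddings.get(w, UNK) for w in words])

X = to_vectors(["他", "出生", "于", "某地"])  # "某地" is OOV here
print(X.shape)                                 # -> (4, 100)
```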
The training of the bidirectional GRU network coupled with the context attention mechanism described in step (2) is implemented as follows:
2-1. Construct the bidirectional GRU neural network;
2-2. Feature extraction: feed the text and annotation sequence output by step (1) into the neural network of step 2-1 to extract features;
2-3. Train the context attention mechanism: take the dot product of the attention vectors output by the attention mechanism and the feature vectors from step 2-2 to produce the final decision.
Further, step 2-1 specifically:
For the bidirectional GRU neural network model, 70% of the input data is randomly selected as training data, 15% as validation data, and the remaining 15% as test data. A GRU is chosen as the recurrent neural network to extract the discourse information of each word in the text, i.e., the global features.
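The bidirectional GRU of step 2-1 can be sketched in plain numpy as follows; dimensions and random weights are illustrative, and a real implementation would use a deep-learning framework's GRU layer. For each word, the forward and backward hidden states are concatenated to form its global feature:

```python
import numpy as np

rng = np.random.default_rng(1)
D_IN, D_H = 8, 4  # word-vector and hidden dimensions (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Standard GRU cell (biases omitted for brevity)."""
    def __init__(self):
        self.Wz, self.Wr, self.Wh = (rng.standard_normal((D_H, D_IN)) * 0.1 for _ in range(3))
        self.Uz, self.Ur, self.Uh = (rng.standard_normal((D_H, D_H)) * 0.1 for _ in range(3))
    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)             # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)             # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_tilde

def bi_gru(word_vectors):
    fwd, bwd = GRUCell(), GRUCell()
    h, hs_f = np.zeros(D_H), []
    for x in word_vectors:                 # left-to-right pass
        h = fwd.step(x, h)
        hs_f.append(h)
    h, hs_b = np.zeros(D_H), []
    for x in reversed(word_vectors):       # right-to-left pass
        h = bwd.step(x, h)
        hs_b.append(h)
    hs_b.reverse()
    # h_k: concatenation of the two directions' hidden states
    return [np.concatenate([f, b]) for f, b in zip(hs_f, hs_b)]

sentence = [rng.standard_normal(D_IN) for _ in range(5)]   # 5 word vectors
H = bi_gru(sentence)
print(len(H), H[0].shape)                                  # -> 5 (8,)
```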
The hidden-layer features output in step 2-2 are fed into the attention mechanism layer, and an attention weight is computed for each hidden-layer vector.
The attention weight α_k of each word k is expressed (as a softmax over the attention window) as:
α_k = exp(w·R_k) / Σ_{i=0}^{n} exp(w·R_i)
where the vector R_k is obtained by concatenating h_k (the hidden-layer vector output by the GRU for word k) and L_k (the word vector of word k), w is a learned scoring vector, i ranges from 0 to n, and n is the attention window size.
The attention weights are then normalized again to obtain the final attention vector α*.
Each word k is then represented as:
R'_k = R_k α*
The dot product of the hidden-layer vector and the attention vector gives the final vector for each word; this final vector is concatenated with the word's original word vector, and the concatenated result is fed into a classifier to obtain the classification result.
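The attention computation above can be sketched as follows, under the assumption (the formula image is not reproduced in this text) that the weights are a softmax over the window of concatenated vectors R_k = [h_k ; L_k]; the scoring vector w is hypothetical:

```python
import numpy as np

# Sketch of the context attention: alpha_k ∝ exp(w · R_k), where R_k
# concatenates the GRU hidden state h_k and the word vector L_k. Dimensions,
# the random data, and the scoring vector w are illustrative assumptions.
rng = np.random.default_rng(2)
N, D_H, D_W = 5, 8, 100                    # window size, hidden dim, word dim
h = rng.standard_normal((N, D_H))          # bidirectional GRU outputs
L = rng.standard_normal((N, D_W))          # original word vectors
R = np.concatenate([h, L], axis=1)         # R_k = [h_k ; L_k]
w = rng.standard_normal(R.shape[1])        # hypothetical scoring vector

scores = R @ w
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                       # normalized attention weights α*

weighted = h * alpha[:, None]              # attention-weighted hidden vectors
features = np.concatenate([weighted, L], axis=1)  # concat with word vectors
print(features.shape)                      # -> (5, 108)
```

The concatenated `features` would then be fed into the classifier, as the description states.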
The specific training process of the bidirectional GRU network coupled with the context attention mechanism in step (2) is as follows:
After the final attention vector α* has been obtained, the text is represented as
X' = X(α*)^T = (R'_1, ..., R'_k, ..., R'_n) = (R_1α_1, ..., R_kα_k, ..., R_nα_n)
The attention-weighted vectors R'_i serve as input to a subsequent feed-forward neural network F(·), which contains a softmax layer for computing the probability distribution p(t | R'_i, θ) over event subtypes:
p(t | R'_i, θ) = F_t(R'_i)
where t is an event subtype; for a given input sample R'_i, O is the output vector of the feed-forward network F with parameters θ, and the i-th value o_i of O is the confidence that R'_i is predicted as event subtype t. To obtain the conditional probability p(t | R'_i, θ), a softmax is applied over all event types, where m denotes the total number of event types:
p(t | R'_i, θ) = exp(o_t) / Σ_{k=1}^{m} exp(o_k)
Finally, the predicted type t_i of word w_i is obtained by computing t_i = argmax_t p(t | R'_i, θ). For all given training examples (x^(i), y^(i)) (assume T groups), the negative log-likelihood loss function is defined as
J(θ) = - Σ_{i=1}^{T} log p(y^(i) | x^(i), θ) + D(θ)
where D(θ) is the loss function for the attention vector, with mean squared error used as that loss.
Training is performed on grouped batch data using the stochastic gradient descent method (SGD) with the AdaDelta update rule; regularization is implemented via dropout.
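The softmax layer and negative log-likelihood loss described above can be sketched as below; the logits and gold labels are random placeholders, and the attention loss term D(θ) is omitted:

```python
import numpy as np

rng = np.random.default_rng(3)
M = 39  # 38 predefined event types plus label 0 for non-triggers

def softmax(o):
    e = np.exp(o - o.max())
    return e / e.sum()

def nll(logits, gold):
    """Negative log-likelihood summed over the training examples."""
    return -sum(np.log(softmax(o)[y]) for o, y in zip(logits, gold))

logits = rng.standard_normal((4, M))         # feed-forward outputs O for 4 words
gold = [0, 12, 0, 0]                         # per-word labels from step 1-1
pred = [int(np.argmax(o)) for o in logits]   # t_i = argmax_t p(t | R'_i, θ)
print(len(pred), nll(logits, gold) > 0)      # -> 4 True
```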
Further, step (3) specifically:
The text converted into word vectors in step 1-2 is fed into the trained bidirectional GRU network coupled with the context attention mechanism, and events are extracted. By aligning the original event text to be extracted with its word-vector representation, the event trigger word corresponding to each identified event type is output together with the predicted event type.
Owing to the above technical scheme, the invention has the following advantages:
By analyzing text with a bidirectional GRU neural network coupled with a context attention mechanism, the invention handles the polysemy of event trigger words well.
Description of the drawings
Fig. 1 is the overall implementation flow chart of the invention;
Fig. 2 is a schematic diagram of the training model of the invention;
Fig. 3 is the detailed flow chart of the process embodiment of the invention.
Specific embodiment
The accompanying drawings disclose, without limitation, the flow diagrams of the preferred embodiment of the invention; the technical solution of the invention is described in detail below with reference to the drawings.
A text event extraction method based on a bidirectional GRU coupled with a context attention mechanism comprises the following basic steps:
Pre-process the training samples and the event text to be extracted;
Train the bidirectional GRU neural network coupled with the context attention mechanism;
Feed the pre-processed event text to be extracted into the trained neural network;
Output the extracted event trigger words and the corresponding event types.
The bidirectional GRU neural network coupled with the context attention mechanism is trained as follows:
According to the XML annotations, the text portion of the training samples is segmented and aligned with the annotations, and the event type of each word is recorded in a separate annotation file (the event type of a non-trigger word is 0; with 38 predefined event types, labels range from 0 to 38). The segmented text is converted into word vectors with the word2vec model, and the training data is split into three parts: 70% randomly selected as training samples, 15% as validation samples, and 15% as test samples.
The training samples are fed into the bidirectional GRU neural network; the network parameters are randomly initialized, the weights are adjusted automatically by the GRU component and the attention mechanism component, and training yields the bidirectional GRU network model.
The detailed process of the invention is shown in Fig. 1 and Fig. 3; the bidirectional GRU neural network model coupled with the context attention mechanism of the invention is shown in Fig. 2.
Embodiment:
Obtain the event text to be extracted and segment it into words. After this processing, convert the words into word vectors with the trained word2vec model. Feed them into the trained bidirectional GRU neural network to extract the event trigger words and their corresponding event types.
As in Fig. 1, the prepared word vectors and their corresponding annotations are fed into the bidirectional GRU neural network coupled with context attention; the obtained weights initialize the network parameters, and the neuron weights are adjusted by back-propagation (BP) according to the operation of the GRU component and the attention mechanism component, yielding the trained bidirectional GRU neural network coupled with context attention. The trained bidirectional GRU model then processes the event text to be extracted. The specific steps include:
Feed the pre-processed training text (word vectors and corresponding event labels) into the bidirectional GRU network for back-propagation learning, with 50 training iterations; the desired network output is the event type of each word in the text.
The deep neural network system model is obtained through learning, and event extraction is then carried out with this deep neural network system.
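The 50-iteration training described above can be sketched with a stand-in linear softmax head trained by plain gradient descent (the full network would be trained end-to-end with back-propagation in a framework; the data, dimensions, and learning rate are placeholders):

```python
import numpy as np

# Stand-in for the 50-iteration training loop: a linear softmax classifier
# trained by gradient descent on random placeholder features and labels.
rng = np.random.default_rng(4)
T, D, M = 20, 16, 39                  # examples, feature dim, labels 0..38
X = rng.standard_normal((T, D))
y = rng.integers(0, M, size=T)
W = np.zeros((D, M))

def softmax_rows(Z):
    E = np.exp(Z - Z.max(axis=1, keepdims=True))
    return E / E.sum(axis=1, keepdims=True)

losses = []
for _ in range(50):                   # 50 training iterations, as above
    P = softmax_rows(X @ W)
    losses.append(-np.log(P[np.arange(T), y]).mean())   # mean NLL
    G = P.copy()
    G[np.arange(T), y] -= 1                             # softmax-NLL gradient
    W -= 0.1 * (X.T @ G) / T                            # plain SGD step
print(losses[0] > losses[-1])          # -> True (loss decreases)
```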
As in Fig. 3, texts such as news articles are pre-processed and fed into the trained deep neural network for event extraction. In this experiment, news texts are segmented; news paragraphs are fed into the network, which finally extracts the events occurring in each paragraph of text together with their corresponding event types.
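The final output of the extraction can be sketched as below; the mapping from type ids to readable names is invented for illustration, since the patent only specifies the ids 0 to 38:

```python
# Sketch of the output stage: mapping per-word type predictions back onto the
# original text to emit (trigger word, event type) pairs. The id-to-name table
# is hypothetical; the patent only defines numeric labels 0..38.
TYPE_NAMES = {12: "Life.Be-Born"}   # invented example entry

def extract_events(words, predictions):
    """Return (trigger word, predicted type) for every non-zero prediction."""
    return [(w, TYPE_NAMES.get(t, f"type-{t}"))
            for w, t in zip(words, predictions) if t != 0]

words = ["他", "出生", "于", "北京"]
predictions = [0, 12, 0, 0]                 # output of the trained network
print(extract_events(words, predictions))   # -> [('出生', 'Life.Be-Born')]
```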
Claims (3)
1. A neural network event extraction method fusing an attention mechanism, characterized by comprising the following steps:
Step (1): pre-process the training samples and the event text to be extracted, and output the pre-processed text and the corresponding annotation sequence;
Step (2): train the bidirectional GRU network coupled with the context attention mechanism;
Step (3): feed the pre-processed event text to be extracted into the trained bidirectional GRU network, and output the event trigger words identified in the text and the predicted corresponding event types;
the pre-processing described in step (1) being specifically as follows:
1-1. separate the annotation information in the training samples from the text content;
1-2. convert the text content into word vectors;
1-3. output the pre-processed text and annotation sequence;
the training of the bidirectional GRU network coupled with the context attention mechanism described in step (2) being implemented as follows:
2-1. construct the bidirectional GRU network;
2-2. feature extraction: feed the text and annotation sequence pre-processed in step (1) into the neural network of step 2-1 for feature extraction, and output the obtained feature vectors;
2-3. train the attention mechanism layer: taking the feature output of step 2-2 as input, output an attention vector for each input feature and take its dot product with the feature vectors output in step 2-2 to produce the final decision;
the outputting of the event trigger words identified in the text and the predicted event types described in step (3) being specifically as follows:
feed the text converted into word vectors in step 1-2 into the trained bidirectional GRU network integrating the context attention mechanism, and extract events; by aligning the original event text to be extracted with its word-vector representation, output the event trigger word corresponding to each identified event type together with the predicted event type.
2. The neural network event extraction method fusing an attention mechanism according to claim 1, characterized in that steps 2-1, 2-2 and 2-3 are implemented as follows:
for the bidirectional GRU neural network model, 70% of the input data is randomly selected as training data, 15% as validation data, and the remaining 15% as test data; a GRU is chosen as the recurrent neural network to extract the discourse information of each word in the text, i.e., the global features;
the feature vectors output in step 2-2 are fed into the attention mechanism layer, and the corresponding attention weights are computed;
the attention weight α_k of each word k is expressed (as a softmax over the attention window) as:
α_k = exp(w·R_k) / Σ_{i=0}^{n} exp(w·R_i)
where the vector R_k is obtained by concatenating h_k and L_k, w is a learned scoring vector, i ranges from 0 to n, n is the attention window size, h_k is the hidden-layer vector output by the GRU for word k, and L_k is the word vector of word k; the attention weights are then normalized again to obtain the final attention vector α*;
each word k is then represented as:
R'_k = R_k α*
the dot product of the hidden-layer vector and the attention vector gives the final vector for each word; this final vector is concatenated with the word's original word vector, and the concatenated result is fed into the classifier to obtain the classification result.
3. The text event extraction method based on a bidirectional GRU coupled with a context attention mechanism according to claim 2, characterized in that
the specific training process of the bidirectional GRU network coupled with the context attention mechanism in step (2) is as follows:
after the final attention vector α* has been obtained, the text is represented as:
X' = X(α*)^T = (R'_1, ..., R'_k, ..., R'_n) = (R_1α_1, ..., R_kα_k, ..., R_nα_n)
the attention-weighted vectors R'_i serve as input to a subsequent feed-forward neural network F(·), which contains a softmax layer for computing the probability distribution p(t | R'_i, θ) over event subtypes:
p(t | R'_i, θ) = F_t(R'_i)
where t is an event subtype; for a given input sample R'_i, O is the output vector of the feed-forward network F with parameters θ, and the i-th value o_i of O is the confidence that R'_i is predicted as event subtype t; to obtain the conditional probability p(t | R'_i, θ), a softmax is applied over all event types, where m denotes the total number of event types:
p(t | R'_i, θ) = exp(o_t) / Σ_{k=1}^{m} exp(o_k)
finally, the predicted type t_i of word w_i is obtained by computing t_i = argmax_t p(t | R'_i, θ); for all given training examples (x^(i), y^(i)), the negative log-likelihood loss function is defined as:
J(θ) = - Σ_i log p(y^(i) | x^(i), θ) + D(θ)
where D(θ) is the loss function for the attention vector, with mean squared error used as that loss;
the grouped batch data is trained with the stochastic gradient descent method and the AdaDelta update rule, and regularization is implemented via dropout.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811428287.2A CN109710919A (en) | 2018-11-27 | 2018-11-27 | A kind of neural network event extraction method merging attention mechanism |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109710919A true CN109710919A (en) | 2019-05-03 |
Family
ID=66254518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811428287.2A Pending CN109710919A (en) | 2018-11-27 | 2018-11-27 | A kind of neural network event extraction method merging attention mechanism |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109710919A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108256968A (en) * | 2018-01-12 | 2018-07-06 | 湖南大学 | A kind of electric business platform commodity comment of experts generation method |
CN108334499A (en) * | 2018-02-08 | 2018-07-27 | 海南云江科技有限公司 | A kind of text label tagging equipment, method and computing device |
CN108733792A (en) * | 2018-05-14 | 2018-11-02 | 北京大学深圳研究生院 | A kind of entity relation extraction method |
CN108829818A (en) * | 2018-06-12 | 2018-11-16 | 中国科学院计算技术研究所 | A kind of file classification method |
Non-Patent Citations (1)
Title |
---|
ZHANG Lanxia et al.: "Research on Person Relation Extraction from Chinese Text Based on a Bidirectional GRU Neural Network and a Two-Layer Attention Mechanism", Computer Applications and Software *
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110210019A (en) * | 2019-05-21 | 2019-09-06 | 四川大学 | A kind of event argument abstracting method based on recurrent neural network |
CN110163302A (en) * | 2019-06-02 | 2019-08-23 | 东北石油大学 | Indicator card recognition methods based on regularization attention convolutional neural networks |
CN110163302B (en) * | 2019-06-02 | 2022-03-22 | 东北石油大学 | Indicator diagram identification method based on regularization attention convolution neural network |
CN110321557A (en) * | 2019-06-14 | 2019-10-11 | 广州多益网络股份有限公司 | A kind of file classification method, device, electronic equipment and storage medium |
CN110414498A (en) * | 2019-06-14 | 2019-11-05 | 华南理工大学 | A kind of natural scene text recognition method based on intersection attention mechanism |
CN110334213A (en) * | 2019-07-09 | 2019-10-15 | 昆明理工大学 | The Chinese based on bidirectional crossed attention mechanism gets over media event sequential relationship recognition methods |
CN110334213B (en) * | 2019-07-09 | 2021-05-11 | 昆明理工大学 | Method for identifying time sequence relation of Hanyue news events based on bidirectional cross attention mechanism |
CN111625652A (en) * | 2019-07-12 | 2020-09-04 | 杭州电子科技大学 | Attention neural network method based on multi-path dynamic mask |
CN110532452A (en) * | 2019-07-12 | 2019-12-03 | 西安交通大学 | A kind of general crawler design method of news website based on GRU neural network |
CN110532452B (en) * | 2019-07-12 | 2022-04-22 | 西安交通大学 | News website universal crawler design method based on GRU neural network |
CN110427615A (en) * | 2019-07-17 | 2019-11-08 | 宁波深擎信息科技有限公司 | A kind of analysis method of the financial events modification tense based on attention mechanism |
CN110472051A (en) * | 2019-07-24 | 2019-11-19 | 中国科学院软件研究所 | A kind of event detecting method indicating study based on variable quantity |
CN110619420A (en) * | 2019-07-31 | 2019-12-27 | 广东工业大学 | Attention-GRU-based short-term residential load prediction method |
CN110619420B (en) * | 2019-07-31 | 2022-04-08 | 广东工业大学 | Attention-GRU-based short-term residential load prediction method |
CN110633867A (en) * | 2019-09-23 | 2019-12-31 | 国家电网有限公司 | Ultra-short-term load prediction model based on GRU and attention mechanism |
WO2021068528A1 (en) * | 2019-10-11 | 2021-04-15 | 平安科技(深圳)有限公司 | Attention weight calculation method and apparatus based on convolutional neural network, and device |
CN111222330A (en) * | 2019-12-26 | 2020-06-02 | 中国电力科学研究院有限公司 | Chinese event detection method and system |
CN111222330B (en) * | 2019-12-26 | 2022-07-12 | 中国电力科学研究院有限公司 | Chinese event detection method and system |
CN111277564B (en) * | 2020-01-08 | 2022-06-28 | 山东浪潮科学研究院有限公司 | Enterprise network anomaly detection method and system based on dynamic storage network |
CN111277564A (en) * | 2020-01-08 | 2020-06-12 | 济南浪潮高新科技投资发展有限公司 | Enterprise network anomaly detection method and system based on dynamic storage network |
CN111475642A (en) * | 2020-02-29 | 2020-07-31 | 新华三大数据技术有限公司 | Text classification method and device and model training method |
CN111680510A (en) * | 2020-07-07 | 2020-09-18 | 腾讯科技(深圳)有限公司 | Text processing method and device, computer equipment and storage medium |
CN112307761A (en) * | 2020-11-19 | 2021-02-02 | 新华智云科技有限公司 | Event extraction method and system based on attention mechanism |
CN112597366B (en) * | 2020-11-25 | 2022-03-18 | 中国电子科技网络信息安全有限公司 | Encoder-Decoder-based event extraction method |
CN112597366A (en) * | 2020-11-25 | 2021-04-02 | 中国电子科技网络信息安全有限公司 | Encoder-Decoder-based event extraction method |
CN112487171A (en) * | 2020-12-15 | 2021-03-12 | 中国人民解放军国防科技大学 | Event extraction system and method under open domain |
CN112686040A (en) * | 2020-12-31 | 2021-04-20 | 北京理工大学 | Event reality detection method based on graph recurrent neural network |
CN113312500A (en) * | 2021-06-24 | 2021-08-27 | 河海大学 | Method for constructing event map for safe operation of dam |
CN113761936A (en) * | 2021-08-19 | 2021-12-07 | 哈尔滨工业大学(威海) | Multi-task chapter-level event extraction method based on multi-head self-attention mechanism |
CN113946677A (en) * | 2021-09-14 | 2022-01-18 | 中北大学 | Event identification and classification method based on bidirectional cyclic neural network and attention mechanism |
CN115759036A (en) * | 2022-10-28 | 2023-03-07 | 中国矿业大学(北京) | Method for constructing recommendation-based event detection model and method for detecting event by using model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190503 |