CN109408812A - A method for joint extraction of entity relations via sequence labelling based on an attention mechanism - Google Patents

A method for joint extraction of entity relations via sequence labelling based on an attention mechanism Download PDF

Info

Publication number
CN109408812A
CN109408812A CN201811157788.1A
Authority
CN
China
Prior art keywords
word
sentence
indicate
attention mechanism
entity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811157788.1A
Other languages
Chinese (zh)
Inventor
刘博
张佳慧
史超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Technology
Priority to CN201811157788.1A
Publication of CN109408812A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/20 Natural language analysis
    • G06F 40/279 Recognition of textual entities
    • G06F 40/289 Phrasal analysis, e.g. finite state techniques or chunking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks

Abstract

The invention discloses a method for jointly extracting entity relations via sequence labelling based on an attention mechanism. A massive corpus of Chinese sentences is first preprocessed (e.g. denoised) and then word-segmented, and each word is converted into a vector representation, which serves as the input to a bidirectional long short-term memory (LSTM) network that encodes the individual words. The bidirectional LSTM not only learns long- and short-term dependency information, but also processes the input in the forward and backward directions simultaneously, learning both past and future context, which is very useful for sequence labelling of sentences. An attention mechanism is then introduced in the decoding layer, so that when the decoder generates the label sequence it can use the hidden-layer information vectors produced for each character in the encoding stage, making full use of the information carried by the input sequence. Finally, the entity-tag probability of each word is computed by softmax, from which the final label sequence is effectively obtained and entities are combined with their corresponding relations.

Description

A method for joint extraction of entity relations via sequence labelling based on an attention mechanism
Technical field
The invention belongs to the technical field of natural language processing, and in particular relates to combining sequence labelling with the attention mechanism of deep learning to jointly extract entity relations from unstructured text.
Background technique
With the arrival of the big-data era, all kinds of information flood our lives, most of it disorganized data or unstructured natural-language text, so extracting useful information from it has become particularly important. Information extraction generally comprises two closely connected subtasks, entity recognition and relation extraction. Its basic goal is to mine named entities and the semantic relations between them from unstructured web or domain text, and to store the extracted information in a structured form so that people can understand and absorb it intuitively. Entity-relation extraction is also a vital task in natural language processing: it plays a strong supporting role in many fields, such as the construction of domain knowledge graphs, information retrieval, machine translation and automatic question answering, and therefore has considerable research value and significance.
According to their degree of dependence on labelled data, entity-relation extraction methods can be divided into supervised, semi-supervised, unsupervised and open extraction methods. Supervised methods treat relation extraction as a classification problem: effective features are designed from the training data to learn various classification models, and the trained classifier is then used to predict relations. Existing supervised relation-extraction methods have achieved good results, but they rely heavily on natural-language-processing annotations such as part-of-speech tagging and syntactic parsing to provide classification features. Such annotation tools often contain many errors, and these errors propagate and amplify through the relation-extraction system, ultimately degrading its performance. Recently, many researchers have begun applying deep-learning techniques to relation extraction. Rink et al. first extract entities and then identify the relations between them; this separated framework makes both tasks easy to handle and more flexible. Socher et al. propose solving relation extraction with a recursive neural network over syntactic structure: the sentence is first parsed into a syntax tree, the network merges nodes iteratively according to the syntactic structure, and a vector representation of the sentence is finally obtained. This method effectively exploits the syntactic structure of the sentence, but it cannot properly account for the positions and semantics of the two entities in the sentence. Zeng et al. propose performing entity-relation extraction with a convolutional neural network. They take the position vectors and word vectors of words as the input of the network and add entity position vectors and other relevant vocabulary to the features, so that the entity information in the sentence can be better applied to relation extraction. However, all of the above methods implement entity recognition and relation extraction separately and ignore the inner connection between the two. Zheng et al. propose an end-to-end joint entity-relation extraction method that converts joint extraction into a sequence-labelling problem, encodes and decodes the sentence with a long short-term memory network, and adds a bias loss to obtain the final label sequence. This algorithm makes full use of context information, but it was applied to an English dataset, which differs greatly from Chinese corpora; moreover, when the input sequence is very long, the model struggles to learn a reasonable vector representation and does not distinguish among the pieces of context information, which limits the performance of the model and degrades its results.
Summary of the invention
The technical problem to be solved by the present invention is to propose a method for jointly extracting entity relations via sequence labelling based on an attention mechanism (ATT). Following the novel tagging scheme proposed by Zheng et al., a natural-language sentence is taken as the input sequence of a Seq2Seq model; words are converted into vector representations by an embedding layer and encoded with a bidirectional long short-term memory (LSTM) network, with relation tags added on the basis of the original tagging scheme. An LSTM is likewise used for decoding, with an attention mechanism added to its output, and the label sequence is finally produced by a softmax layer. The label sequence of the entire sentence can thus be obtained, making it convenient to extract entity relations by recognizing the sequence.
The present invention proposes a joint entity-relation extraction method oriented to massive Chinese sentence corpora. The corpus is first preprocessed (e.g. denoised) and then word-segmented, and each word is converted into a vector representation that serves as the input for encoding by a bidirectional LSTM. The bidirectional LSTM not only learns long- and short-term dependency information, but also processes the input layer in the forward and backward directions simultaneously, thereby learning past and future context, which is very useful for sentence sequence labelling. An attention mechanism is then introduced in the decoding layer, so that when the label sequence is generated the decoder has access to the hidden-layer information vectors of each character from the encoding stage and makes full use of the information carried by the input sequence. Finally, the entity-tag probability of each word is computed by softmax, from which the final label sequence is effectively obtained and entities are combined with their corresponding relations.
To achieve the above goals, the invention adopts the following technical scheme. To extract entity relations jointly from a label sequence, a Seq2Seq model combined with an attention mechanism realizes the conversion from the sentence sequence to the label sequence. An embedding layer first maps the high-dimensional vector representing a word in a high-dimensional space (whose dimension is usually the size of the dictionary) to a vector in a low-dimensional (tens of dimensions) continuous space; that is, the sentence is converted into vectors. The resulting word vectors are then encoded as the input of a bidirectional LSTM comprising two parallel LSTM layers, a forward layer and a backward layer, which perform the forward and backward computations respectively. Finally, the LSTM of the decoding layer converts the previously generated vectors into the label sequence, with an attention mechanism added: instead of attending only to a global semantic encoding vector, the generated labels use an "attention range" indicating which parts of the input sequence to attend to when producing the next label, and the next output is generated from the attended region. The decoded label sequence encodes the position of each word within an entity, the corresponding relation, and the word's role in that relation, and the entity relations are assembled from the label sequence. This yields a joint entity-relation extraction method based on sequence labelling and an LSTM attention mechanism.
The technical solution adopted by this method is a method for jointly extracting entity relations via sequence labelling based on an attention mechanism, comprising the following steps:
Step 1: obtain an open-domain entity-relation dataset and preprocess it. Preprocessing divides the dataset into a training set and a test set, both of which contain the sentences to be processed; the sentences are word-segmented so that each sentence is converted into individual words.
Step 2: convert each word of the preprocessed sentences into a vector representation through an embedding layer, and input it to the encoding layer, a bidirectional long short-term memory network, for encoding.
Step 3: decode the output of the encoding layer with a long short-term memory network to which an attention mechanism is added.
Step 4: output the entity-tag probabilities from the label prediction vectors through a softmax layer, then complete and combine the entities and relations to obtain triples.
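The entity-tag probability in step 4 is a standard softmax over the decoder's per-word tag scores. A minimal sketch follows; the tag inventory shown is a hypothetical illustration, not one fixed by the patent:

```python
import math

def softmax(logits):
    """Normalize raw tag scores into a probability distribution."""
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical tag scores for one word (tag names are illustrative only).
tags = ["O", "B-REL-1", "I-REL-1", "B-REL-2", "I-REL-2"]
probs = softmax([0.1, 2.3, 0.4, 0.2, 0.0])
best = tags[probs.index(max(probs))]         # predicted entity tag for this word
```

The tag with the highest probability becomes the word's entry in the final label sequence.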
Preferably, step 1 specifically comprises the following steps:
Step 1.1: obtain an open-domain entity-relation dataset and denoise all of its data, including removing useless whitespace characters and converting uppercase letters to lowercase;
Step 1.2: divide the dataset into a training set and a test set;
Step 1.3: build a user-defined dictionary (e.g. for long words and proper nouns) and segment the sentences with LTP, the natural language processing toolkit of Harbin Institute of Technology.
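Step 1.3 relies on the LTP toolkit, which is not reproduced here; the following forward-maximum-matching sketch only illustrates how a user-defined dictionary of long words and proper nouns guides segmentation, and is not the patent's actual segmenter:

```python
def fmm_segment(sentence, dictionary, max_len=5):
    """Greedy forward maximum matching: at each position, take the longest
    dictionary word; fall back to a single character if nothing matches."""
    words, i = [], 0
    while i < len(sentence):
        for L in range(min(max_len, len(sentence) - i), 0, -1):
            cand = sentence[i:i + L]
            if L == 1 or cand in dictionary:
                words.append(cand)
                i += L
                break
    return words

# Custom entries: a company name and a person name (proper nouns).
user_dict = {"苹果公司", "创始人", "乔布斯"}
print(fmm_segment("苹果公司的创始人是乔布斯", user_dict))
# prints ['苹果公司', '的', '创始人', '是', '乔布斯']
```

Without the custom dictionary, multi-character names would be split into single characters, which is why step 1.3 builds the dictionary before segmentation.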
Preferably, step 2 specifically comprises the following steps:
Step 2.1: train a Word2Vec word-vector model on a Wikipedia corpus; the dimension of the word vectors is 300;
Step 2.2: obtain the word vector corresponding to each word from the word-vector mapping matrix generated by Word2Vec; the word vectors of the entire sentence are denoted {w_1, w_2, ..., w_n}, where w_n is the vector representation of the n-th word;
Step 2.3: input the word vectors produced by the embedding layer to the bidirectional LSTM encoding layer, which comprises a forward LSTM layer, a backward LSTM layer and a connection layer;
Step 2.4: encode the context and semantic information with the bidirectional LSTM encoding layer; the forward LSTM runs from w_1 to w_n and the backward LSTM from w_n to w_1, and the hidden layers of the neurons output the encoding vector of the entire sentence, computed as shown in formulas (1)-(6).
i_t = δ(W_wi·w_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)   (1)
f_t = δ(W_wf·w_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)   (2)
z_t = tanh(W_wc·w_t + W_hc·h_{t-1} + b_c)   (3)
c_t = f_t·c_{t-1} + i_t·z_t   (4)
o_t = δ(W_wo·w_t + W_ho·h_{t-1} + W_co·c_t + b_o)   (5)
h_t = o_t·tanh(c_t)   (6)
In formulas (1), (2), (3) and (5), i, f, z and o are the input gate, forget gate, update gate and output gate respectively; c_t in formula (4) is the cell state at time t; h_t in formula (6) is the output at time t. W denotes the relevant parameters: W_wc and W_hc in formula (3) are the word-to-cell-state and output-to-cell-state parameters respectively, and W_wx, W_hx and W_cx in formulas (1), (2) and (5) are the word, output and cell-state parameters of gate x respectively. w_t denotes the t-th word, b denotes the bias term, and δ denotes the sigmoid activation function.
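Formulas (1)-(6) describe a peephole LSTM cell (the W_ci, W_cf and W_co terms connect the cell state to the gates). The scalar sketch below is illustrative only; parameter names mirror the formulas, while a real implementation uses vectors and weight matrices:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(w_t, h_prev, c_prev, P):
    """One peephole-LSTM step implementing formulas (1)-(6) in the scalar
    case. P maps parameter names such as 'Wwi' and 'bi' to floats."""
    i = sigmoid(P["Wwi"]*w_t + P["Whi"]*h_prev + P["Wci"]*c_prev + P["bi"])  # (1) input gate
    f = sigmoid(P["Wwf"]*w_t + P["Whf"]*h_prev + P["Wcf"]*c_prev + P["bf"])  # (2) forget gate
    z = math.tanh(P["Wwc"]*w_t + P["Whc"]*h_prev + P["bc"])                  # (3) update candidate
    c = f*c_prev + i*z                                                       # (4) new cell state
    o = sigmoid(P["Wwo"]*w_t + P["Who"]*h_prev + P["Wco"]*c + P["bo"])       # (5) output gate
    h = o * math.tanh(c)                                                     # (6) hidden output
    return h, c
```

The bidirectional encoder of step 2.4 runs two such recurrences, one from w_1 to w_n and one from w_n to w_1, and the connection layer joins their hidden outputs.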
Preferably, step 3 specifically comprises the following steps:
Step 3.1: input the output of the bidirectional LSTM encoder to the LSTM decoder;
Step 3.2: add the attention mechanism so that the model learns each context word's position within an entity and its role in a relation. The final decoding result is obtained from the result y_{t-1} predicted at the previous time step, the input s_t at the current time step, and the context label sequence cseq_t relevant to the current time step, as computed in formulas (7) and (8).
p(y_t | y_1, y_2, ..., y_{t-1}, cseq) = g(y_{t-1}, s_t, cseq_t)   (7)
cseq_t = Σ_{j=1}^{L_x} aseq_{tj}·h_j   (8)
In formula (8), L_x is the length of the sentence, aseq_{tj} is the attention-distribution coefficient of the label of the j-th word in the input sentence, h_j is the semantic encoding of the j-th word, and cseq_t is the context label sequence relevant to time t.
Step 3.3: output the decoded sequence.
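Formulas (7) and (8) follow the standard attention pattern: a distribution aseq_t over the encoder states h_j, whose weighted sum gives the context cseq_t. A minimal sketch follows, with the caveat that the patent does not spell out how the alignment scores are computed; a dot product between the decoder state and each encoder state is assumed here:

```python
import math

def attention_context(s_t, H):
    """Given decoder state s_t and encoder hidden states H (one vector per
    input word), return the attention coefficients aseq_t and the context
    vector cseq_t of formula (8)."""
    # Alignment scores: dot product s_t . h_j (an assumed scoring function).
    scores = [sum(a*b for a, b in zip(s_t, h)) for h in H]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]     # softmax over the scores
    total = sum(exps)
    aseq = [e / total for e in exps]             # coefficients aseq_tj
    cseq = [sum(a*h[k] for a, h in zip(aseq, H)) # weighted sum of h_j
            for k in range(len(H[0]))]
    return aseq, cseq
```

At each decoding step the decoder recomputes aseq_t, so different output labels attend to different parts of the input sentence.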
Compared with the prior art, the present invention has the following clear advantages:
When jointly extracting entity relations, the present invention converts the extraction task into a sequence-labelling problem, using a Seq2Seq model based on a bidirectional LSTM with a word-based attention mechanism added, and labels both entities and relations. Compared with other methods, such joint extraction avoids the error accumulation caused by first extracting entities and then extracting relations, deepens the inner connection between entities and relations, and learns long- and short-term dependency information without complex feature engineering. In addition, introducing the attention mechanism in the decoding layer reduces the computational burden of processing high-dimensional input sequences: by selecting a structured subset of the input it reduces the data dimensionality, while letting the system focus on the significant, useful information in the input sequence that is relevant to the current output, thereby improving output quality. Adding the attention mechanism to the decoding process of the sequence-labelling task also effectively improves labelling accuracy and overall efficiency. In summary, the joint entity-relation extraction method based on sequence labelling and an attention mechanism proposed herein is oriented to massive Chinese corpora and has the advantages of tight entity-relation coupling and high accuracy.
Detailed description of the invention
Fig. 1 is a flowchart of the method of the present invention.
Fig. 2 is the structure of the Seq2Seq model for the sequence-labelling task in the present invention.
Fig. 3 is the structure of the attention model used by the decoder of the present invention.
Fig. 4 is an example of a final sentence label sequence produced by the present invention.
Specific embodiment
The present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The hardware used by the present invention is one PC.
As shown in Fig. 1, the present invention provides a method for jointly extracting entity relations via sequence labelling based on an attention mechanism, specifically comprising the following steps:
Step 1: obtain an open-domain entity-relation dataset and preprocess it. Preprocessing divides the dataset into a training set and a test set, both of which contain the sentences to be processed; the sentences are word-segmented so that each sentence is converted into individual words.
Step 1.1: obtain an open-domain entity-relation dataset and denoise the data, including removing useless whitespace characters and converting uppercase letters to lowercase;
Step 1.2: divide the dataset into a training set and a test set;
Step 1.3: build a user-defined dictionary (e.g. for long words and proper nouns) and segment the sentences with LTP, the natural language processing toolkit of Harbin Institute of Technology.
As shown in Fig. 2, after preprocessing, the Chinese sentence "The founder of Apple Inc. is Steve Jobs" is segmented into the words "Apple Inc. / of / founder / is / Steve Jobs".
Step 2: convert each word in the preprocessed sentence into a vector representation through the embedding layer and input it to the bidirectional LSTM for encoding; the structure of the whole Seq2Seq model is shown in Fig. 2.
Step 2.1: train the Word2Vec word-vector model on a Wikipedia corpus, with word-vector dimension 300, and feed the segmentation result obtained in step 1 into the trained word-vector model;
Step 2.2: obtain the word vector corresponding to each word from the word-vector mapping matrix generated by Word2Vec; the word vectors can be denoted {w_1, w_2, ..., w_n};
Step 2.3: input the word vectors produced by the embedding layer to the bidirectional LSTM encoding layer, which comprises a forward LSTM layer, a backward LSTM layer and a connection layer;
Step 2.4: encode the context and semantic information with the bidirectional LSTM encoding layer; the forward LSTM runs from w_1 to w_n and the backward LSTM from w_n to w_1, and the hidden layers of the neurons output the encoding vector of the entire sentence, computed as shown in formulas (1)-(6).
i_t = δ(W_wi·w_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)   (1)
f_t = δ(W_wf·w_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)   (2)
z_t = tanh(W_wc·w_t + W_hc·h_{t-1} + b_c)   (3)
c_t = f_t·c_{t-1} + i_t·z_t   (4)
o_t = δ(W_wo·w_t + W_ho·h_{t-1} + W_co·c_t + b_o)   (5)
h_t = o_t·tanh(c_t)   (6)
In formulas (1), (2), (3) and (5), i, f, z and o are the input gate, forget gate, update gate and output gate respectively; c_t in formula (4) is the cell state at time t; h_t in formula (6) is the output at time t. W denotes the relevant parameters: W_wc and W_hc in formula (3) are the word-to-cell-state and output-to-cell-state parameters respectively, and W_wx, W_hx and W_cx in formulas (1), (2) and (5) are the word, output and cell-state parameters of gate x respectively. w_t denotes the t-th word, b denotes the bias term, and δ denotes the sigmoid activation function.
Step 3: decode the vectors obtained from the encoding layer with an LSTM to which an attention mechanism is added.
Step 3.1: input the output of the bidirectional LSTM encoder to the LSTM decoder;
As shown in Fig. 3, the context information at labelling time can be imagined as a series of <Key, Value> pairs. Given a word to be labelled as the Query, the weight coefficient of each Key's corresponding Value is obtained by computing the similarity or correlation between the Query and each Key; the Values are then weighted and summed to obtain the final attention value.
Step 3.2: add the attention mechanism so that the model learns each context word's position within an entity and its role in a relation. The final decoding result is obtained from the result y_{t-1} predicted at the previous time step, the input s_t at the current time step, and the context label sequence cseq_t relevant to the current time step, as computed in formulas (7) and (8).
p(y_t | y_1, y_2, ..., y_{t-1}, cseq) = g(y_{t-1}, s_t, cseq_t)   (7)
cseq_t = Σ_{j=1}^{L_x} aseq_{tj}·h_j   (8)
In formula (8), L_x is the length of the sentence, aseq_{tj} is the attention-distribution coefficient of the label of the j-th word in the input sentence, h_j is the semantic encoding of the j-th word, and cseq_t is the context label sequence relevant to time t.
Step 3.3: output the decoded sequence.
Step 4: output the entity-tag probabilities from the label prediction vectors through the softmax layer.
As shown in Fig. 4, the Seq2Seq model generates the label sequence of a sentence; the entities and relations are completed and combined according to the labels, finally yielding entity-relation triples.
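The final assembly of triples from a label sequence can be sketched as follows. The "B/I-relation-role" tag format is an assumption modelled on Zheng et al.'s tagging scheme, and the relation name "CF" (Company-Founder) is hypothetical; the patent does not fix the exact tag names:

```python
def tags_to_triples(words, tags):
    """Collect entities from B-/I- tags, then pair role 1 (head) with
    role 2 (tail) for each relation to form (head, relation, tail) triples."""
    entities = {}                 # (relation, role) -> entity surface string
    cur_key, cur_words = None, []
    for w, t in list(zip(words, tags)) + [("", "O")]:   # sentinel flushes the last entity
        if t.startswith("B-"):
            if cur_key:
                entities[cur_key] = "".join(cur_words)
            _, rel, role = t.split("-")
            cur_key, cur_words = (rel, role), [w]
        elif t.startswith("I-") and cur_key:
            cur_words.append(w)
        else:
            if cur_key:
                entities[cur_key] = "".join(cur_words)
            cur_key, cur_words = None, []
    rels = {rel for rel, _ in entities}
    return [(entities[(r, "1")], r, entities[(r, "2")])
            for r in rels if (r, "1") in entities and (r, "2") in entities]

words = ["苹果公司", "的", "创始人", "是", "乔布斯"]
tags  = ["B-CF-1", "O", "O", "O", "B-CF-2"]   # CF: a hypothetical Company-Founder relation
print(tags_to_triples(words, tags))
# prints [('苹果公司', 'CF', '乔布斯')]
```

This mirrors the Fig. 4 example: the head and tail entities of the same relation label are combined into one triple.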
The above embodiments are only exemplary embodiments of the present invention and are not intended to limit it; the scope of protection of the present invention is defined by the claims. Those skilled in the art may make various modifications or equivalent replacements within the spirit and scope of the present invention, and such modifications or equivalent replacements shall also be regarded as falling within the scope of the present invention.

Claims (4)

1. A method for jointly extracting entity relations via sequence labelling based on an attention mechanism, characterized in that the method comprises the following steps:
Step 1: obtain an open-domain entity-relation dataset and preprocess it; preprocessing divides the dataset into a training set and a test set, both of which contain the sentences to be processed, and the sentences are word-segmented so that each sentence is converted into individual words;
Step 2: convert each word of the preprocessed sentences into a vector representation through an embedding layer, and input it to the encoding layer, a bidirectional long short-term memory network, for encoding;
Step 3: decode the output of the encoding layer with a long short-term memory network to which an attention mechanism is added;
Step 4: output the entity-tag probabilities from the label prediction vectors through a softmax layer, then complete and combine the entities and relations to obtain triples.
2. The method for jointly extracting entity relations via sequence labelling based on an attention mechanism according to claim 1, characterized in that step 1 specifically comprises the following steps:
Step 1.1: obtain an open-domain entity-relation dataset and denoise all of its data, including removing useless whitespace characters and converting uppercase letters to lowercase;
Step 1.2: divide the dataset into a training set and a test set;
Step 1.3: build a user-defined dictionary (e.g. for long words and proper nouns) and segment the sentences with LTP, the natural language processing toolkit of Harbin Institute of Technology.
3. The method for jointly extracting entity relations via sequence labelling based on an attention mechanism according to claim 1, characterized in that step 2 specifically comprises the following steps:
Step 2.1: train a Word2Vec word-vector model on a Wikipedia corpus; the dimension of the word vectors is 300;
Step 2.2: obtain the word vector corresponding to each word from the word-vector mapping matrix generated by Word2Vec; the word vectors of the entire sentence are denoted {w_1, w_2, ..., w_n}, where w_n is the vector representation of the n-th word;
Step 2.3: input the word vectors produced by the embedding layer to the bidirectional LSTM encoding layer, which comprises a forward LSTM layer, a backward LSTM layer and a connection layer;
Step 2.4: encode the context and semantic information with the bidirectional LSTM encoding layer; the forward LSTM runs from w_1 to w_n and the backward LSTM from w_n to w_1, and the hidden layers of the neurons output the encoding vector of the entire sentence, computed as shown in formulas (1)-(6);
i_t = δ(W_wi·w_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)   (1)
f_t = δ(W_wf·w_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)   (2)
z_t = tanh(W_wc·w_t + W_hc·h_{t-1} + b_c)   (3)
c_t = f_t·c_{t-1} + i_t·z_t   (4)
o_t = δ(W_wo·w_t + W_ho·h_{t-1} + W_co·c_t + b_o)   (5)
h_t = o_t·tanh(c_t)   (6)
In formulas (1), (2), (3) and (5), i, f, z and o are the input gate, forget gate, update gate and output gate respectively; c_t in formula (4) is the cell state at time t; h_t in formula (6) is the output at time t. W denotes the relevant parameters: W_wc and W_hc in formula (3) are the word-to-cell-state and output-to-cell-state parameters respectively, and W_wx, W_hx and W_cx in formulas (1), (2) and (5) are the word, output and cell-state parameters of gate x respectively. w_t denotes the t-th word, b denotes the bias term, and δ denotes the sigmoid activation function.
4. The method for jointly extracting entity relations via sequence labelling based on an attention mechanism according to claim 1, characterized in that step 3 specifically comprises the following steps:
Step 3.1: input the output of the bidirectional LSTM encoder to the LSTM decoder;
Step 3.2: add the attention mechanism so that the model learns each context word's position within an entity and its role in a relation; the final decoding result is obtained from the result y_{t-1} predicted at the previous time step, the input s_t at the current time step, and the context label sequence cseq_t relevant to the current time step, as computed in formulas (7) and (8);
p(y_t | y_1, y_2, ..., y_{t-1}, cseq) = g(y_{t-1}, s_t, cseq_t)   (7)
cseq_t = Σ_{j=1}^{L_x} aseq_{tj}·h_j   (8)
In formula (8), L_x is the length of the sentence, aseq_{tj} is the attention-distribution coefficient of the label of the j-th word in the input sentence, h_j is the semantic encoding of the j-th word, and cseq_t is the context label sequence relevant to time t;
Step 3.3: output the decoded sequence.
CN201811157788.1A 2018-09-30 2018-09-30 A method for joint extraction of entity relations via sequence labelling based on an attention mechanism Pending CN109408812A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811157788.1A CN109408812A (en) A method for joint extraction of entity relations via sequence labelling based on an attention mechanism


Publications (1)

Publication Number Publication Date
CN109408812A true CN109408812A (en) 2019-03-01

Family

ID=65466650

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811157788.1A Pending CN109408812A (en) A method for joint extraction of entity relations via sequence labelling based on an attention mechanism

Country Status (1)

Country Link
CN (1) CN109408812A (en)

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871541A (en) * 2019-03-06 2019-06-11 电子科技大学 It is a kind of suitable for multilingual multi-field name entity recognition method
CN109933806A (en) * 2019-04-01 2019-06-25 长沙理工大学 A kind of repetition generation method, system, equipment and computer readable storage medium
CN109977861A (en) * 2019-03-25 2019-07-05 中国科学技术大学 Offline handwritten form method for identifying mathematical formula
CN109992785A (en) * 2019-04-09 2019-07-09 腾讯科技(深圳)有限公司 Content calculation method, device and equipment based on machine learning
CN110110061A (en) * 2019-04-26 2019-08-09 同济大学 Low-resource languages entity abstracting method based on bilingual term vector
CN110196913A (en) * 2019-05-23 2019-09-03 北京邮电大学 Multiple entity relationship joint abstracting method and device based on text generation formula
CN110209836A (en) * 2019-05-17 2019-09-06 北京邮电大学 Remote supervisory Relation extraction method and device
CN110334219A (en) * 2019-07-12 2019-10-15 电子科技大学 The knowledge mapping for incorporating text semantic feature based on attention mechanism indicates learning method
CN110334339A (en) * 2019-04-30 2019-10-15 华中科技大学 It is a kind of based on location aware from the sequence labelling model and mask method of attention mechanism
CN110348021A (en) * 2019-07-17 2019-10-18 湖北亿咖通科技有限公司 Character string identification method, electronic equipment, storage medium based on name physical model
CN110348019A (en) * 2019-07-17 2019-10-18 南通大学 A kind of medical bodies vector method for transformation based on attention mechanism
CN110390021A (en) * 2019-06-13 2019-10-29 平安科技(深圳)有限公司 Drug knowledge mapping construction method, device, computer equipment and storage medium
CN110399433A (en) * 2019-07-23 2019-11-01 福建奇点时空数字科技有限公司 A kind of data entity Relation extraction method based on deep learning
CN110427605A (en) * 2019-05-09 2019-11-08 苏州大学 The Ellipsis recovering method understood towards short text
CN110444259A (en) * 2019-06-06 2019-11-12 昆明理工大学 Traditional Chinese medical electronic case history entity relationship extracting method based on entity relationship mark strategy
CN110502749A (en) * 2019-08-02 2019-11-26 中国电子科技集团公司第二十八研究所 A kind of text Relation extraction method based on the double-deck attention mechanism Yu two-way GRU
CN110570920A (en) * 2019-08-20 2019-12-13 华东理工大学 Entity and relationship joint learning method based on attention focusing model
CN110597948A (en) * 2019-07-11 2019-12-20 东华大学 Entity relation extraction method based on deep learning
CN110598001A (en) * 2019-08-05 2019-12-20 平安科技(深圳)有限公司 Method, device and storage medium for extracting association entity relationship
CN110705301A (en) * 2019-09-30 2020-01-17 京东城市(北京)数字科技有限公司 Entity relationship extraction method and device, storage medium and electronic equipment
CN110765774A (en) * 2019-10-08 2020-02-07 北京三快在线科技有限公司 Training method and device of information extraction model and information extraction method and device
CN110851620A (en) * 2019-10-29 2020-02-28 天津大学 Knowledge representation method based on combination of text embedding and structure embedding
CN111008279A (en) * 2019-11-27 2020-04-14 云知声智能科技股份有限公司 Entity relationship extraction method and device
CN111008283A (en) * 2019-10-31 2020-04-14 中电药明数据科技(成都)有限公司 Sequence labeling method and system based on composite boundary information
CN111046668A (en) * 2019-12-04 2020-04-21 北京信息科技大学 Method and device for recognizing named entities of multi-modal cultural relic data
CN111079433A (en) * 2019-11-29 2020-04-28 北京奇艺世纪科技有限公司 Event extraction method and device and electronic equipment
CN111090724A (en) * 2019-11-21 2020-05-01 新华智云科技有限公司 Entity extraction method capable of judging relevance between text content and entity based on deep learning
CN111125367A (en) * 2019-12-26 2020-05-08 华南理工大学 Multi-character relation extraction method based on multi-level attention mechanism
CN111143691A (en) * 2019-12-31 2020-05-12 四川长虹电器股份有限公司 Joint information extraction method and device
CN111241209A (en) * 2020-01-03 2020-06-05 北京百度网讯科技有限公司 Method and apparatus for generating information
CN111275094A (en) * 2020-01-17 2020-06-12 厦门快商通科技股份有限公司 Data labeling method, device and equipment based on machine learning
CN111274814A (en) * 2019-12-26 2020-06-12 浙江大学 Novel semi-supervised text entity information extraction method
CN111274815A (en) * 2020-01-15 2020-06-12 北京百度网讯科技有限公司 Method and device for mining entity attention points in text
CN111310472A (en) * 2020-01-19 2020-06-19 合肥讯飞数码科技有限公司 Alias generation method, device and equipment
CN111368528A (en) * 2020-03-09 2020-07-03 西南交通大学 Entity relation joint extraction method for medical texts
CN111460807A (en) * 2020-03-13 2020-07-28 平安科技(深圳)有限公司 Sequence labeling method and device, computer equipment and storage medium
CN111488728A (en) * 2020-03-12 2020-08-04 天闻数媒科技(北京)有限公司 Labeling method, device and storage medium for unstructured test question data
CN111640471A (en) * 2020-05-27 2020-09-08 牛张明 Method and system for predicting activity of small-molecule drugs based on a bidirectional long short-term memory model
CN111667238A (en) * 2020-05-26 2020-09-15 南开大学 Recruitment notice generation method based on skill-aware multi-attention mechanism
CN111767409A (en) * 2020-06-14 2020-10-13 南开大学 Entity relationship extraction method based on multi-head self-attention mechanism
WO2020220636A1 (en) * 2019-04-28 2020-11-05 平安科技(深圳)有限公司 Text data enhancement method and apparatus, electronic device, and non-volatile computer-readable storage medium
CN111914091A (en) * 2019-05-07 2020-11-10 四川大学 Entity and relation combined extraction method based on reinforcement learning
CN111950278A (en) * 2019-05-14 2020-11-17 株式会社理光 Sequence labeling method and device and computer readable storage medium
CN112015859A (en) * 2019-05-31 2020-12-01 百度在线网络技术(北京)有限公司 Text knowledge hierarchy extraction method and device, computer equipment and readable medium
CN112163092A (en) * 2020-10-10 2021-01-01 成都数之联科技有限公司 Entity and relation extraction method, system, device and medium
CN112329465A (en) * 2019-07-18 2021-02-05 株式会社理光 Named entity identification method and device and computer readable storage medium
CN112597757A (en) * 2020-12-04 2021-04-02 光大科技有限公司 Word detection method and device, storage medium and electronic device
CN112612871A (en) * 2020-12-17 2021-04-06 浙江大学 Multi-event detection method based on sequence generation model
CN112667820A (en) * 2020-12-08 2021-04-16 吉林省吉科软信息技术有限公司 Deep learning construction method for a full-process traceable ecological chain supervision knowledge graph
CN112765991A (en) * 2021-01-14 2021-05-07 中山大学 Deep dialogue semantic role labeling method and system based on knowledge enhancement
CN112784597A (en) * 2019-11-06 2021-05-11 阿里巴巴集团控股有限公司 Method and device for evaluating quality of article
CN112818124A (en) * 2021-02-21 2021-05-18 昆明理工大学 Entity relationship extraction method based on attention neural network
CN112836482A (en) * 2021-02-09 2021-05-25 浙江工商大学 Method and device for generating questions with a template-based sequence generation model
CN113033189A (en) * 2021-04-08 2021-06-25 北京理工大学 Semantic coding method of long-short term memory network based on attention dispersion
CN113191118A (en) * 2021-05-08 2021-07-30 山东省计算中心(国家超级计算济南中心) Text relation extraction method based on sequence labeling
CN113553850A (en) * 2021-03-30 2021-10-26 电子科技大学 Entity relation extraction method based on ordered structure encoding pointer network decoding
CN113642767A (en) * 2021-07-09 2021-11-12 武汉科技大学 Multi-dimensional feature combination prediction method based on MI-VMD-DA-EDLSTM-VEC
CN113807079A (en) * 2020-06-11 2021-12-17 四川大学 End-to-end entity and relation combined extraction method based on sequence-to-sequence
CN114580408A (en) * 2022-03-10 2022-06-03 浙江理工大学 Method and device for generating the second line of a couplet based on double-layer attention joint learning
CN114625871A (en) * 2020-12-14 2022-06-14 四川大学 Triple classification method based on attention position joint coding
CN115019893A (en) * 2022-06-14 2022-09-06 邵阳学院 Enhancer identification method based on bidirectional long short-term memory and attention mechanism
CN115510869A (en) * 2022-05-30 2022-12-23 青海师范大学 End-to-end Tibetan La lattice shallow semantic analysis method
CN116227597A (en) * 2023-05-05 2023-06-06 中国人民解放军国防科技大学 Biomedical knowledge extraction method, device, computer equipment and storage medium
CN117235286A (en) * 2023-11-10 2023-12-15 昆明理工大学 Attention-strengthening entity relation extraction model, construction method thereof and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018072563A1 (en) * 2016-10-18 2018-04-26 中兴通讯股份有限公司 Knowledge graph creation method, device, and system
CN108280062A (en) * 2018-01-19 2018-07-13 北京邮电大学 Deep-learning-based entity and entity-relation recognition method and device
CN108304911A (en) * 2018-01-09 2018-07-20 中国科学院自动化研究所 Knowledge extraction method, system and equipment based on memory neural networks
CN108460013A (en) * 2018-01-30 2018-08-28 大连理工大学 Sequence labeling model based on a fine-grained word representation model


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhang Xiaobin et al., "Entity relation extraction based on the fusion of CNN and bidirectional LSTM", Chinese Journal of Network and Information Security *
Li Fenglin et al., "Research progress on entity relation extraction based on deep learning frameworks", Information Science *
Wang Hong et al., "Semantic relation extraction based on attention-mechanism LSTM", Application Research of Computers *

Cited By (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871541A (en) * 2019-03-06 2019-06-11 电子科技大学 Named entity recognition method suitable for multiple languages and domains
CN109977861A (en) * 2019-03-25 2019-07-05 中国科学技术大学 Offline handwritten mathematical formula recognition method
CN109933806A (en) * 2019-04-01 2019-06-25 长沙理工大学 Paraphrase generation method, system, device, and computer-readable storage medium
CN109933806B (en) * 2019-04-01 2024-01-30 长沙理工大学 Paraphrase generation method, system, device, and computer-readable storage medium
CN109992785A (en) * 2019-04-09 2019-07-09 腾讯科技(深圳)有限公司 Content calculation method, device and equipment based on machine learning
CN109992785B (en) * 2019-04-09 2023-07-25 腾讯科技(深圳)有限公司 Content calculation method, device and equipment based on machine learning
CN110110061A (en) * 2019-04-26 2019-08-09 同济大学 Low-resource language entity extraction method based on bilingual word vectors
CN110110061B (en) * 2019-04-26 2023-04-18 同济大学 Low-resource language entity extraction method based on bilingual word vectors
WO2020220636A1 (en) * 2019-04-28 2020-11-05 平安科技(深圳)有限公司 Text data enhancement method and apparatus, electronic device, and non-volatile computer-readable storage medium
CN110334339A (en) * 2019-04-30 2019-10-15 华中科技大学 Sequence labeling model and labeling method based on location-aware self-attention mechanism
CN111914091A (en) * 2019-05-07 2020-11-10 四川大学 Entity and relation combined extraction method based on reinforcement learning
CN111914091B (en) * 2019-05-07 2022-10-14 四川大学 Entity and relation combined extraction method based on reinforcement learning
CN110427605A (en) * 2019-05-09 2019-11-08 苏州大学 Ellipsis recovery method for short text understanding
CN111950278A (en) * 2019-05-14 2020-11-17 株式会社理光 Sequence labeling method and device and computer readable storage medium
CN110209836B (en) * 2019-05-17 2022-04-26 北京邮电大学 Remote supervision relation extraction method and device
CN110209836A (en) * 2019-05-17 2019-09-06 北京邮电大学 Remote supervision relation extraction method and device
CN110196913A (en) * 2019-05-23 2019-09-03 北京邮电大学 Generative-text-based joint extraction method and device for multiple entity relations
CN112015859B (en) * 2019-05-31 2023-08-18 百度在线网络技术(北京)有限公司 Knowledge hierarchy extraction method and device for text, computer equipment and readable medium
CN112015859A (en) * 2019-05-31 2020-12-01 百度在线网络技术(北京)有限公司 Text knowledge hierarchy extraction method and device, computer equipment and readable medium
CN110444259B (en) * 2019-06-06 2022-09-23 昆明理工大学 Entity relation extracting method of traditional Chinese medicine electronic medical record based on entity relation labeling strategy
CN110444259A (en) * 2019-06-06 2019-11-12 昆明理工大学 Entity relation extraction method for traditional Chinese medicine electronic medical records based on an entity relation labeling strategy
CN110390021A (en) * 2019-06-13 2019-10-29 平安科技(深圳)有限公司 Drug knowledge graph construction method, device, computer equipment and storage medium
CN110597948A (en) * 2019-07-11 2019-12-20 东华大学 Entity relation extraction method based on deep learning
CN110334219A (en) * 2019-07-12 2019-10-15 电子科技大学 Attention-based knowledge graph representation learning method incorporating textual semantic features
CN110348021A (en) * 2019-07-17 2019-10-18 湖北亿咖通科技有限公司 Character string recognition method based on named entity model, electronic device, and storage medium
CN110348019A (en) * 2019-07-17 2019-10-18 南通大学 Medical entity vector transformation method based on attention mechanism
CN110348021B (en) * 2019-07-17 2021-05-18 湖北亿咖通科技有限公司 Character string recognition method based on named entity model, electronic device and storage medium
CN112329465A (en) * 2019-07-18 2021-02-05 株式会社理光 Named entity identification method and device and computer readable storage medium
CN110399433A (en) * 2019-07-23 2019-11-01 福建奇点时空数字科技有限公司 Data entity relation extraction method based on deep learning
CN110502749A (en) * 2019-08-02 2019-11-26 中国电子科技集团公司第二十八研究所 Text relation extraction method based on double-layer attention mechanism and bidirectional GRU
CN110502749B (en) * 2019-08-02 2023-10-03 中国电子科技集团公司第二十八研究所 Text relation extraction method based on double-layer attention mechanism and bidirectional GRU
CN110598001A (en) * 2019-08-05 2019-12-20 平安科技(深圳)有限公司 Method, device and storage medium for extracting associated entity relations
CN110570920B (en) * 2019-08-20 2023-07-14 华东理工大学 Entity and relationship joint learning method based on attention focusing model
CN110570920A (en) * 2019-08-20 2019-12-13 华东理工大学 Entity and relationship joint learning method based on attention focusing model
CN110705301A (en) * 2019-09-30 2020-01-17 京东城市(北京)数字科技有限公司 Entity relationship extraction method and device, storage medium and electronic equipment
CN110765774A (en) * 2019-10-08 2020-02-07 北京三快在线科技有限公司 Training method and device of information extraction model and information extraction method and device
CN110765774B (en) * 2019-10-08 2021-09-17 北京三快在线科技有限公司 Training method and device of information extraction model and information extraction method and device
CN110851620A (en) * 2019-10-29 2020-02-28 天津大学 Knowledge representation method based on combination of text embedding and structure embedding
CN111008283B (en) * 2019-10-31 2023-06-20 中电药明数据科技(成都)有限公司 Sequence labeling method and system based on composite boundary information
CN111008283A (en) * 2019-10-31 2020-04-14 中电药明数据科技(成都)有限公司 Sequence labeling method and system based on composite boundary information
CN112784597A (en) * 2019-11-06 2021-05-11 阿里巴巴集团控股有限公司 Method and device for evaluating quality of article
CN111090724A (en) * 2019-11-21 2020-05-01 新华智云科技有限公司 Entity extraction method capable of judging relevance between text content and entity based on deep learning
CN111008279A (en) * 2019-11-27 2020-04-14 云知声智能科技股份有限公司 Entity relationship extraction method and device
CN111008279B (en) * 2019-11-27 2023-11-14 云知声智能科技股份有限公司 Entity relation extraction method and device
CN111079433A (en) * 2019-11-29 2020-04-28 北京奇艺世纪科技有限公司 Event extraction method and device and electronic equipment
CN111079433B (en) * 2019-11-29 2023-10-27 北京奇艺世纪科技有限公司 Event extraction method and device and electronic equipment
CN111046668A (en) * 2019-12-04 2020-04-21 北京信息科技大学 Method and device for recognizing named entities of multi-modal cultural relic data
CN111046668B (en) * 2019-12-04 2023-09-22 北京信息科技大学 Named entity identification method and device for multi-mode cultural relic data
CN111125367B (en) * 2019-12-26 2023-05-23 华南理工大学 Multi-character relation extraction method based on multi-level attention mechanism
CN111274814A (en) * 2019-12-26 2020-06-12 浙江大学 Novel semi-supervised text entity information extraction method
CN111274814B (en) * 2019-12-26 2021-09-24 浙江大学 Novel semi-supervised text entity information extraction method
CN111125367A (en) * 2019-12-26 2020-05-08 华南理工大学 Multi-character relation extraction method based on multi-level attention mechanism
CN111143691A (en) * 2019-12-31 2020-05-12 四川长虹电器股份有限公司 Joint information extraction method and device
CN111241209A (en) * 2020-01-03 2020-06-05 北京百度网讯科技有限公司 Method and apparatus for generating information
JP7112536B2 (en) 2020-01-15 2022-08-03 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Method and apparatus for mining entity attention points in text, electronic device, computer-readable storage medium and computer program
JP2021111413A (en) * 2020-01-15 2021-08-02 ベイジン バイドゥ ネットコム サイエンス アンド テクノロジー カンパニー リミテッド Method and apparatus for mining entity focus in text, electronic device, computer-readable storage medium, and computer program
US11775761B2 (en) 2020-01-15 2023-10-03 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for mining entity focus in text
CN111274815A (en) * 2020-01-15 2020-06-12 北京百度网讯科技有限公司 Method and device for mining entity attention points in text
CN111274815B (en) * 2020-01-15 2024-04-12 北京百度网讯科技有限公司 Method and device for mining entity focus point in text
CN111275094A (en) * 2020-01-17 2020-06-12 厦门快商通科技股份有限公司 Data labeling method, device and equipment based on machine learning
CN111310472B (en) * 2020-01-19 2024-02-09 合肥讯飞数码科技有限公司 Alias generation method, device and equipment
CN111310472A (en) * 2020-01-19 2020-06-19 合肥讯飞数码科技有限公司 Alias generation method, device and equipment
CN111368528B (en) * 2020-03-09 2022-07-08 西南交通大学 Entity relation joint extraction method for medical texts
CN111368528A (en) * 2020-03-09 2020-07-03 西南交通大学 Entity relation joint extraction method for medical texts
CN111488728A (en) * 2020-03-12 2020-08-04 天闻数媒科技(北京)有限公司 Labeling method, device and storage medium for unstructured test question data
CN111460807A (en) * 2020-03-13 2020-07-28 平安科技(深圳)有限公司 Sequence labeling method and device, computer equipment and storage medium
CN111460807B (en) * 2020-03-13 2024-03-12 平安科技(深圳)有限公司 Sequence labeling method, device, computer equipment and storage medium
CN111667238B (en) * 2020-05-26 2022-07-29 南开大学 Recruitment notice generation method based on skill-aware multi-attention mechanism
CN111667238A (en) * 2020-05-26 2020-09-15 南开大学 Recruitment notice generation method based on skill-aware multi-attention mechanism
CN111640471A (en) * 2020-05-27 2020-09-08 牛张明 Method and system for predicting activity of small-molecule drugs based on a bidirectional long short-term memory model
CN113807079A (en) * 2020-06-11 2021-12-17 四川大学 End-to-end entity and relation combined extraction method based on sequence-to-sequence
CN113807079B (en) * 2020-06-11 2023-06-23 四川大学 Sequence-to-sequence-based end-to-end entity and relationship joint extraction method
CN111767409A (en) * 2020-06-14 2020-10-13 南开大学 Entity relationship extraction method based on multi-head self-attention mechanism
CN112163092A (en) * 2020-10-10 2021-01-01 成都数之联科技有限公司 Entity and relation extraction method, system, device and medium
CN112163092B (en) * 2020-10-10 2022-07-12 成都数之联科技股份有限公司 Entity and relation extraction method, system, device and medium
CN112597757A (en) * 2020-12-04 2021-04-02 光大科技有限公司 Word detection method and device, storage medium and electronic device
CN112667820B (en) * 2020-12-08 2023-04-18 吉林省吉科软信息技术有限公司 Deep learning construction method for a full-process traceable ecological chain supervision knowledge graph
CN112667820A (en) * 2020-12-08 2021-04-16 吉林省吉科软信息技术有限公司 Deep learning construction method for a full-process traceable ecological chain supervision knowledge graph
CN114625871A (en) * 2020-12-14 2022-06-14 四川大学 Triple classification method based on attention position joint coding
CN114625871B (en) * 2020-12-14 2023-06-23 四川大学 Triple classification method based on attention position joint coding
CN112612871A (en) * 2020-12-17 2021-04-06 浙江大学 Multi-event detection method based on sequence generation model
CN112612871B (en) * 2020-12-17 2023-09-15 浙江大学 Multi-event detection method based on sequence generation model
CN112765991A (en) * 2021-01-14 2021-05-07 中山大学 Deep dialogue semantic role labeling method and system based on knowledge enhancement
CN112765991B (en) * 2021-01-14 2023-10-03 中山大学 Knowledge enhancement-based deep dialogue semantic role labeling method and system
CN112836482B (en) * 2021-02-09 2024-02-23 浙江工商大学 Method and device for generating questions with a template-based sequence generation model
CN112836482A (en) * 2021-02-09 2021-05-25 浙江工商大学 Method and device for generating questions with a template-based sequence generation model
CN112818124A (en) * 2021-02-21 2021-05-18 昆明理工大学 Entity relationship extraction method based on attention neural network
CN113553850A (en) * 2021-03-30 2021-10-26 电子科技大学 Entity relation extraction method based on ordered structure encoding pointer network decoding
CN113033189A (en) * 2021-04-08 2021-06-25 北京理工大学 Semantic coding method of long-short term memory network based on attention dispersion
CN113191118B (en) * 2021-05-08 2023-07-18 山东省计算中心(国家超级计算济南中心) Text relation extraction method based on sequence annotation
CN113191118A (en) * 2021-05-08 2021-07-30 山东省计算中心(国家超级计算济南中心) Text relation extraction method based on sequence labeling
CN113642767A (en) * 2021-07-09 2021-11-12 武汉科技大学 Multi-dimensional feature combination prediction method based on MI-VMD-DA-EDLSTM-VEC
CN114580408B (en) * 2022-03-10 2024-05-07 浙江理工大学 Method and device for generating the second line of a couplet based on double-layer attention joint learning
CN114580408A (en) * 2022-03-10 2022-06-03 浙江理工大学 Method and device for generating the second line of a couplet based on double-layer attention joint learning
CN115510869A (en) * 2022-05-30 2022-12-23 青海师范大学 End-to-end Tibetan La lattice shallow semantic analysis method
CN115510869B (en) * 2022-05-30 2023-08-01 青海师范大学 End-to-end Tibetan Lager shallow semantic analysis method
CN115019893A (en) * 2022-06-14 2022-09-06 邵阳学院 Enhancer identification method based on bidirectional long short-term memory and attention mechanism
CN116227597A (en) * 2023-05-05 2023-06-06 中国人民解放军国防科技大学 Biomedical knowledge extraction method, device, computer equipment and storage medium
CN117235286A (en) * 2023-11-10 2023-12-15 昆明理工大学 Attention-strengthening entity relation extraction model, construction method thereof and storage medium
CN117235286B (en) * 2023-11-10 2024-01-23 昆明理工大学 Attention-strengthening entity relation extraction model, construction method thereof and storage medium

Similar Documents

Publication Publication Date Title
CN109408812A (en) Attention-mechanism-based sequence labeling method for joint extraction of entity relations
CN108460013B (en) Sequence labeling model and method based on fine-grained word representation model
US11194972B1 (en) Semantic sentiment analysis method fusing in-depth features and time sequence models
CN109657239B (en) Chinese named entity recognition method based on attention mechanism and language model learning
CN111241294B (en) Graph convolutional network relation extraction method based on dependency analysis and keywords
CN107526834B (en) Improved word2vec method for jointly training part-of-speech and word-order correlation factors
CN109003601A (en) Cross-lingual end-to-end speech recognition method for the low-resource Tujia language
CN109800411A (en) Clinical treatment entity and its attribute extraction method
CN111666758B (en) Chinese word segmentation method, training device and computer readable storage medium
CN112784051A (en) Patent term extraction method
CN110083710A (en) Word definition generation method based on recurrent neural networks and latent variable structure
CN109086269B (en) Semantic bilingual recognition method based on semantic resource word representation and collocation relationship
CN110263325A (en) Automatic Chinese word segmentation method
CN110688862A (en) Mongolian-Chinese inter-translation method based on transfer learning
CN112765952A (en) Conditional probability combined event extraction method under graph convolution attention mechanism
CN112966525B (en) Law field event extraction method based on pre-training model and convolutional neural network algorithm
CN113255320A (en) Entity relation extraction method and device based on syntax tree and graph attention machine mechanism
CN109766546A (en) Neural-network-based natural language inference method
CN113065349A (en) Named entity recognition method based on conditional random field
CN111340006A (en) Sign language identification method and system
CN113076718B (en) Commodity attribute extraction method and system
Ma et al. Joint pre-trained Chinese named entity recognition based on bi-directional language model
CN116186241A (en) Event element extraction method and device based on semantic analysis and prompt learning, electronic equipment and storage medium
CN109960782A (en) Tibetan word segmentation method and device based on deep neural networks
CN115510230A (en) Mongolian emotion analysis method based on multi-dimensional feature fusion and comparative reinforcement learning mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190301