CN113051929A - Entity relationship extraction method based on fine-grained semantic information enhancement - Google Patents
- Publication number
- CN113051929A (Application CN202110308269.6A)
- Authority
- CN
- China
- Prior art keywords
- entity
- information
- relationship
- layer
- type
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/205—Parsing
- G06F40/211—Syntactic parsing, e.g. based on context-free grammar [CFG] or unification grammars
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Abstract
The invention provides an entity relationship extraction method based on fine-grained semantic information enhancement, comprising the following steps: at the input layer, select sentence context information to form the initial sentence vector; at the encoding layer, capture all hidden feature vectors of the text with a BERT pre-trained language model; at the entity recognition layer, obtain the span representations of the head and tail entities with a pointer network, together with entity type information; at the relation extraction layer, splice the preceding hidden feature vectors with the entity type information to obtain hidden vector representations of the head and tail entities for relation extraction; then, before the output layer, add a fine-grained class dictionary and use an attention mechanism to obtain a weight for each relation class; finally, at the output layer, classify relations with Softmax. With BERT, the model can encode sentences to extract all their features, and the pointer network provides a better solution to the problem of extracting overlapping entity triples; finally, a fine-grained class dictionary is constructed, and weights obtained through an attention mechanism alleviate the imbalance of relation class samples, improving the accuracy of entity relation extraction.
Description
Technical Field
The invention belongs to the field of natural language processing.
Background
With the rapid popularization of the internet and the growth of its daily activity, text data on the internet has accumulated to an enormous scale. Secondary utilization of this existing text data and mining of its hidden value have become urgent needs. Knowledge graph technology emerged in response: it helps people quickly acquire information from visualized data and perform reasoning and application over that information. With knowledge graph technology, one can build question-answering systems, converse with intelligent chat robots, recognize speech, ask machines to help with reading comprehension and translation, and so on. However, the development of knowledge graph technology cannot be separated from the underlying entity relationship extraction technology, which extracts the required entities and relationships of triples and other structured data. Entity relationship extraction has developed from methods based on hand-crafted feature engineering and on kernel functions to techniques based on deep learning in recent years, solving many problems along the way, but many remain: the error propagation and information redundancy of pipeline methods, the entity triple overlap problem, the label imbalance problem, and so on. As the cornerstone of knowledge graph construction, entity relationship extraction is one of the popular directions of natural language processing research.
At present, entity relationship extraction has reached the stage of using deep learning technology, but the existing problems are still not well solved. Traditional entity relationship extraction follows a pipeline method: entities are extracted first and then used as input for relation extraction, with most models being RNNs and LSTMs or their improvements and variants. In view of the excellent performance of pre-trained language models on other natural language processing tasks in recent years, and since the task studied here is fundamentally one of those tasks, the invention builds a model that jointly extracts entities and relations from text sentences by combining a pointer network on top of a BERT pre-trained model.
Disclosure of Invention
The invention provides an entity relation extraction method based on fine-grained semantic information enhancement, and aims to solve the entity overlapping problem and improve the accuracy of extracting entity relations. The method comprises the following steps:
(1) adding context information into an input layer to construct a spliced sentence vector;
(2) capturing hidden state vector information in an encoding layer;
(3) obtaining the range information of a head entity and a tail entity in an entity identification layer, and performing Softmax to obtain entity type information;
(4) combining the preceding hidden feature vectors with the type information at the relation extraction layer to further extract abstract features;
(5) the relationship is classified using Softmax at the output layer.
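The five steps above can be sketched end to end as a toy pipeline. This is an illustrative sketch only: the function names, random stand-ins for the BERT encoder and pointer network, and all shapes are assumptions, not the patent's implementation.

```python
import numpy as np

def build_input(sentence, context_before, context_after):
    # Step 1 (input layer): splice context tokens around the sentence.
    return context_before + sentence + context_after

def encode(tokens, dim=8, seed=0):
    # Step 2 (encoding layer): stand-in for the BERT encoder, producing
    # one hidden feature vector per token (random here, for illustration).
    rng = np.random.default_rng(seed)
    return rng.standard_normal((len(tokens), dim))

def recognize_entities(hidden):
    # Step 3 (entity recognition layer): stand-in for the pointer network;
    # here we simply mark the first and last token as head/tail spans.
    return [(0, 0), (len(hidden) - 1, len(hidden) - 1)]

def extract_relation(hidden, head_span, tail_span, n_relations=3, seed=1):
    # Steps 4-5 (relation extraction + output layer): concatenate the two
    # span vectors, score each relation class, and Softmax-normalize.
    h = np.concatenate([hidden[head_span[0]], hidden[tail_span[0]]])
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_relations, h.size))  # untrained weights
    logits = W @ h
    e = np.exp(logits - logits.max())
    return e / e.sum()

tokens = build_input(["Ma", "Yun", "founded", "Alibaba"],
                     ["He", "is", "famous", "."], ["It", "grew", "fast", "."])
hidden = encode(tokens)
head, tail = recognize_entities(hidden)
probs = extract_relation(hidden, head, tail)  # a distribution over relations
```

The trained model would replace the random stand-ins with BERT and the learned pointer/classifier layers; the data flow between the five layers is what the sketch is meant to show.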
Drawings
FIG. 1 is a model structure diagram of an entity relationship extraction method based on fine-grained semantic information enhancement.
Fig. 2 shows a BERT encoder used in the present invention, in which (a) is an overall model structure, and (b) is an internal model structure.
FIG. 3 is an example of context information employed by the present invention.
FIG. 4 is a flow chart of word vector encoding using sentences in combination with context information in accordance with the present invention.
Fig. 5 illustrates an entity type information tagging policy employed in the present invention.
FIG. 6 is a flow chart of context information and entity type information employed by the present invention.
FIG. 7 is a flowchart of fine-grained relationship class dictionary construction employed by the present invention.
FIG. 8 is an illustration of statistical representation of a sample of relationships employed in the present invention.
FIG. 9 is a diagram of an attention mechanism model employed in the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
As shown in fig. 1, the present invention is mainly based on a BERT Encoder framework and performs entity relationship extraction in combination with a pointer network. It mainly comprises five parts: an input layer (Input Layer), an encoding layer (Encoder Layer), an entity recognition layer (Decoder Layer), a relation extraction layer and an output layer (Output Layer). The specific implementation is as follows:
the method comprises the following steps: input layer
Previous entity relationship extraction models only directly encode and embed the words and vectors in a sentence; few models can extract all the features of a sentence to help relation prediction, so relation extraction efficiency is not high, and when a sentence contains many entities and relations, overlapping entity triples cannot be extracted effectively. Therefore, in order to improve the accuracy of entity relationship extraction and address the overlapping entity triple problem, the invention expands the sentence's context information at the input layer. Fig. 3 shows an extraction example with context information added, where "Alibaba" and "Ma Yun" both have corresponding entity type, ID, country, attribute, and other information.
As shown in the context-information encoding flow chart of fig. 4, the sentence word vector and the context information vector are selected as the vectors to be encoded; a sentence is first represented as x = (x_1, x_2, ..., x_n);
1. word vector
e(x_i) denotes the word vector of the word x_i; the invention selects the BERT pre-trained model to train the word vectors, and the BERT model is shown in FIG. 2(a);
2. context information
Cross-sentence information can well help the model predict the type of each entity and the relationships among entities, and in particular can help identify pronouns and coreference relations. The invention therefore selects context information as encoding-layer input. First, a context window of size W is set: the words within the window before the current sentence form the preceding context and the words within the window after it form the following context, yielding the extended sentence x = (x_1, x_2, ..., x_m), which is taken as the model input; BERT word-vector conversion is then performed as shown in Equation 1;
X = (x_1, x_2, ..., x_N) (1)
where x_i ∈ R^d is the word vector of the i-th word x_i, of dimension d, and X ∈ R^{N×d} is of dimension N×d.
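The window-splicing step of the input layer can be sketched as follows. This is a minimal illustration; the function name and the symmetric split of the window are assumptions (the patent specifies only a window of size W around the sentence).

```python
def add_context(doc_tokens, start, end, window=4):
    """Return the sentence doc_tokens[start:end] extended with up to
    window//2 preceding and window//2 following tokens (assumed split)."""
    half = window // 2
    lo = max(0, start - half)          # clamp at document start
    hi = min(len(doc_tokens), end + half)  # clamp at document end
    return doc_tokens[lo:hi]

doc = ["Ma", "Yun", "founded", "it", ".", "Alibaba", "is", "large", "."]
# Current sentence is doc[5:9] = "Alibaba is large ."
extended = add_context(doc, 5, 9, window=4)
# → ["it", ".", "Alibaba", "is", "large", "."]
```

The extended token sequence is what would then be passed to the BERT tokenizer to produce X = (x_1, ..., x_N).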
Step two: coding layer
For different tasks, the encoding and decoding layers can be combined in different ways: on image processing tasks, a convolutional neural network is usually used to form the encoding layer, whereas for natural language processing tasks such as entity relationship extraction, a recurrent neural network is usually chosen. However, because the capability of RNNs and LSTMs to encode the semantic information of sentences is limited, they cannot fully extract all the features of a sentence; the invention therefore selects the BERT pre-trained language model as the encoding-layer structure, whose model structure is shown in FIG. 2(b). The encoding layer outputs the hidden feature vectors as shown in Equation 2;
h_i = BERTencoder(X) (2)
where X is the sentence embedding vector containing context information and h_i is the hidden feature vector output by the encoding layer.
Step three: entity recognition layer
In the prior art, relation extraction models only extract entities and then use entity mentions for relation prediction, ignoring the important influence of entity type information on relation extraction; the invention therefore selects a pointer network to predict entity spans and extract entity type information;
1. Entity span representation
Having used BERT to encode the context-bearing sentence into word-vector representations X = (x_1, x_2, ..., x_N), the pointer network is then used to identify the span s_i ∈ S of each entity, whose span representation h_e(s_i) is defined as shown in Equation 3;
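A common way to realize a pointer network over token positions is to predict, for every token, the probability that an entity span starts or ends there, then pair each start with the nearest valid end. The sketch below is an assumed decoding rule for illustration, not the patent's exact Equation 3.

```python
def decode_spans(start_probs, end_probs, thresh=0.5):
    """Pointer-network-style span decoding (assumed rule): pair each
    predicted start position with the nearest end at or after it."""
    starts = [i for i, p in enumerate(start_probs) if p >= thresh]
    ends = [i for i, p in enumerate(end_probs) if p >= thresh]
    spans = []
    for s in starts:
        candidates = [e for e in ends if e >= s]
        if candidates:
            spans.append((s, candidates[0]))
    return spans

# Toy per-token start/end probabilities for a 4-token sentence.
start_p = [0.9, 0.1, 0.8, 0.2]
end_p   = [0.2, 0.9, 0.1, 0.7]
spans = decode_spans(start_p, end_p)  # → [(0, 1), (2, 3)]
```

Because each start/end decision is independent, this style of decoding can emit several (possibly nested or adjacent) spans per sentence, which is what makes pointer networks suitable for the overlapping-triple problem the description mentions.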
2. entity type information
This span representation h_e(s_i) is then used to predict the entity's category, as shown in Equation 4;
P_e(e|s_i) = softmax(W_e FFNN(h_e(s_i))) (4)
where a two-layer feed-forward neural network with ReLU activation functions is used for entity type prediction.
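Equation 4 can be written out directly. The sketch below assumes untrained random weights purely to demonstrate the computation; the shapes and parameter names are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

def entity_type_probs(span_repr, W1, b1, W2, b2, We):
    """P_e(e|s_i) = softmax(W_e * FFNN(h_e(s_i))), with a two-layer
    ReLU feed-forward network as in Equation 4."""
    h = relu(W1 @ span_repr + b1)   # FFNN layer 1
    h = relu(W2 @ h + b2)           # FFNN layer 2
    return softmax(We @ h)          # distribution over entity types

rng = np.random.default_rng(0)
d, h_dim, n_types = 6, 4, 3  # assumed toy dimensions
p = entity_type_probs(rng.standard_normal(d),
                      rng.standard_normal((h_dim, d)), np.zeros(h_dim),
                      rng.standard_normal((h_dim, h_dim)), np.zeros(h_dim),
                      rng.standard_normal((n_types, h_dim)))
```

The output p is a probability distribution over the n_types entity categories; the argmax would give the predicted type fed to the relation extraction layer.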
Step four: relation extraction layer
Previous research work has always used the span representations h_e(s_i) (with head and tail entities h_e(s_i) and h_e(s_j)) for relation classification, with unsatisfactory results: from the way h_e(s_i) is computed, an entity can only capture the context representation around itself and cannot capture the dependencies between entity spans. The invention therefore does not use h_e(s_i) directly to predict the relations between entities, but combines context information with entity type information; the entity type marking strategy is shown in FIG. 5, where the type is placed before and after the head and tail entities to mark each entity's beginning and end;
As shown in Equation 5, the marked input sequence comprises context information of window length W together with entity-type-specific tag information, where S denotes the head entity, O denotes the tail entity, type_i denotes the entity type of head entity i, and type_j denotes the entity type of tail entity j.
This information-bearing sequence is fed into BERT for encoding to obtain the hidden span representations of the head and tail entities, as shown in Equation 6, where one term denotes the start of the head entity span containing type information, and likewise for the start of the tail entity span containing type information.
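The type-marking strategy of placing each entity's type before and after it can be sketched as a token-level transformation. The tag format `<TYPE>` / `</TYPE>` is an assumption for illustration; the patent only specifies that types mark the beginning and end of the head and tail entities.

```python
def insert_type_markers(tokens, head, tail):
    """Wrap the head and tail entity spans with their entity-type tags,
    e.g. <PER> Ma Yun </PER> ... <ORG> Alibaba </ORG> (assumed format).
    head/tail are (start_idx, end_idx, type) triples."""
    (hs, he, htype), (ts, te, ttype) = head, tail
    out = []
    for i, tok in enumerate(tokens):
        if i == hs:
            out.append(f"<{htype}>")
        if i == ts:
            out.append(f"<{ttype}>")
        out.append(tok)
        if i == he:
            out.append(f"</{htype}>")
        if i == te:
            out.append(f"</{ttype}>")
    return out

marked = insert_type_markers(["Ma", "Yun", "founded", "Alibaba"],
                             (0, 1, "PER"), (3, 3, "ORG"))
# → ["<PER>", "Ma", "Yun", "</PER>", "founded", "<ORG>", "Alibaba", "</ORG>"]
```

The marked sequence is what would be re-encoded by BERT so that the hidden vectors at the marker positions carry both span and type information.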
Step five: output layer
When a relation extraction model extracts the relations of each category, the extraction precision is inconsistent, because the number of sentence samples differs across relation categories. Therefore, to address this relation-class imbalance, the invention constructs a fine-grained class dictionary that counts the number of relation samples in each data set, as shown in figs. 7 and 8; an attention mechanism then assigns a weight to each relation, so that relations with fewer samples receive more attention and a larger proportion during extraction; the weight of each relation is computed with the attention mechanism as shown in Equation 7;
After obtaining the hidden span representations of the head and tail entities, h_r(s_i, s_j) is used to predict the relation type, as shown in Equation 8;
P_r(r|s_i, s_j) = softmax(W_r h_r(s_i, s_j)) (8)
where r ∈ R is one of the relation types and R denotes the set of defined relation types; the computation of the relation weight W_r is shown in Equation 7;
Finally, the total loss function over the entity recognition stage and the relation extraction stage during model training is computed as shown in Equation 9;
where the first term is the gold entity type of span s_i and the second is the gold relation type of spans s_i and s_j in the training data. During training, only gold entities s_G ∈ S are considered, and the gold entity types are used as input to the relation model.
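One simple way to turn the fine-grained class dictionary's sample counts into per-relation weights, in the spirit of Equation 7, is to softmax-normalize a score that decreases with sample count, so rarer relations get larger weight. The scoring function below is an assumed stand-in; the patent's exact attention computation (Equation 7) is given in its figure.

```python
import numpy as np

def rare_class_weights(sample_counts, temperature=1.0):
    """Attention-style weights over relation classes (assumed scoring):
    softmax over negative log sample counts, so classes with fewer
    training samples receive larger weight. Counts must be positive."""
    scores = -np.log(np.asarray(sample_counts, dtype=float)) / temperature
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy fine-grained dictionary: three relations with imbalanced counts.
w = rare_class_weights([1000, 100, 10])
```

With these counts, the rarest relation (10 samples) receives the largest weight, which is the behavior the description asks of the attention mechanism: relations with few samples get more attention during extraction.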
Although illustrative embodiments of the present invention have been described above to facilitate understanding by those skilled in the art, it should be understood that the scope of the invention is not limited to these specific embodiments. Variations that are obvious to those skilled in the art and that utilize the concepts of the present invention are all intended to fall within the scope of protection.
Claims (6)
1. A method for extracting entity relationships based on fine-grained semantic information enhancement, characterized in that the method identifies entities in a text sentence and then classifies the relationship between each pair of entities to output triples, comprising the following steps:
step 1: adding context information into an input layer to construct a spliced sentence vector;
step 2: capturing hidden state vector information in an encoding layer;
and step 3: obtaining the range information of a head entity and a tail entity in an entity identification layer, and performing Softmax to obtain entity type information;
step 4: combining the preceding hidden feature vectors with the type information at the relation extraction layer to further extract abstract features;
and 5: the relationship is classified using Softmax at the output layer.
2. The method for extracting entity relationships based on fine-grained semantic information enhancement according to claim 1, wherein adding context information in step 1 to construct spliced sentence vectors specifically comprises: in the entity relation extraction task, selecting the sentence word vector and the context information vector as the vectors to be encoded, a sentence being first represented as x = (x_1, x_2, ..., x_n);
step 1.1 training word vectors
e(x_i) denotes the word vector of the word x_i; the word vectors are trained with the selected BERT pre-trained model, and the BERT model is shown in FIG. 2(a);
step 1.2 adding context information
Cross-sentence information can well help the model predict the type of each entity and the relationships among entities, and in particular can help identify pronouns and coreference relations; therefore, context information is selected as the encoding-layer input. As shown in FIG. 3, first a context window of size W is set: the words within the window before the current sentence form the preceding context and the words within the window after it form the following context, yielding the extended sentence x = (x_1, x_2, ..., x_m), which is taken as the model input; BERT word-vector conversion is then performed as shown in Equation 1;
X = (x_1, x_2, ..., x_N) (1)
where x_i ∈ R^d is the word vector of the i-th word x_i, of dimension d, and X ∈ R^{N×d} is of dimension N×d.
3. The method for extracting entity relationship based on fine-grained semantic information enhancement according to claim 2, wherein the capturing hidden state vector information at the coding layer in step 2 specifically includes:
for different tasks, the coding layer and the decoding layer can be selected in different combination modes, for example, on the image processing task, a convolutional neural network is usually used for forming the coding layer, however, for the natural language processing field task of extracting the entity relationship, a cyclic neural network is usually selected, and because the semantic information capability of RNN and LSTM coded sentences is limited, all the characteristics of the sentences can not be completely extracted; therefore, the invention selects the BERT pre-training language model as the coding layer structure, and the model structure is shown in FIG. 2 (b); the coding layer outputs the hidden feature vector as shown in formula 2;
h_i = BERTencoder(X) (2)
where X is the sentence embedding vector containing context information and h_i is the hidden feature vector output by the encoding layer.
4. The method for extracting entity relationships based on fine-grained semantic information enhancement according to claim 3, wherein obtaining the span information of the head and tail entities at the entity recognition layer in step 3, and obtaining the entity type information via Softmax, comprises:
in the prior art, only entities are extracted by a relation extraction model, and then relation prediction is carried out by using entity mentions, so that the important influence of entity type information on relation extraction is ignored; therefore, the invention selects the pointer network to predict the entity range and extract the entity type information;
step 4.1 obtaining the entity span representation
Having used BERT to encode the context-bearing sentence into word-vector representations X = (x_1, x_2, ..., x_N), the pointer network is then used to identify the span s_i ∈ S of each entity, whose span representation h_e(s_i) is defined as shown in Equation 3;
step 4.2 obtaining entity type information
This span representation h_e(s_i) is then used to predict the entity's category, as shown in Equation 4;
P_e(e|s_i) = softmax(W_e FFNN(h_e(s_i))) (4)
where a two-layer feed-forward neural network with ReLU activation functions is used for entity type prediction.
5. The entity relationship extraction method based on fine-grained semantic information enhancement according to claim 4, wherein combining the preceding hidden feature vectors with the type information at the relation extraction layer in step 4 to further extract abstract features comprises:
previous research work has always used a span range to represent he(si) (head and tail entities are h)e(ss)、he(si) To make relational classification, the classification result is not ideal, and h is observede(si) The computing mode of (2) is that the entity can only capture the context representation around the entity and cannot capture the dependency relationship between entity ranges; thus the invention does not directly use he(si) To predict the relationship between entities, but to predict the relationship between entities by combining the context information and the entity type information, and the entity type labeling strategy is shown in fig. 5;
as shown in equation 5, whereinRepresenting a sentence input sequence including context information of W window length and entity type specific tag information, S representing a head entity, O representing a tail entity, typeiEntity type, type representing head entity ijIndicating the entity type of the tail entity j.
Will carry all informationThe data are sent to BERT for coding, and the range hidden representation of the head entity and the tail entity is obtained, as shown in a formula 6;
6. The method for extracting entity relationships based on fine-grained semantic information enhancement according to claim 5, wherein the classifying relationships in the output layer by using Softmax in step 5 is:
in order to solve the problem of low relation extraction precision caused by unbalance of relation type samples, the invention uses an attention mechanism as shown in FIG. 9, so that the relation with less samples can be better noticed to obtain larger specific gravity when the relation samples are extracted; the weights are calculated for each relationship using an attention mechanism as shown in equation 7;
After obtaining the hidden span representations of the head and tail entities in Equation 6, h_r(s_i, s_j) is used to predict the relation type, as shown in Equation 8;
P_r(r|s_i, s_j) = softmax(W_r h_r(s_i, s_j)) (8)
where r ∈ R is one of the relation types and R denotes the set of defined relation types; the computation of the relation weight W_r is shown in Equation 7;
Finally, the total loss function over the entity recognition stage and the relation extraction stage during model training is computed as shown in Equation 9;
where the first term is the gold entity type of span s_i and the second is the gold relation type of spans s_i and s_j in the training data; during training, only gold entities s_G ∈ S are considered, and the gold entity types are used as input to the relation model.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110308269.6A CN113051929A (en) | 2021-03-23 | 2021-03-23 | Entity relationship extraction method based on fine-grained semantic information enhancement |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110308269.6A CN113051929A (en) | 2021-03-23 | 2021-03-23 | Entity relationship extraction method based on fine-grained semantic information enhancement |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113051929A true CN113051929A (en) | 2021-06-29 |
Family
ID=76514500
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110308269.6A Pending CN113051929A (en) | 2021-03-23 | 2021-03-23 | Entity relationship extraction method based on fine-grained semantic information enhancement |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113051929A (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113434698A (en) * | 2021-06-30 | 2021-09-24 | 华中科技大学 | Relation extraction model establishing method based on full-hierarchy attention and application thereof |
CN113869049A (en) * | 2021-12-03 | 2021-12-31 | 北京大学 | Fact extraction method and device with legal attribute based on legal consultation problem |
CN114004330A (en) * | 2021-09-26 | 2022-02-01 | 苏州浪潮智能科技有限公司 | Recommendation system and method based on eigenvalue completion |
CN114298043A (en) * | 2021-12-24 | 2022-04-08 | 厦门快商通科技股份有限公司 | Entity standardization method, device and equipment based on joint learning and readable medium |
CN114417846A (en) * | 2021-11-25 | 2022-04-29 | 湘潭大学 | Entity relationship extraction method based on attention contribution degree and application thereof |
CN114881038A (en) * | 2022-07-12 | 2022-08-09 | 之江实验室 | Chinese entity and relation extraction method and device based on span and attention mechanism |
CN115080705A (en) * | 2022-07-20 | 2022-09-20 | 神州医疗科技股份有限公司 | Vertical domain relation extraction method and system based on dual-model enhancement |
CN116245178A (en) * | 2023-05-08 | 2023-06-09 | 中国人民解放军国防科技大学 | Biomedical knowledge extraction method and device of decoder based on pointer network |
CN116992870A (en) * | 2023-09-26 | 2023-11-03 | 山东省计算中心(国家超级计算济南中心) | Text information entity relation extraction method and system based on asymmetric kernel function |
CN117408247A (en) * | 2023-12-15 | 2024-01-16 | 南京邮电大学 | Intelligent manufacturing triplet extraction method based on relational pointer network |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103678703A (en) * | 2013-12-30 | 2014-03-26 | 中国科学院自动化研究所 | Method and device for extracting open category named entity by means of random walking on map |
CN105005794A (en) * | 2015-07-21 | 2015-10-28 | 太原理工大学 | Image pixel semantic annotation method with combination of multi-granularity context information |
CN110209770A (en) * | 2019-06-03 | 2019-09-06 | 北京邮电大学 | A kind of name entity recognition method based on policy value network and tree search enhancing |
CN110222199A (en) * | 2019-06-20 | 2019-09-10 | 青岛大学 | A kind of character relation map construction method based on ontology and a variety of Artificial neural network ensembles |
CN110298037A (en) * | 2019-06-13 | 2019-10-01 | 同济大学 | The matched text recognition method of convolutional neural networks based on enhancing attention mechanism |
CN110335594A (en) * | 2019-07-11 | 2019-10-15 | 哈尔滨工业大学 | Automatic speech recognition difficulty sample method for digging based on multi-instance learning |
CN110516055A (en) * | 2019-08-16 | 2019-11-29 | 西北工业大学 | A kind of cross-platform intelligent answer implementation method for teaching task of combination BERT |
US20200073933A1 (en) * | 2018-08-29 | 2020-03-05 | National University Of Defense Technology | Multi-triplet extraction method based on entity-relation joint extraction model |
CN111178053A (en) * | 2019-12-30 | 2020-05-19 | 电子科技大学 | Text generation method for performing generation type abstract extraction by combining semantics and text structure |
CN111368528A (en) * | 2020-03-09 | 2020-07-03 | 西南交通大学 | Entity relation joint extraction method for medical texts |
CN111488726A (en) * | 2020-03-31 | 2020-08-04 | 成都数之联科技有限公司 | Pointer network-based unstructured text extraction multi-task joint training method |
CN111709243A (en) * | 2020-06-19 | 2020-09-25 | 南京优慧信安科技有限公司 | Knowledge extraction method and device based on deep learning |
CN111859935A (en) * | 2020-07-03 | 2020-10-30 | 大连理工大学 | Method for constructing cancer-related biomedical event database based on literature |
CN112084790A (en) * | 2020-09-24 | 2020-12-15 | 中国民航大学 | Relation extraction method and system based on pre-training convolutional neural network |
CN112115687A (en) * | 2020-08-26 | 2020-12-22 | 华南理工大学 | Problem generation method combining triples and entity types in knowledge base |
CN112287119A (en) * | 2020-06-23 | 2021-01-29 | 北京理工大学 | Knowledge graph generation method for extracting relevant information of online resources |
CN113553850A (en) * | 2021-03-30 | 2021-10-26 | 电子科技大学 | Entity relation extraction method based on ordered structure encoding pointer network decoding |
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103678703A (en) * | 2013-12-30 | 2014-03-26 | 中国科学院自动化研究所 | Method and device for extracting open category named entity by means of random walking on map |
CN105005794A (en) * | 2015-07-21 | 2015-10-28 | 太原理工大学 | Image pixel semantic annotation method with combination of multi-granularity context information |
US20200073933A1 (en) * | 2018-08-29 | 2020-03-05 | National University Of Defense Technology | Multi-triplet extraction method based on entity-relation joint extraction model |
CN110209770A (en) * | 2019-06-03 | 2019-09-06 | 北京邮电大学 | Named entity recognition method based on a policy-value network and enhanced tree search |
CN110298037A (en) * | 2019-06-13 | 2019-10-01 | 同济大学 | Text matching recognition method based on convolutional neural networks with an enhanced attention mechanism |
CN110222199A (en) * | 2019-06-20 | 2019-09-10 | 青岛大学 | Character relationship graph construction method based on ontology and multiple neural network ensembles |
CN110335594A (en) * | 2019-07-11 | 2019-10-15 | 哈尔滨工业大学 | Hard-sample mining method for automatic speech recognition based on multi-instance learning |
CN110516055A (en) * | 2019-08-16 | 2019-11-29 | 西北工业大学 | Cross-platform intelligent question-answering implementation method for teaching tasks combining BERT |
CN111178053A (en) * | 2019-12-30 | 2020-05-19 | 电子科技大学 | Text generation method for performing generation type abstract extraction by combining semantics and text structure |
CN111368528A (en) * | 2020-03-09 | 2020-07-03 | 西南交通大学 | Entity relation joint extraction method for medical texts |
CN111488726A (en) * | 2020-03-31 | 2020-08-04 | 成都数之联科技有限公司 | Pointer network-based unstructured text extraction multi-task joint training method |
CN111709243A (en) * | 2020-06-19 | 2020-09-25 | 南京优慧信安科技有限公司 | Knowledge extraction method and device based on deep learning |
CN112287119A (en) * | 2020-06-23 | 2021-01-29 | 北京理工大学 | Knowledge graph generation method for extracting relevant information of online resources |
CN111859935A (en) * | 2020-07-03 | 2020-10-30 | 大连理工大学 | Method for constructing cancer-related biomedical event database based on literature |
CN112115687A (en) * | 2020-08-26 | 2020-12-22 | 华南理工大学 | Question generation method combining triples and entity types in a knowledge base |
CN112084790A (en) * | 2020-09-24 | 2020-12-15 | 中国民航大学 | Relation extraction method and system based on pre-training convolutional neural network |
CN113553850A (en) * | 2021-03-30 | 2021-10-26 | 电子科技大学 | Entity relation extraction method based on ordered structure encoding pointer network decoding |
Non-Patent Citations (8)
Title |
---|
ORIOL VINYALS et al.: "Pointer Networks", arXiv:1506.03134v2 * |
XIAOQI LI et al.: "Bidirectional LSTM Network with Ordered Neurons for Speech Enhancement", INTERSPEECH * |
YI LUAN et al.: "A General Framework for Information Extraction using Dynamic Span Graphs", arXiv:1904.03296v1 * |
YI LUAN et al.: "Multi-Task Identification of Entities, Relations, and Coreference for Scientific Knowledge Graph Construction", arXiv:1808.09602v1 * |
ZEXUAN ZHONG et al.: "A Frustratingly Easy Approach for Joint Entity and Relation Extraction", arXiv:2010.12812v1 * |
REN PENGCHENG et al.: "Joint Entity and Relation Extraction with Dependency-Constrained Graph Networks", Computer Systems & Applications (计算机系统应用) * |
HU ZHENG et al.: "A Study on Word Vector Accuracy under Variant Context Windows", Modern Electronics Technique (现代电子技术) * |
CHEN YANGUANG et al.: "A Triple Extraction Model for Legal Texts", online first: HTTP://KNS.CNKI.NET/KCMS/DETAIL/31.1289.TP.20200526.1313.002.HTML * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113434698A (en) * | 2021-06-30 | 2021-09-24 | 华中科技大学 | Relation extraction model establishing method based on full-hierarchy attention and application thereof |
CN113434698B (en) * | 2021-06-30 | 2022-08-02 | 华中科技大学 | Relation extraction model establishing method based on full-hierarchy attention and application thereof |
CN114004330A (en) * | 2021-09-26 | 2022-02-01 | 苏州浪潮智能科技有限公司 | Recommendation system and method based on feature value completion |
CN114004330B (en) * | 2021-09-26 | 2023-11-14 | 苏州浪潮智能科技有限公司 | Recommendation system and method based on feature value completion |
CN114417846A (en) * | 2021-11-25 | 2022-04-29 | 湘潭大学 | Entity relationship extraction method based on attention contribution degree and application thereof |
CN114417846B (en) * | 2021-11-25 | 2023-12-19 | 湘潭大学 | Entity relation extraction method based on attention contribution degree |
CN113869049A (en) * | 2021-12-03 | 2021-12-31 | 北京大学 | Fact extraction method and device with legal attribute based on legal consultation problem |
CN114298043A (en) * | 2021-12-24 | 2022-04-08 | 厦门快商通科技股份有限公司 | Entity standardization method, device and equipment based on joint learning and readable medium |
CN114881038A (en) * | 2022-07-12 | 2022-08-09 | 之江实验室 | Chinese entity and relation extraction method and device based on span and attention mechanism |
CN114881038B (en) * | 2022-07-12 | 2022-11-11 | 之江实验室 | Chinese entity and relation extraction method and device based on span and attention mechanism |
CN115080705A (en) * | 2022-07-20 | 2022-09-20 | 神州医疗科技股份有限公司 | Vertical domain relation extraction method and system based on dual-model enhancement |
CN116245178A (en) * | 2023-05-08 | 2023-06-09 | 中国人民解放军国防科技大学 | Biomedical knowledge extraction method and device of decoder based on pointer network |
CN116992870A (en) * | 2023-09-26 | 2023-11-03 | 山东省计算中心(国家超级计算济南中心) | Text information entity relation extraction method and system based on asymmetric kernel function |
CN116992870B (en) * | 2023-09-26 | 2023-12-19 | 山东省计算中心(国家超级计算济南中心) | Text information entity relation extraction method and system based on asymmetric kernel function |
CN117408247A (en) * | 2023-12-15 | 2024-01-16 | 南京邮电大学 | Intelligent manufacturing triplet extraction method based on relational pointer network |
CN117408247B (en) * | 2023-12-15 | 2024-03-29 | 南京邮电大学 | Intelligent manufacturing triplet extraction method based on relational pointer network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113051929A (en) | Entity relationship extraction method based on fine-grained semantic information enhancement | |
CN108829722B (en) | Remote supervision Dual-Attention relation classification method and system | |
CN109902145B (en) | Attention mechanism-based entity relationship joint extraction method and system | |
CN111897908A (en) | Event extraction method and system fusing dependency information and pre-training language model | |
CN113312501A (en) | Construction method and device of safety knowledge self-service query system based on knowledge graph | |
CN113468888A (en) | Entity relation joint extraction method and device based on neural network | |
CN112183064B (en) | Text emotion reason recognition system based on multi-task joint learning | |
CN113255320A (en) | Entity relation extraction method and device based on syntax tree and graph attention machine mechanism | |
CN112163089B (en) | High-technology text classification method and system integrating named entity recognition | |
CN115292463B (en) | Information extraction-based method for joint multi-intention detection and overlapping slot filling | |
CN113887229A (en) | Address information identification method and device, computer equipment and storage medium | |
CN116661805B (en) | Code representation generation method and device, storage medium and electronic equipment | |
CN114841151B (en) | Medical text entity relation joint extraction method based on decomposition-recombination strategy | |
CN113051922A (en) | Triple extraction method and system based on deep learning | |
CN113051887A (en) | Method, system and device for extracting announcement information elements | |
CN114168754A (en) | Relation extraction method based on syntactic dependency and fusion information | |
CN115688784A (en) | Chinese named entity recognition method fusing character and word characteristics | |
CN114492460B (en) | Event causal relationship extraction method based on derivative prompt learning | |
CN113051904B (en) | Link prediction method for small-scale knowledge graph | |
CN113239694B (en) | Argument role identification method based on argument phrase | |
CN113033206B (en) | Bridge detection field text entity identification method based on machine reading understanding | |
CN116562275B (en) | Automatic text summarization method combined with entity attribute diagram | |
CN116595023A (en) | Address information updating method and device, electronic equipment and storage medium | |
CN113449517B (en) | Entity relationship extraction method based on BERT gated multi-window attention network model | |
CN116562291A (en) | Chinese nested named entity recognition method based on boundary detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2021-06-29 |