CN111221966A - Text semantic relation extraction method and system - Google Patents
Text semantic relation extraction method and system
- Publication number
- CN111221966A CN111221966A CN201911412034.0A CN201911412034A CN111221966A CN 111221966 A CN111221966 A CN 111221966A CN 201911412034 A CN201911412034 A CN 201911412034A CN 111221966 A CN111221966 A CN 111221966A
- Authority
- CN
- China
- Prior art keywords
- representing
- text
- lstm
- output
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- Machine Translation (AREA)
Abstract
The invention discloses a text semantic relation extraction method and system. Text information is obtained and vectorized, and local features of the text are extracted; the local features are fed into a pre-trained bidirectional LSTM model, an attention mechanism is introduced to calculate the importance of the correlation between the model's input and output, and the overall features of the text are derived from that importance; finally, the local and overall features are fused, and a classification result is output through a classifier. The advantages are that: building on a long short-term memory (LSTM) network with an attention-mechanism algorithm, the LSTM model avoids the long-distance dependency problem of CNNs and RNNs, the attention mechanism better accounts for the relevance between model input and output, and local text features are fully extracted for entity-concept extraction, improving both the speed and the accuracy of extracting power-grid-maintenance ontology concepts.
Description
Technical Field
The invention relates to a text semantic relation extraction method and a text semantic relation extraction system, and belongs to the technical field of ontology concept extraction.
Background
With the rapid development of network technology, the Internet has created a rich interactive platform for people, but it also poses a great challenge: how to effectively extract information that is valuable to users from massive network data. Semantic-based information processing can effectively address this problem. As a shared conceptualization model, ontologies play a crucial role in semantic analysis. Domain ontology learning automatically acquires the concepts of a specific domain's ontology and the relationships among those concepts; with the rapid development of the power-grid industry and Internet information technology, power-grid maintenance places ever higher demands on automatic domain-ontology learning.
In the field of power grid maintenance in China, some researchers have applied ontology and semantic-web technology to emergency management: an emergency-decision domain ontology is constructed from a domain dictionary, and, based on this ontology, an initial emergency maintenance scheme is generated through semantic query conversion, semantic retrieval, and reasoning, improving the intelligence of emergency decision-making.
With the progress of the times, existing domain ontologies are no longer sufficient to express knowledge in the power-grid maintenance field; enriching domain-ontology knowledge and improving ontology-extraction accuracy are problems that urgently need to be solved. In addition, manual ontology construction is time-consuming and inefficient, and effectively improving the efficiency of automatic ontology updating is a technical problem that currently needs to be overcome.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a text semantic relation extraction method and a text semantic relation extraction system.
In order to solve the above technical problems, the present invention provides a text semantic relation extraction method comprising:
acquiring text information, vectorizing the text information, and extracting local features of the text;
inputting the local features of the text into a pre-trained bidirectional LSTM model, introducing an attention mechanism to calculate the importance of the correlation between the input and the output of the bidirectional LSTM model, and determining the overall features of the text according to the importance;
and performing feature fusion on the local features and the overall features, and outputting a classification result through a classifier.
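As an illustration of this three-step pipeline, the following is a minimal PyTorch sketch. The choice of a 1-D convolution for the local features, all layer sizes, and all class and parameter names are assumptions for illustration; the patent does not fix these details.

```python
import torch
import torch.nn as nn

class RelationExtractor(nn.Module):
    """Sketch of the pipeline: local features -> BiLSTM + attention -> fusion -> classifier."""
    def __init__(self, vocab_size, emb_dim=128, hidden=128, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Local feature extractor (a 1-D convolution is one common choice; assumed here).
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        self.bilstm = nn.LSTM(hidden, hidden, bidirectional=True, batch_first=True)
        self.att_score = nn.Linear(2 * hidden, 1)   # additive attention score per position
        self.classifier = nn.Linear(hidden + 2 * hidden, num_classes)

    def forward(self, tokens):                        # tokens: (batch, seq_len)
        x = self.embed(tokens)                        # vectorize the text
        local = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)  # local features
        h, _ = self.bilstm(local)                     # context in both directions
        a = torch.softmax(self.att_score(h), dim=1)   # importance of each position
        global_feat = (a * h).sum(dim=1)              # attention-weighted overall feature
        fused = torch.cat([local.max(dim=1).values, global_feat], dim=-1)  # feature fusion
        return self.classifier(fused)                 # classification result
```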
Further, the training process of the bidirectional LSTM model is as follows:
acquiring LSTM training samples: daily overhaul application ticket data and scheduling logs from power grid scheduling are collected as the LSTM training samples; the daily overhaul application tickets and scheduling logs can be obtained from the scheduling system, and for each overhaul ticket or scheduling log the data to be collected comprise equipment information, equipment parameters, fault information, maintenance mode, and the like;
the bidirectional LSTM model is trained using the LSTM training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

wherein i_t, f_t, c_t, o_t, h_t are the input-gate, forget-gate, memory, output-gate, and hidden-state variables of the LSTM, respectively; g_t represents the candidate cell state at the current input; σ denotes the logistic sigmoid activation function and tanh the activation function of the output; T_{D+m+n,n} denotes the affine transformation from the real space R^(D+m+n) to R^n defined by the learned parameters; D denotes the dimension of the sample vectors extracted by the extractor; m and n denote the dimensions of the embedding matrix and the LSTM network matrix, respectively; E denotes the embedding matrix, E ∈ R^(m×K); R denotes the set of real numbers; K denotes the sample vocabulary; y_{t-1} denotes the semantic-paraphrase intermediate variable at the previous time step; ẑ_t denotes a random variable; and z is a context vector, z ∈ R^D;

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

wherein ⊙ denotes element-wise multiplication and c_{t-1} is the state of the attention model at the previous time step.
Further, c_t is determined by the following formulas:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

c_t = Σ_{j=1}^{T} a_tj h_j

wherein a_tj is the weight the attention mechanism assigns to each feature vector; h_j is the feature-vector sequence output by the LSTM neural network; T represents the total number of feature vectors and j indexes the j-th feature vector; exp(e_tj) denotes the exponential function with the natural constant e as base; e_tj is the alignment-model score indicating how well the input at time t matches the output at position j, and e_tk is defined analogously; a denotes the function that computes e_tj; v_a is a global weight; w_a is the weight of the attention-mechanism state at the previous time step; and u_a is the weight of the feature vector at the previous time step.
A text semantic relation extraction system comprises an acquisition module, a determination module and an output module;
the acquisition module is used for acquiring text information, vectorizing the text information and extracting local features of the text;
the determining module is used for inputting the local features of the text into a pre-trained bidirectional LSTM model, introducing an attention mechanism to calculate the importance of the correlation between the input and the output of the bidirectional LSTM model, and determining the overall features of the text according to the importance;
and the output module is used for carrying out feature fusion on the local features and the overall features and outputting a classification result through the classifier.
Further, the determining module comprises a training module for acquiring an LSTM training sample, and acquiring daily overhaul application ticket data and a scheduling log of power grid scheduling as the LSTM training sample;
the bidirectional LSTM model is trained using the LSTM training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

wherein i_t, f_t, c_t, o_t, h_t are the input-gate, forget-gate, memory, output-gate, and hidden-state variables of the LSTM, respectively; g_t represents the candidate cell state at the current input; σ denotes the logistic sigmoid activation function and tanh the activation function of the output; T_{D+m+n,n} denotes the affine transformation from the real space R^(D+m+n) to R^n defined by the learned parameters; D denotes the dimension of the sample vectors extracted by the extractor; m and n denote the dimensions of the embedding matrix and the LSTM network matrix, respectively; E denotes the embedding matrix, E ∈ R^(m×K); R denotes the set of real numbers; K denotes the sample vocabulary; y_{t-1} denotes the semantic-paraphrase intermediate variable at the previous time step; ẑ_t denotes a random variable; and z is a context vector, z ∈ R^D;

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

wherein ⊙ denotes element-wise multiplication and c_{t-1} is the state of the attention model at the previous time step.
Further, the training module includes a memory-variable determination module for determining c_t by the following formulas:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

c_t = Σ_{j=1}^{T} a_tj h_j

wherein a_tj is the weight the attention mechanism assigns to each feature vector; h_j is the feature-vector sequence output by the LSTM neural network; T represents the total number of feature vectors and j indexes the j-th feature vector; exp(e_tj) denotes the exponential function with the natural constant e as base; e_tj is the alignment-model score indicating how well the input at time t matches the output at position j, and e_tk is defined analogously; a denotes the function that computes e_tj; v_a is a global weight; w_a is the weight of the attention-mechanism state at the previous time step; and u_a is the weight of the feature vector at the previous time step.
The invention achieves the following beneficial effects:
Building on a long short-term memory (LSTM) network, an attention-mechanism algorithm is introduced. The LSTM model avoids the long-distance dependency problem of CNNs and RNNs, the attention mechanism better accounts for the relevance between model input and output, and local text features are fully extracted for entity-concept extraction, improving the speed and accuracy of extracting power-grid-maintenance ontology concepts. A more scientific power-grid maintenance scheme can then be formulated from the extracted ontology concepts and relations, applying the most appropriate maintenance mode, minimizing the consumption of manpower, property, and material, improving maintenance efficiency, effectively raising the safe-operation level and power-supply reliability of the grid, and increasing the social and economic benefits of the power-supply enterprise.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below, and it should be apparent that the embodiments described below are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The technical solution of the present invention is further explained by the following embodiments.
A relation extraction method combining a long short-term memory (LSTM) model and an attention mechanism proceeds as follows: first, the text information is vectorized and local features of the text are extracted; the local features are then fed into a bidirectional LSTM model, an attention mechanism is introduced to compute the importance of the correlation between the LSTM model's input and output, and the overall features of the text are obtained from this importance; finally, the local and overall features are fused, and a classification result is output through a classifier.
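Under the same assumptions as the pipeline sketch given earlier (the hypothetical RelationExtractor class with illustrative sizes), a toy forward pass would look like:

```python
import torch

# Reuses the RelationExtractor sketch defined earlier; all sizes are illustrative.
model = RelationExtractor(vocab_size=5000, num_classes=8)
tokens = torch.randint(0, 5000, (2, 20))   # a batch of 2 sentences, 20 token ids each
logits = model(tokens)                      # (2, 8) scores over relation classes
predicted = logits.argmax(dim=-1)           # predicted semantic-relation class per sentence
```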
The LSTM is trained using the training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

i_t, f_t, c_t, o_t, h_t are the input gate, forget gate, memory, output gate, and hidden-state variable of the LSTM, respectively. The vector z ∈ R^D is a context vector that captures the information associated with a particular input location, as described below. E ∈ R^(m×K) is the embedding matrix; m and n represent the embedding-matrix and LSTM-network-matrix dimensions, respectively; σ and ⊙ represent the logistic sigmoid activation function and element-wise multiplication, respectively.

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)
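A minimal NumPy sketch of this update rule follows, treating the affine map T_{D+m+n,n} as a single weight matrix that produces all four gate pre-activations; the dimensions and random initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(Ey_prev, h_prev, z_hat, c_prev, W, b):
    """One step: gates from one affine map of [E y_{t-1}; h_{t-1}; z_t],
    then c_t = f ⊙ c_{t-1} + i ⊙ g_t and h_t = o ⊙ tanh(c_t)."""
    n = h_prev.shape[0]
    u = W @ np.concatenate([Ey_prev, h_prev, z_hat]) + b  # affine map T_{D+m+n,n}
    i = sigmoid(u[0*n:1*n])        # input gate
    f = sigmoid(u[1*n:2*n])        # forget gate
    o = sigmoid(u[2*n:3*n])        # output gate
    g = np.tanh(u[3*n:4*n])        # candidate cell state g_t
    c = f * c_prev + i * g         # memory update
    h = o * np.tanh(c)             # hidden state
    return h, c

# Toy dimensions: embedding m, state n, context D.
m, n, D = 6, 4, 8
rng = np.random.default_rng(0)
W, b = rng.normal(size=(4 * n, m + n + D)), np.zeros(4 * n)
h, c = lstm_step(rng.normal(size=m), np.zeros(n), rng.normal(size=D), np.zeros(n), W, b)
```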
Define x_1, x_2, x_3, …, x_{T-1}, x_T as the word joint-vector sequence input to the LSTM neural network. a_tj is the weight the attention mechanism assigns to each feature vector, computed as follows:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

In these formulas: c_{t-1} is the state of the attention model at the previous time step; v_a is a global weight; h_j is the feature-vector sequence output by the LSTM neural network; u_a is the weight of the feature vector at the previous time step; and w_a is the weight of the attention-mechanism state at the previous time step.

The final output state c_t of the attention-mechanism model is calculated as:

c_t = Σ_{j=1}^{T} a_tj h_j

where h_j is the feature-vector sequence output by the LSTM neural network and a_tj is the weight the attention mechanism assigns to each feature vector; a_tj is computed in different ways depending on the models adopted in the encoding and decoding stages.
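A minimal NumPy sketch of these attention formulas, where W_a, U_a, v_a correspond to w_a, u_a, v_a above; matrix shapes and random values are illustrative assumptions.

```python
import numpy as np

def attention(c_prev, H, W_a, U_a, v_a):
    """Additive attention: score each feature vector against the previous
    attention state, softmax-normalize, and form the weighted sum c_t."""
    # e_tj = v_a^T tanh(W_a c_{t-1} + U_a h_j), computed for all j at once
    e = np.tanh(c_prev @ W_a.T + H @ U_a.T) @ v_a   # scores, shape (T,)
    w = np.exp(e - e.max())                          # numerically stable softmax
    a = w / w.sum()                                  # weights a_tj
    return a @ H, a                                  # c_t = sum_j a_tj h_j

# Toy usage: T feature vectors of dimension d, attention dimension k.
T, d, k = 5, 4, 3
rng = np.random.default_rng(1)
H = rng.normal(size=(T, d))
c_t, a = attention(rng.normal(size=d), H, rng.normal(size=(k, d)),
                   rng.normal(size=(k, d)), rng.normal(size=k))
```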
The invention correspondingly provides a text semantic relation extraction system which comprises an acquisition module, a determination module and an output module;
the acquisition module is used for acquiring text information, vectorizing the text information and extracting local features of the text;
the determining module is used for inputting the local features of the text into a pre-trained bidirectional LSTM model, introducing an attention mechanism to calculate the importance of the correlation between the input and the output of the bidirectional LSTM model, and determining the overall features of the text according to the importance;
and the output module is used for carrying out feature fusion on the local features and the overall features and outputting a classification result through the classifier.
The determining module comprises a training module and a judging module, wherein the training module is used for acquiring an LSTM training sample, and acquiring electric network scheduling daily overhaul application ticket data and a scheduling log as the LSTM training sample;
the bidirectional LSTM model is trained using the LSTM training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

wherein i_t, f_t, c_t, o_t, h_t are the input-gate, forget-gate, memory, output-gate, and hidden-state variables of the LSTM, respectively; g_t represents the candidate cell state at the current input; σ denotes the logistic sigmoid activation function and tanh the activation function of the output; T_{D+m+n,n} denotes the affine transformation from the real space R^(D+m+n) to R^n defined by the learned parameters; D denotes the dimension of the sample vectors extracted by the extractor; m and n denote the dimensions of the embedding matrix and the LSTM network matrix, respectively; E denotes the embedding matrix, E ∈ R^(m×K); R denotes the set of real numbers; K denotes the sample vocabulary; y_{t-1} denotes the semantic-paraphrase intermediate variable at the previous time step; ẑ_t denotes a random variable; and z is a context vector, z ∈ R^D;

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

wherein ⊙ denotes element-wise multiplication and c_{t-1} is the state of the attention model at the previous time step.
The training process of the bidirectional LSTM model mainly comprises segmenting the power-grid maintenance information, extracting text entity features, and converting the entity features and text information into word vectors. Feature data of the training samples are then extracted through the trained bidirectional LSTM model, including overhaul-equipment information, fault information, and overhaul mode, where the equipment information comprises the equipment name, equipment type, equipment manufacturer, equipment voltage grade, and the like. A minimal sketch of this preprocessing follows.
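This sketch uses assumed tooling — jieba for Chinese word segmentation and gensim for word-vector training, neither of which the patent names — and the ticket texts shown are invented toy examples, not real dispatch data.

```python
import jieba                         # Chinese word segmentation (assumed tooling)
from gensim.models import Word2Vec   # word-vector training (assumed tooling)

# Invented toy maintenance tickets; real ones would come from the dispatch system.
tickets = [
    "220kV主变压器差动保护动作，申请停电检修",
    "110kV线路开关拒动，更换操作机构",
]
corpus = [jieba.lcut(t) for t in tickets]   # segment each ticket into words

# Train word vectors over the segmented corpus; sizes are illustrative.
w2v = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)
vector = w2v.wv[corpus[0][0]]               # word vector for the first token
```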
The training module includes a memory-variable determination module for determining c_t by the following formulas:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

c_t = Σ_{j=1}^{T} a_tj h_j

wherein a_tj is the weight the attention mechanism assigns to each feature vector; h_j is the feature-vector sequence output by the LSTM neural network; T represents the total number of feature vectors and j indexes the j-th feature vector; exp(e_tj) denotes the exponential function with the natural constant e as base; e_tj is the alignment-model score indicating how well the input at time t matches the output at position j, and e_tk is defined analogously; a denotes the function that computes e_tj; v_a is a global weight; w_a is the weight of the attention-mechanism state at the previous time step; and u_a is the weight of the feature vector at the previous time step.
The LSTM model avoids the long-distance dependency problem of CNNs and RNNs, and the attention mechanism better accounts for the relevance between model input and output, so relation extraction is carried out more effectively.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to methods, devices (systems) according to embodiments of the application. It should be understood that each flow may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart or flowcharts.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart or flowcharts.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart or flowcharts.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; while the invention has been described in detail and with reference to the foregoing examples, those skilled in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (6)
1. A text semantic relation extraction method is characterized in that,
acquiring text information, vectorizing the text information, and extracting local features of the text;
inputting the local features of the text into a pre-trained bidirectional LSTM model, introducing an attention mechanism to calculate the importance of the correlation between the input and the output of the bidirectional LSTM model, and determining the overall features of the text according to the importance;
and performing feature fusion on the local features and the overall features, and outputting a classification result through a classifier.
2. The text semantic relation extraction method according to claim 1, wherein the bidirectional LSTM model is trained as follows:
acquiring an LSTM training sample, and acquiring daily overhaul application ticket data and a scheduling log of power grid scheduling as the LSTM training sample;
the bidirectional LSTM model is trained using the LSTM training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

wherein i_t, f_t, c_t, o_t, h_t are the input-gate, forget-gate, memory, output-gate, and hidden-state variables of the LSTM, respectively; g_t represents the candidate cell state at the current input; σ denotes the logistic sigmoid activation function and tanh the activation function of the output; T_{D+m+n,n} denotes the affine transformation from the real space R^(D+m+n) to R^n defined by the learned parameters; D denotes the dimension of the sample vectors extracted by the extractor; m and n denote the dimensions of the embedding matrix and the LSTM network matrix, respectively; E denotes the embedding matrix, E ∈ R^(m×K); R denotes the set of real numbers; K denotes the sample vocabulary; y_{t-1} denotes the semantic-paraphrase intermediate variable at the previous time step; ẑ_t denotes a random variable; and z is a context vector, z ∈ R^D;

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

wherein ⊙ denotes element-wise multiplication and c_{t-1} is the state of the attention model at the previous time step.
3. The text semantic relation extraction method of claim 2, wherein c_t is determined by the following formulas:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

c_t = Σ_{j=1}^{T} a_tj h_j

wherein a_tj is the weight the attention mechanism assigns to each feature vector; h_j is the feature-vector sequence output by the LSTM neural network; T represents the total number of feature vectors and j indexes the j-th feature vector; exp(e_tj) denotes the exponential function with the natural constant e as base; e_tj is the alignment-model score indicating how well the input at time t matches the output at position j, and e_tk is defined analogously; a denotes the function that computes e_tj; v_a is a global weight; w_a is the weight of the attention-mechanism state at the previous time step; and u_a is the weight of the feature vector at the previous time step.
4. A text semantic relation extraction system is characterized by comprising an acquisition module, a determination module and an output module;
the acquisition module is used for acquiring text information, vectorizing the text information and extracting local features of the text;
the determining module is used for inputting the local features of the text into a pre-trained bidirectional LSTM model, introducing an attention mechanism to calculate the importance of the correlation between the input and the output of the bidirectional LSTM model, and determining the overall features of the text according to the importance;
and the output module is used for carrying out feature fusion on the local features and the overall features and outputting a classification result through the classifier.
5. The text semantic relation extraction system according to claim 4, wherein the determining module comprises a training module for obtaining LSTM training samples, collecting power grid dispatching daily repair application ticket data and dispatching logs as the LSTM training samples;
the bidirectional LSTM model is trained using the LSTM training samples:

(i_t, f_t, o_t, g_t) = (σ, σ, σ, tanh) T_{D+m+n,n}(E y_{t-1}, h_{t-1}, ẑ_t)

wherein i_t, f_t, c_t, o_t, h_t are the input-gate, forget-gate, memory, output-gate, and hidden-state variables of the LSTM, respectively; g_t represents the candidate cell state at the current input; σ denotes the logistic sigmoid activation function and tanh the activation function of the output; T_{D+m+n,n} denotes the affine transformation from the real space R^(D+m+n) to R^n defined by the learned parameters; D denotes the dimension of the sample vectors extracted by the extractor; m and n denote the dimensions of the embedding matrix and the LSTM network matrix, respectively; E denotes the embedding matrix, E ∈ R^(m×K); R denotes the set of real numbers; K denotes the sample vocabulary; y_{t-1} denotes the semantic-paraphrase intermediate variable at the previous time step; ẑ_t denotes a random variable; and z is a context vector, z ∈ R^D;

c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t

h_t = o_t ⊙ tanh(c_t)

wherein ⊙ denotes element-wise multiplication and c_{t-1} is the state of the attention model at the previous time step.
6. The text semantic relation extraction system of claim 5, wherein the training module comprises a memory-variable determination module for determining c_t by the following formulas:

e_tj = a(c_{t-1}, h_j) = v_a^T tanh(w_a c_{t-1} + u_a h_j)

a_tj = exp(e_tj) / Σ_{k=1}^{T} exp(e_tk)

c_t = Σ_{j=1}^{T} a_tj h_j

wherein a_tj is the weight the attention mechanism assigns to each feature vector; h_j is the feature-vector sequence output by the LSTM neural network; T represents the total number of feature vectors and j indexes the j-th feature vector; exp(e_tj) denotes the exponential function with the natural constant e as base; e_tj is the alignment-model score indicating how well the input at time t matches the output at position j, and e_tk is defined analogously; a denotes the function that computes e_tj; v_a is a global weight; w_a is the weight of the attention-mechanism state at the previous time step; and u_a is the weight of the feature vector at the previous time step.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911412034.0A CN111221966A (en) | 2019-12-31 | 2019-12-31 | Text semantic relation extraction method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111221966A true CN111221966A (en) | 2020-06-02 |
Family
ID=70825949
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911412034.0A Pending CN111221966A (en) | 2019-12-31 | 2019-12-31 | Text semantic relation extraction method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111221966A (en) |
- 2019-12-31: CN CN201911412034.0A patent/CN111221966A/en, active, Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190114320A1 (en) * | 2017-10-17 | 2019-04-18 | Tata Consultancy Services Limited | System and method for quality evaluation of collaborative text inputs |
CN109214001A (en) * | 2018-08-23 | 2019-01-15 | 桂林电子科技大学 | A kind of semantic matching system of Chinese and method |
CN109389091A (en) * | 2018-10-22 | 2019-02-26 | 重庆邮电大学 | The character identification system and method combined based on neural network and attention mechanism |
CN109710761A (en) * | 2018-12-21 | 2019-05-03 | 中国标准化研究院 | The sentiment analysis method of two-way LSTM model based on attention enhancing |
CN109902293A (en) * | 2019-01-30 | 2019-06-18 | 华南理工大学 | A kind of file classification method based on part with global mutually attention mechanism |
CN110609897A (en) * | 2019-08-12 | 2019-12-24 | 北京化工大学 | Multi-category Chinese text classification method fusing global and local features |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116934468A (en) * | 2023-09-15 | 2023-10-24 | 成都运荔枝科技有限公司 | Trusted client grading method based on semantic recognition |
CN116934468B (en) * | 2023-09-15 | 2023-12-22 | 成都运荔枝科技有限公司 | Trusted client grading method based on semantic recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |