CN113255366A - Aspect-level text emotion analysis method based on heterogeneous graph neural network - Google Patents
- Publication number
- CN113255366A (application CN202110593991.9A)
- Authority
- CN
- China
- Prior art keywords
- text
- node
- evaluation
- embedded
- nodes
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/253—Grammatical analysis; Style critique
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Abstract
The invention discloses an aspect-level text emotion analysis method based on a heterogeneous graph neural network, belonging to the field of language processing. According to the co-occurrence relations of words and sentences in the text and the evaluation aspects contained in the sentences, a three-level word-sentence-evaluation-aspect graph network is constructed, and the initial embedded vector representation of each node is obtained. A graph attention network is then used to train the model parameters: the embedded vector representations of the nodes are continuously updated through a multi-head attention mechanism according to the connection relations among the nodes in the graph network, and the aspect-level emotional tendency of the text is finally predicted. Based on the finally obtained embedded vector representations of the sentence nodes and evaluation-aspect nodes, their correlation is calculated with a self-attention mechanism, thereby obtaining the predicted aspect-level emotional tendency of the text. The invention effectively improves the expression capability and generalization capability of the model.
Description
Technical Field
The invention belongs to the field of language processing, and particularly relates to an aspect-level text emotion analysis method based on a heterogeneous graph neural network.
Background
Aspect-Based Sentiment Analysis (ABSA) is a fine-grained text sentiment analysis method; compared with traditional sentiment analysis, ABSA provides more detailed and richer sentiment information. Aspect-level sentiment analysis determines the emotional tendency (positive, neutral or negative) of a text toward different aspects, and can be divided into two sub-tasks: aspect-term sentiment analysis (ATSA) and aspect-category sentiment analysis (ACSA). For example, for the text "the food is great, but the service attitude is terrible", ACSA should assign a positive emotion to the evaluation aspect "food" and a negative emotion to the evaluation aspect "service".
Most models at this stage are based on attention mechanisms and neural networks: a neural network captures the semantic information of the text, while an attention mechanism captures information about the evaluation aspect and strengthens the association between the text semantics and the evaluation aspect. Classified by network structure, existing methods fall roughly into four categories: 1. Recurrent neural network (RNN) based methods: Wang et al. propose an attention-based long short-term memory network (LSTM) to generate an embedded representation of the evaluation aspect. 2. Convolutional neural network (CNN) based methods: Xue et al. propose convolutional gating units to extract text features and evaluation-aspect features. 3. Graph neural network (GNN) based methods: Li et al. propose using graph neural networks to model the grammatical structure of the text to assist classification. 4. Pre-training-model-based methods: Sun et al. utilize the pre-trained model BERT to model the semantic relationship between the text and the evaluation aspect. These four technical schemes have the following disadvantages. First, existing models assume that input texts are independently and identically distributed; however, the review-style texts that are the main research object of aspect-level emotion analysis are often strongly correlated, and ignoring this correlation loses a large amount of information and degrades model performance. Second, existing models ignore the structural similarity between texts expressing the same emotion toward the same evaluation aspect, so information cannot be shared between texts and the expression capability of the models is poor. Third, existing models ignore the diversity of semantic expression among texts expressing the same emotion toward the same evaluation aspect, and the loss of this diversity information leads to poor generalization capability.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an aspect-level text emotion analysis method based on a heterogeneous graph neural network.
To achieve the above purpose, the invention adopts the following technical scheme:
an aspect level text emotion analysis method based on a heterogeneous graph neural network comprises the following steps:
(1) constructing a three-level graph network structure of a word-sentence-evaluation aspect according to the co-occurrence relation of the word and the text and the evaluation aspect related in the text;
(2) initializing the embedded vector representation of each node in the graph network structure by using a pre-trained model, respectively obtaining an initial embedded representation matrix X_W of the word nodes, an initial embedded representation matrix X_S of the text nodes, and an initial embedded representation matrix X_A of the evaluation aspects;
(3) adopting a graph attention network (GAT) to train the model, continuously updating the embedded representation of each node in the graph network structure through a multi-head self-attention mechanism according to the graph network structure and the semantic relations of the text, so that the nodes continuously exchange information, thereby obtaining the embedded representation matrix H^(t+1) of each node at step (t+1), and finally obtaining the text embedded representation matrix H_S and the evaluation-aspect embedded representation matrix H_A;
(4) using the text embedded representation matrix H_S and the evaluation-aspect embedded representation matrix H_A, calculating the correlation between the text and each emotional tendency in each evaluation aspect through a self-attention mechanism, taking the emotion with the maximum correlation as the predicted emotion of the text in that evaluation aspect, calculating the difference between the predicted emotional tendency and the true emotional tendency of the text through a loss function, and finally optimizing the model parameters through back propagation until the closeness between the predicted and the true emotional tendency is within a preset range, obtaining a trained model;
(5) inputting the text to be classified into the trained model for feature extraction, calculating the correlation between the extracted text feature vector and the trained evaluation-aspect vectors by means of a self-attention mechanism, and finally classifying with a softmax classifier.
Further, the graph network structure G is represented as: G = {V_w, V_s, V_a, E_ws, E_sa};
wherein V_w represents the word nodes contained in the text; V_s represents the text nodes; V_a represents the evaluation-aspect nodes; E_ws represents the edges between word nodes and text nodes, whose weights represent the positions at which the words appear in the text; E_sa represents the edges between text nodes and evaluation-aspect nodes.
Further, the specific operation of initializing the word node embedding vector in step (2) is as follows:
aiming at word nodes in a graph network structure, initializing the word nodes by using a pre-trained GloVe word vector library to obtain word embedded vectors, and splicing all the word embedded vectors to obtain a word initial embedded matrix.
Further, the specific operation of initializing the text node embedding vector in the step (2) is as follows:
aiming at a text node in a graph network structure, initializing the text node by using a pre-training language model BERT to obtain an initial embedded vector, and splicing all the initial embedded vectors of the text to obtain an initial embedded matrix of the text.
Further, the specific operation of initializing the node embedding vector in the evaluation aspect in the step (2) is as follows:
aiming at evaluation nodes in a graph network structure, coding the evaluation nodes by using one-hot coding, mapping coding vectors to a feature space by using a layer of full-connection network FCN with learnable parameters to obtain initial embedded vectors of the evaluation nodes, and splicing all the evaluation initial embedded vectors to obtain an evaluation initial embedded matrix.
Further, the specific operation of updating the embedded representation of each node in the graph network structure in the step (3) is as follows:
for the embedding vector h_i of a given node in the graph network structure and the neighbor set N_i connected to said given node, a multi-head attention mechanism is used to obtain the new embedded vector representation ĥ_i of the given node.
The embedded representation of a given node n at step t is noted h_n^(t), the embedded representation of the neighbors of the given node at step t is noted H_{N_n}^(t), and the embedding of the given node n at step (t+1) is noted h_n^(t+1); based on the new embedded vector representation ĥ_n^(t+1) of the given node, the relationship among the three is:
h_n^(t+1) = FFN( ĥ_n^(t+1) || h_n^(t) )   (5)
Based on the initial embedded matrices corresponding to the word nodes, text nodes and evaluation-aspect nodes in the graph network structure, the embedded representation matrix H^(t+1) of each node at step (t+1) is obtained through repeated iteration of formula (5).
Further, in step (4), the correlation between the text and each emotional tendency in the evaluation aspect is calculated by the following formulas:
β_ij = exp( (h_i^s)^T h_j^a ) / Σ_{j'} exp( (h_i^s)^T h_{j'}^a )
ĥ_ij = β_ij h_i^s
ŷ_ij = softmax( W_a ĥ_ij + b_a )
wherein h_i^s is the embedded vector of the i-th text node in H_S; h_j^a is the embedded vector of the j-th evaluation aspect in H_A; β_ij is the attention weight between the text node vector and the evaluation-aspect node vector; ĥ_ij is the text node embedded representation after attention weighting; ŷ_ij is the probability distribution over the emotional tendencies of the predicted text in the current evaluation aspect; W_a and b_a are learnable parameters; softmax() is the exponential normalization function, calculated as softmax(x)_c = exp(x_c) / Σ_{c'} exp(x_{c'}).
Further, in step (4), the predicted text emotional-tendency distribution ŷ_ij is compared with the true text emotion label y_ij, and their difference is calculated through the cross-entropy loss function; the loss over all samples is the sum over all text nodes i and all evaluation-aspect nodes j:
L = - Σ_i Σ_j y_ij^T log ŷ_ij
Finally, the model parameters are continuously updated through a back-propagation algorithm until the degree of closeness between the predicted emotional tendency and the true emotional tendency of the text is within a preset range.
Compared with the prior art, the invention has the following beneficial effects:
the invention relates to an aspect level text emotion analysis method based on a heterogeneous graph neural network, which comprises the steps of constructing a three-level graph structure network of a word-sentence-evaluation aspect according to the co-occurrence relation of words and sentences in a text and the evaluation aspect contained in the sentences; then generating embedded representation of each node in the graph network, and respectively initializing word nodes, sentence nodes and evaluation nodes in the graph network by using a pre-trained language model so as to obtain initial embedded vector representation of each node; and then using the parameters of the graph attention network training model, continuously updating the embedded vector representation of the nodes in the graph network according to the connection relation of each node in the graph network through a multi-head attention mechanism, and finally predicting the aspect-level emotional tendency of the text. And calculating the correlation between the sentence nodes and the evaluation nodes by using a self-attention mechanism according to the finally obtained embedded vector representation of the sentence nodes and the evaluation nodes, thereby obtaining the predicted text aspect level emotional tendency. According to the method, the structural similarity information and semantic expression diversity information between texts with the same evaluation aspect and emotional tendency are captured by means of the graph neural network, embedded vector representation of the texts and nodes in the evaluation aspect is obtained through model training, and the expression capability and generalization capability of the model are effectively improved.
Drawings
Fig. 1 is a schematic diagram of a network structure of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Aiming at the problems of the existing model, the invention provides a method for modeling the relationship between texts with the same evaluation aspect and emotional tendency and the relationship between the texts and the evaluation aspect by combining a graph neural network, so that the model can learn the structural similarity characteristics between similar texts, thereby improving the expression capability of the model; meanwhile, the model can learn the semantic diversity characteristics among texts, so that the generalization capability of the model is improved.
The invention is described in further detail below with reference to the accompanying drawings:
Step 1: construct a three-level word-sentence-evaluation-aspect graph network structure according to the co-occurrence relations between words and texts and the evaluation aspects involved in the texts;
graph structure G is represented as: g ═ Vw,Vs,Va,Ews,EsaIn which VwRepresenting nodes of words, V, contained in the textsRepresenting nodes of text, VaRepresenting nodes in evaluation, EwsRepresenting the edges between the word nodes and the text nodes, the weights of which represent the positions of the words appearing in the text, EsaRepresenting an edge between the text node and the evaluation aspect node;
Step 2: for the word nodes in the graph network structure, initialize them with pre-trained GloVe word vectors to obtain word embedding vectors, and concatenate all word embedding vectors to obtain the word initial embedding matrix;
for the word w, the GloVe word vector library is consulted through the sequence number of the word w in the dictionary to obtain the initial embedded vector of the word w asWherein d iswFor the dimension of the word embedding vector, all word embedding vectors are spliced to obtain a word initial embedding matrixWhere n is the number of words.
Step 3: initialize the text nodes in the graph network structure with the pre-trained language model BERT to obtain the initial embedded vectors of the texts, and concatenate all text initial embedded vectors to obtain the text initial embedding matrix;
for the text s ═ w1,w2,…,wlWherein w isi(i e 1 … l) are the words that make up the text, l is the sentence length, and the initial embedded vector for the text s is:
Xs=MeanPooling(BERT(s))#(1)
wherein, MeanPooling represents that the final output of the BERT model is subjected to average pooling,dssplicing all the text initial embedded vectors for the dimensionality of the text embedded vectors to obtain a text initial embedded matrixWhere m is the number of texts.
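The pooling in formula (1) collapses BERT's final hidden states (one d_s-dimensional vector per token) into a single text vector. A sketch of just that step, with `token_states` standing in for the encoder output (the actual method runs a pre-trained BERT model, which is not reproduced here):

```python
import numpy as np

def mean_pool(token_states):
    """Average the (l, d_s) matrix of final hidden states over the
    token axis, yielding the text's initial embedded vector X_s."""
    return token_states.mean(axis=0)
```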
Step 4: for the evaluation-aspect nodes in the graph network structure, encode them with one-hot encoding, map the encoded vectors to the feature space with a one-layer fully-connected network (FCN) with learnable parameters to obtain the initial embedded vectors of the evaluation aspects, and concatenate all evaluation-aspect initial embedded vectors to obtain the evaluation-aspect initial embedding matrix;
for the evaluation-side node a, the initial embedded vector is:
Xa=FCN(OneHot(a))#(2)
wherein OneHot represents one-hot encoding; the FCN represents a fully-connected network,daembedding the dimensions of the vector for evaluation; splicing all the initial embedded vectors in the evaluation aspect to obtain an initial embedded matrix in the evaluation aspectk is the number of evaluation flanks.
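Since OneHot(a) selects a single row, formula (2) amounts to indexing the FCN weight matrix and adding the bias. A sketch (illustrative names; W and b stand for the learnable FCN parameters):

```python
import numpy as np

def aspect_embedding(a_idx, k, W, b):
    """X_a = FCN(OneHot(a)): one-hot encode aspect a_idx out of k
    aspects, then apply one fully connected layer with weight matrix
    W of shape (k, d_a) and bias b of shape (d_a,)."""
    onehot = np.zeros(k)
    onehot[a_idx] = 1.0
    return onehot @ W + b
```

Stacking the k outputs row-wise gives the evaluation-aspect initial embedding matrix X_A.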
Step 5: for the embedding vector h_i of a given node in the graph network structure and its connected neighbor set N_i, a multi-head attention mechanism is used to obtain the new embedded vector ĥ_i of the given node:
ĥ_i = ||_{n=1}^{N} σ( Σ_{j∈N_i} α_ij^n W_n h_j )   (3)
where || denotes the vector concatenation operation, σ denotes the ReLU activation function, W_n are learnable parameters, and α_ij^n is the attention score of node i and node j at the n-th head, calculated as:
α_ij^n = softmax_j( LeakyReLU( a_n^T [ W_n h_i || W_n h_j ] ) + e_ij )   (4)
where a_n is a learnable parameter and e_ij, the weight of the edge between node i and node j, depends on where the word node appears in the text.
The embedded representation of a given node n at step t is noted h_n^(t), the embedded representation of its neighbors at step t is noted H_{N_n}^(t), and the embedding of node n at step (t+1) is noted h_n^(t+1); the multi-head attention output ĥ_n^(t+1) is calculated with formula (3).
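A single attention head of formulas (3)-(4) can be sketched in NumPy as follows. How the edge weight e_ij enters the score is an assumption (added to the pre-softmax score here), and all names are illustrative:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_head(h_i, neighbors, W, a_vec, e_i):
    """One head of formulas (3)-(4): score each neighbor j with
    LeakyReLU(a^T [W h_i || W h_j]) + e_ij, normalize the scores with
    softmax, aggregate the transformed neighbors, and apply ReLU."""
    Wh_i = W @ h_i
    scores = []
    for j, h_j in enumerate(neighbors):
        z = np.concatenate([Wh_i, W @ h_j])
        s = float(a_vec @ z)
        s = s if s > 0 else 0.2 * s  # LeakyReLU with slope 0.2
        scores.append(s + e_i[j])    # assumed additive edge-weight bias
    alpha = softmax(np.array(scores))
    out = sum(a * (W @ h_j) for a, h_j in zip(alpha, neighbors))
    return np.maximum(out, 0.0)      # sigma = ReLU, as stated above
```

The full multi-head version would run N such heads with separate W_n and a_n and concatenate their outputs.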
Step 6: for the word nodes, text nodes and evaluation-aspect nodes in the graph network structure, given the corresponding initial embedding matrices, the embedding H^(t) at step t is obtained through repeated iteration; the embedding at step (t+1) is then calculated as:
H^(t+1) = FFN( Ĥ^(t+1) || H^(t) )   (5)
where Ĥ^(t+1) is the calculated intermediate value (the multi-head attention output of formula (3)); FFN() is a feedforward network whose input is the concatenation of this intermediate value and the residual of the embedded representation at step t, which greatly improves the expression capability of the model and speeds up convergence. FFN is calculated as:
FFN(x) = max(0, xW_1 + b_1)W_2 + b_2
where max(0, ·) denotes the element-wise maximum with zero (i.e., ReLU), and W_1, W_2, b_1, b_2 are learnable parameters.
Step 7: given the final text embedded representation matrix H_S and the evaluation-aspect embedded representation matrix H_A, the correlation between the two is calculated with a self-attention mechanism:
β_ij = exp( (h_i^s)^T h_j^a ) / Σ_{j'} exp( (h_i^s)^T h_{j'}^a )
ĥ_ij = β_ij h_i^s
ŷ_ij = softmax( W_a ĥ_ij + b_a )
where h_i^s is the embedded vector of the i-th text node in H_S; h_j^a is the embedded vector of the j-th evaluation aspect in H_A; β_ij is the attention weight between the text node vector and the evaluation-aspect node vector, representing the correlation weight distribution between the two; ĥ_ij is the text node embedded representation after attention weighting; ŷ_ij is the probability distribution over the emotional tendencies of the predicted text in the current evaluation aspect; W_a and b_a are learnable parameters; softmax() is the exponential normalization function, calculated as softmax(x)_c = exp(x_c) / Σ_{c'} exp(x_{c'}).
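Under the assumption of dot-product attention scores between text and aspect vectors (one plausible reading of the self-attention step; all names are illustrative), the prediction can be sketched as:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predict(H_S, H_A, W_a, b_a):
    """For each text i and aspect j: attention weight beta_ij over the
    aspects, attention-weighted text vector, then a softmax over the C
    sentiment classes. Returns an (m, k, C) array of distributions."""
    beta = softmax(H_S @ H_A.T)  # (m, k) attention weights
    m, k, C = H_S.shape[0], H_A.shape[0], b_a.shape[0]
    preds = np.empty((m, k, C))
    for i in range(m):
        for j in range(k):
            h_hat = beta[i, j] * H_S[i]          # weighted text node
            preds[i, j] = softmax(h_hat @ W_a + b_a)
    return preds
```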
and 8: distributing predicted text emotional tendencyWith text true emotion labelComparing, passing through cross entropy loss functionThe difference between the two is calculated and the loss over all samples is the sum over all text nodes i and all evaluation-aspect nodes j:
and finally, continuously updating model parameters through a back propagation algorithm, so that the predicted emotional tendency of the text is continuously close to the real emotional tendency of the text.
Step 9: model training
The gradient is updated by using an Adam optimizer, the learning rate is set to be 0.001, the first-order momentum parameter of Adam is 0.1, the second-order momentum parameter is 0.999, the number of data set training iterations (Epoch) is set to be 200, the parameters of the pre-trained BERT model are fixed, and the pre-trained GloVe word vector is 300-dimensional.
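A single Adam update with the hyperparameters reported above can be sketched as follows (plain NumPy; in practice a framework optimizer would be used; the reported first-order momentum of 0.1 is unusually low compared with the common default of 0.9, but it is kept here as stated):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.1, beta2=0.999, eps=1e-8):
    """One Adam update with bias correction; t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```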
Model use:
and (3) performing feature extraction on the text input model to be classified, calculating the correlation between the extracted text feature vector and the trained evaluation aspect vector by adopting a self-attention mechanism, and finally classifying by using a softmax classifier.
Referring to fig. 1, fig. 1 is a schematic diagram of a network model of the present invention, which mainly includes a node-embedded representation initialization module, a graph attention module and a prediction module. The node embedded representation initialization module is used for initializing embedded representations of words, texts and evaluation nodes; the graph attention module is used for iteratively updating the embedded representation of the network node; the prediction module uses the final node embedding to represent the emotional tendency of the predicted text.
To measure model performance, comparative experiments were performed on five widely used public datasets; the training/test split of each dataset and the number of texts with each emotion are shown in Table 1. Table 2 shows the results of the comparison experiments against thirteen common models on accuracy (Acc.) and F1. The models HAGNN-GloVe and HAGNN-BERT of the present invention achieve the best results on most indexes and are greatly improved in performance over conventional methods.
TABLE 1 statistical information for data sets used to measure model performance
Table 2 shows the accuracy (Acc.) and F1 values of the comparison model on different data sets, where HAGNN-GloVe and HAGNN-BERT are two methods of the present invention, which employ different initialization data.
Table 2 comparison of model accuracy (Acc.) and F1 values on different datasets
The above-mentioned contents are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modification made on the basis of the technical idea of the present invention falls within the protection scope of the claims of the present invention.
Claims (8)
1. An aspect-level text emotion analysis method based on a heterogeneous graph neural network is characterized by comprising the following steps of:
(1) constructing a three-level graph network structure of a word-sentence-evaluation aspect according to the co-occurrence relation of the word and the text and the evaluation aspect related in the text;
(2) initializing the embedded vector representation of each node in the graph network structure by using a pre-trained model, respectively obtaining an initial embedded representation matrix X_W of the word nodes, an initial embedded representation matrix X_S of the text nodes, and an initial embedded representation matrix X_A of the evaluation aspects;
(3) adopting a graph attention network (GAT) to train the model, continuously updating the embedded representation of each node in the graph network structure through a multi-head self-attention mechanism according to the graph network structure and the semantic relations of the text, so that the nodes continuously exchange information, thereby obtaining the embedded representation matrix H^(t+1) of each node at step (t+1), and finally obtaining the text embedded representation matrix H_S and the evaluation-aspect embedded representation matrix H_A;
(4) using the text embedded representation matrix H_S and the evaluation-aspect embedded representation matrix H_A, calculating the correlation between the text and each emotional tendency in each evaluation aspect through a self-attention mechanism, taking the emotion with the maximum correlation as the predicted emotion of the text in that evaluation aspect, calculating the difference between the predicted emotional tendency and the true emotional tendency of the text through a loss function, and finally optimizing the model parameters through back propagation until the closeness between the predicted and the true emotional tendency is within a preset range, obtaining a trained model;
(5) inputting the text to be classified into the trained model for feature extraction, calculating the correlation between the extracted text feature vector and the trained evaluation-aspect vectors by means of a self-attention mechanism, and finally classifying with a softmax classifier.
2. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein the graph network structure G is represented as: G = {V_w, V_s, V_a, E_ws, E_sa};
wherein V_w represents the word nodes contained in the text; V_s represents the text nodes; V_a represents the evaluation-aspect nodes; E_ws represents the edges between word nodes and text nodes, whose weights represent the positions at which the words appear in the text; E_sa represents the edges between text nodes and evaluation-aspect nodes.
3. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein the specific operation of initializing the word node embedding vectors in step (2) is as follows:
aiming at word nodes in a graph network structure, initializing the word nodes by using a pre-trained GloVe word vector library to obtain word embedded vectors, and splicing all the word embedded vectors to obtain a word initial embedded matrix.
4. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein the specific operation of initializing the text node embedding vectors in step (2) is as follows:
aiming at a text node in a graph network structure, initializing the text node by using a pre-training language model BERT to obtain an initial embedded vector, and splicing all the initial embedded vectors of the text to obtain an initial embedded matrix of the text.
5. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein the specific operation of initializing the evaluation-aspect node embedding vectors in step (2) is as follows:
aiming at evaluation nodes in a graph network structure, coding the evaluation nodes by using one-hot coding, mapping coding vectors to a feature space by using a layer of full-connection network FCN with learnable parameters to obtain initial embedded vectors of the evaluation nodes, and splicing all the evaluation initial embedded vectors to obtain an evaluation initial embedded matrix.
6. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein the specific operation of updating the node embedded representations in the graph network structure in step (3) is as follows:
for the embedding vector h_i of a given node in the graph network structure and the neighbor set N_i connected to said given node, a multi-head attention mechanism is used to obtain the new embedded vector representation ĥ_i of the given node;
the embedded representation of a given node n at step t is noted h_n^(t), the embedded representation of the neighbors of the given node at step t is noted H_{N_n}^(t), and the embedding of the given node n at step (t+1) is noted h_n^(t+1); based on the new embedded vector representation ĥ_n^(t+1) of the given node, the relationship among the three is:
h_n^(t+1) = FFN( ĥ_n^(t+1) || h_n^(t) )
7. The method for analyzing aspect-level text emotion based on a heterogeneous graph neural network of claim 1, wherein in step (4) the correlation between the text and each emotional tendency of the evaluation aspect is calculated by the following formulas:
β_ij = exp( (h_i^s)^T h_j^a ) / Σ_{j'} exp( (h_i^s)^T h_{j'}^a )
ĥ_ij = β_ij h_i^s
ŷ_ij = softmax( W_a ĥ_ij + b_a )
wherein h_i^s is the embedded vector of the i-th text node in H_S; h_j^a is the embedded vector of the j-th evaluation aspect in H_A; β_ij is the attention weight between the text node vector and the evaluation-aspect node vector; ĥ_ij is the text node embedded representation after attention weighting; ŷ_ij is the probability distribution over the emotional tendencies of the predicted text in the current evaluation aspect; W_a and b_a are learnable parameters; softmax() is the exponential normalization function, calculated as softmax(x)_c = exp(x_c) / Σ_{c'} exp(x_{c'}).
8. The aspect-level text emotion analysis method based on a heterogeneous graph neural network as claimed in claim 7, wherein in step (4) the predicted emotional tendency distribution ŷ_ij of the text is compared with the true emotion label y_ij of the text; the difference between the two is measured by the cross-entropy loss function, and the loss over all samples is the sum over all text nodes i and all evaluation-aspect nodes j:

L = − Σ_i Σ_j y_ij · log ŷ_ij
Finally, the model parameters are continuously updated through the back-propagation algorithm until the gap between the predicted emotional tendency and the true emotional tendency of the text falls within a preset range.
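The cross-entropy objective of claim 8 can be sketched numerically as follows; the shapes and probability values are illustrative, and the back-propagation parameter updates are omitted:

```python
import numpy as np

# Predicted distributions y_hat[i, j] over 3 polarities for each
# (text node i, evaluation-aspect node j) pair, and one-hot true labels.
y_hat = np.array([[[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]]])           # shape (1 text, 2 aspects, 3)
y_true = np.array([[[1, 0, 0],
                    [0, 1, 0]]], dtype=float)

# Cross-entropy summed over all text nodes i and aspect nodes j:
#   L = - sum_i sum_j y_ij . log(y_hat_ij)
loss = -np.sum(y_true * np.log(y_hat))
```

With one-hot labels, only the log-probability assigned to the correct polarity of each (text, aspect) pair contributes to the sum.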
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110593991.9A CN113255366B (en) | 2021-05-28 | 2021-05-28 | Aspect-level text emotion analysis method based on heterogeneous graph neural network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113255366A (en) | 2021-08-13
CN113255366B (en) | 2022-12-09
Family
ID=77185252
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110593991.9A Active CN113255366B (en) | 2021-05-28 | 2021-05-28 | Aspect-level text emotion analysis method based on heterogeneous graph neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113255366B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114491029A (en) * | 2022-01-18 | 2022-05-13 | 四川大学 | Short text similarity calculation method based on graph neural network |
CN116386895A (en) * | 2023-04-06 | 2023-07-04 | 之江实验室 | Epidemic public opinion entity identification method and device based on heterogeneous graph neural network |
CN116662554A (en) * | 2023-07-26 | 2023-08-29 | 之江实验室 | Infectious disease aspect emotion classification method based on heterogeneous graph convolution neural network |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111488734A (en) * | 2020-04-14 | 2020-08-04 | 西安交通大学 | Emotional feature representation learning system and method based on global interaction and syntactic dependency |
CN112131886A (en) * | 2020-08-05 | 2020-12-25 | 浙江工业大学 | Method for analyzing aspect level emotion of text |
CN112560432A (en) * | 2020-12-11 | 2021-03-26 | 中南大学 | Text emotion analysis method based on graph attention network |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109284506B (en) | User comment emotion analysis system and method based on attention convolution neural network | |
CN111274398B (en) | Method and system for analyzing comment emotion of aspect-level user product | |
CN113255366B (en) | Aspect-level text emotion analysis method based on heterogeneous graph neural network | |
CN112733541A (en) | Named entity identification method of BERT-BiGRU-IDCNN-CRF based on attention mechanism | |
CN109614471B (en) | Open type problem automatic generation method based on generation type countermeasure network | |
CN111414461B (en) | Intelligent question-answering method and system fusing knowledge base and user modeling | |
CN112667818B (en) | GCN and multi-granularity attention fused user comment sentiment analysis method and system | |
CN110222163A (en) | A kind of intelligent answer method and system merging CNN and two-way LSTM | |
CN112232087B (en) | Specific aspect emotion analysis method of multi-granularity attention model based on Transformer | |
CN112560432A (en) | Text emotion analysis method based on graph attention network | |
CN112784532B (en) | Multi-head attention memory system for short text sentiment classification | |
CN111291556A (en) | Chinese entity relation extraction method based on character and word feature fusion of entity meaning item | |
CN114757182A (en) | BERT short text sentiment analysis method for improving training mode | |
CN113705238B (en) | Method and system for analyzing aspect level emotion based on BERT and aspect feature positioning model | |
CN115831102A (en) | Speech recognition method and device based on pre-training feature representation and electronic equipment | |
CN114547299A (en) | Short text sentiment classification method and device based on composite network model | |
CN112988970A (en) | Text matching algorithm serving intelligent question-answering system | |
CN114254645A (en) | Artificial intelligence auxiliary writing system | |
CN114036298B (en) | Node classification method based on graph convolution neural network and word vector | |
CN115935975A (en) | Controllable-emotion news comment generation method | |
CN114722835A (en) | Text emotion recognition method based on LDA and BERT fusion improved model | |
CN112989803B (en) | Entity link prediction method based on topic vector learning | |
JPH0934863A (en) | Information integral processing method by neural network | |
CN112560440A (en) | Deep learning-based syntax dependence method for aspect-level emotion analysis | |
CN113342964B (en) | Recommendation type determination method and system based on mobile service |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||