CN114781352A - Emotion analysis method based on association between grammar dependency type and aspect - Google Patents


Info

Publication number
CN114781352A
CN114781352A (application CN202210373785.1A; granted publication CN114781352B)
Authority
CN
China
Prior art keywords
sentence
representation
aspects
inter
target
Prior art date
Legal status: Granted
Application number
CN202210373785.1A
Other languages
Chinese (zh)
Other versions
CN114781352B (en)
Inventor
Liu Hui (刘辉)
Ma Xiang (马祥)
Current Assignee
Chongqing University of Posts and Telecommunications
Original Assignee
Chongqing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Chongqing University of Posts and Telecommunications
Priority to CN202210373785.1A
Publication of CN114781352A
Application granted; publication of CN114781352B
Legal status: Active
Anticipated expiration

Classifications

    • G06F40/205 Natural language analysis; Parsing
    • G06F16/355 Class or cluster creation or modification
    • G06F40/253 Grammatical analysis; Style critique
    • G06F40/279 Recognition of textual entities
    • G06F40/284 Lexical analysis, e.g. tokenisation or collocates
    • G06N3/045 Neural networks; Combinations of networks


Abstract

The invention discloses an emotion analysis method based on the association between grammar dependency types and aspects, comprising: S1, obtaining the text to be analyzed and converting it into word-vector representations through a pre-training model; S2, concatenating the specific aspect with each word of the sentence representation, introducing grammar dependency types, and giving greater weight to the important dependency types in the grammar dependency tree; S3, using the aspect-aware, dependency-type-aware sentence representation as the input of the model; S4, introducing an inter-aspect association matrix and acquiring, through a graph convolution network, a sentence representation containing the inter-aspect associations; S5, merging the sentence representation containing the inter-aspect associations with the sentence representation containing the aspect features; and S6, after the sentence representation containing both the aspect information and the inter-aspect associations is obtained, judging the emotion polarity of the target aspect in combination with the query vector. The invention improves the model's ability to recognize inter-aspect information and grammatical dependency relations.

Description

Emotion analysis method based on association between grammar dependency type and aspect
Technical Field
The invention relates to the field of natural language processing, and in particular to an emotion analysis method based on the association between grammar dependency types and aspects.
Background
The development of electronic commerce generates a large number of review texts with emotional polarity; these reviews have important commercial value and have attracted researchers to the study of emotion analysis. In general, a sentence contains several different aspects, which may carry the same or different emotions. Past work on aspect-level emotion analysis has generally considered each aspect independently, severing the emotional connections between aspects and limiting the analysis; in some cases, the emotion of a target aspect can only be judged with the help of the emotions of other aspects. For example, in the sentence "The menu is very limited - I think we counted 4 or 5 entries", the aspect word "entries" has no explicit emotion word, and its emotion cannot be judged from the second half of the sentence alone, but it can be judged to be negative through the negative emotion of the aspect word "menu". In recent years, research on aspect-level emotion analysis has developed rapidly. Han Hu et al. propose using a knowledge graph to incorporate background knowledge into the text, providing abundant contextual information for sentences. Li Zhang et al. propose a BERT-based memory network model that fully interacts the output of the memory network with the attention of the [CLS] vectors of the aspect terms, avoiding the loss of important information. Another line of work screens the initially acquired features with a self-attention mechanism, feeds the screened features to CNNs with different convolution kernels to extract different local features, and finally filters out the important information through the self-attention mechanism again. Lin et al. propose using position information to select the features of a deep memory network and design a cross-aspect module to obtain the emotional associations between aspects.
Liang et al. correct the syntactic dependency tree, center it on the aspect words to enhance the dependency graph, and construct an inter-aspect association graph to obtain the emotional dependencies between aspects. However, most existing research focuses only on the emotion of a single aspect of a sentence and ignores the relationships between aspects. Meanwhile, most research uses the original syntactic dependency tree and does not consider the influence of different dependency types on emotion polarity.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an emotion analysis method based on the association between grammar dependency types and aspects. It solves the problem that traditional methods cannot consider the emotional associations between different aspects; at the same time, by introducing grammar dependency types, the dependency types that are more important for judging the target aspect's emotion can be given greater weight, so that inter-aspect emotional information and grammatical information are effectively combined, improving the accuracy and stability of aspect-level emotion analysis across different data sets.
In order to achieve the purpose, the technical scheme of the invention is as follows: an emotion analysis method based on association between a grammatical dependency type and an aspect, the method specifically comprising the steps of:
S1, obtaining the text to be analyzed and converting it into word-vector representations through a pre-training model;
S2, concatenating the specific aspect with each word of the sentence representation so that the sentence representation focuses on the emotion of only one specific aspect at a time; introducing grammar dependency types and giving greater weight to the important dependency types in the grammar dependency tree; obtaining a sentence representation containing more grammatical information through a graph convolution network; and then refining the aspect-aware, dependency-type-aware sentence representation with an attention layer;
S3, using the aspect-aware, dependency-type-aware sentence representation as the input of the model, with the target aspect as the query vector; for the non-target aspects, the aspect features are further propagated through the sentence representation by a Bi-GRU, and inter-aspect attention is then computed between the resulting sentence representation containing more aspect features and the query vector;
S4, introducing an inter-aspect association matrix and acquiring, through a graph convolution network, a sentence representation containing the inter-aspect associations;
S5, merging the sentence representation containing the inter-aspect associations with the sentence representation containing the aspect features, and using an inter-aspect association coefficient to control how much inter-aspect association information is introduced;
S6, after the sentence representation containing both the aspect information and the inter-aspect associations is obtained, using the inter-aspect attention to control the influence of the non-target aspects on the target aspect to obtain the final sentence representation, and judging the emotion polarity of the target aspect in combination with the query vector.
Preferably, in step S2, greater weight is given to the important dependency types in the grammar dependency tree as follows: a sentence representation containing aspect information is obtained with a Bi-GRU; grammar dependency types are then introduced and a sentence representation containing grammatical features is obtained through a graph convolution network; and an attention layer is adopted to expand the influence of the words that play an important role in judging the emotion of the specific aspect.
Preferably, the sentence representation containing the aspect information is obtained as follows: the aspect word a_i is embedded into the word-vector representation of the sentence, resulting in a word-vector representation that carries the aspect word; this representation is then passed through a Bi-GRU, denoted GRU1, to obtain its hidden representation.
The graph convolution network focusing on grammar dependency types is constructed as follows. First, SpaCy is used to obtain the syntactic dependency information, which can be represented as a list of dependency triples (w_i, w_j, c_ij), where c_ij denotes the dependency type between word w_i and another word w_j. The dependency information is then represented by an adjacency matrix A, whose element a_ij denotes the relationship between w_i and w_j: a_ij = 1 if an edge connects them, and a_ij = 0 otherwise.
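The construction above can be sketched in a few lines. This is a minimal illustration, assuming the dependency triples have already been extracted (e.g. with SpaCy via `token.i`, `token.head.i`, and `token.dep_`); the function and variable names are illustrative, not from the patent.

```python
def build_matrices(n_words, triples, type_vocab):
    """Build the adjacency matrix A (a_ij = 1 iff an edge connects w_i and w_j)
    and a parallel matrix of dependency-type ids (0 = no relation)."""
    A = [[0] * n_words for _ in range(n_words)]
    T = [[0] * n_words for _ in range(n_words)]
    for i, j, dep in triples:
        A[i][j] = A[j][i] = 1          # undirected edge between w_i and w_j
        t = type_vocab.setdefault(dep, len(type_vocab) + 1)
        T[i][j] = T[j][i] = t          # shared id for the dependency type c_ij
    return A, T

# "The menu is very limited": limited --nsubj--> menu, limited --advmod--> very, ...
triples = [(4, 1, "nsubj"), (4, 3, "advmod"), (4, 2, "cop"), (1, 0, "det")]
A, T = build_matrices(5, triples, {})
```

Each type id would then index an embedding table, giving the type embedding used in the edge-weight computation that follows.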
Each element c_ij of the dependency-type matrix is mapped to its word embedding. Each edge is then given a different weight, computed from the hidden states of the two connected words together with the dependency-type embedding. [The three weighting formulas are rendered as images in the original publication.]
Here h_i^(l-1) and h_j^(l-1) denote the hidden states of words w_i and w_j at layer (l-1); the initial hidden states h_i^(0) and h_j^(0) are the outputs of GRU1.
final word wiThe output at layer l via the graph-convolution network is as follows:
Figure BDA0003585115820000043
wherein, W(l)And b(l)Representing trainable parameters in a l-level graph convolutionNumber, σ denotes the activation function ReLU; representation of sentences with aspect information obtained after TFGCN
Figure BDA0003585115820000044
Representing;
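The layer update above can be sketched with NumPy. This is a minimal sketch, assuming the type-aware edge weights g_ij have already been computed; the function name and shapes are illustrative.

```python
import numpy as np

def gcn_layer(H, G, W, b):
    """One graph-convolution layer: h_i^(l) = ReLU(sum_j g_ij * W h_j^(l-1) + b).
    H: (n, d_in) hidden states from the previous layer (initially the GRU1 output),
    G: (n, n) edge weights g_ij (type-aware weights; assumed precomputed),
    W, b: trainable parameters of this layer."""
    M = H @ W                    # (n, d_out): W h_j for every node j
    out = G @ M + b              # weighted neighbourhood aggregation
    return np.maximum(out, 0.0)  # ReLU

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
G = np.abs(rng.normal(size=(5, 5)))   # stand-in for the learned edge weights
W = rng.normal(size=(8, 8)); b = np.zeros(8)
H1 = gcn_layer(H, G, W, b)
```

Stacking several such layers (feeding H1 back in with fresh W, b) propagates type-weighted syntactic information over multiple hops.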
in representing facet-aware sentence representations, an attention layer is introduced to improve facet-aware sentence representation
Figure BDA0003585115820000045
The specific operation is shown in the following formula:
Figure BDA0003585115820000046
α=softmax(z),
Figure BDA0003585115820000047
wherein
Figure BDA0003585115820000048
Figure BDA0003585115820000049
bsIs a scalar.
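A generic version of this attention layer can be sketched as follows. The exact scoring formula is an image in the original, so a common tanh score with a weight vector w_s and scalar bias b_s is assumed here; only α = softmax(z) is stated in the text.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_refine(H, w_s, b_s):
    """Aspect-aware attention sketch: score each hidden state, normalise the
    scores with softmax (alpha = softmax(z), as in the text), and return the
    weighted sum of hidden states."""
    z = np.tanh(H @ w_s + b_s)   # one score per word (assumed form)
    alpha = softmax(z)
    return alpha @ H             # weighted aggregation

H = np.eye(3)                    # toy hidden states
r = attention_refine(H, np.ones(3), 0.0)
```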
Preferably, the computation of the inter-aspect attention in step S3 is as follows: the aspect-aware, dependency-type-aware sentence representation is taken as input and passed through a Bi-GRU, which further propagates the aspect information through the sentence representation to obtain a sentence representation containing more aspect features; the target aspect is taken as the query vector, and the attention between the target aspect and the non-target aspects is computed.
Preferably, the specific procedure for computing the inter-aspect attention is as follows: the sentence representation is first input to another gated recurrent unit, GRU2, with hidden-layer size D_0; the other aspects' hidden representations are obtained through GRU2. The target-aspect representation is converted into a query vector q using a fully connected layer. To obtain the correlation between the query vector and the other vectors, attention scores z_a are computed between q and the hidden representations and normalized as β = softmax(z_a), where β_i denotes the attention score between the target aspect and the other aspects. [The score formulas are rendered as images in the original publication.]
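The query-and-score step can be sketched as below. This is a hedged sketch: the fully connected layer producing q is stated in the text, while the dot-product score between q and the GRU2 hidden states is an assumption (the actual formula is an image in the original).

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def inter_aspect_attention(h_target, H_aspects, W_q, b_q):
    """Inter-aspect attention sketch: a fully connected layer turns the
    target-aspect representation into a query vector q; scores z_a against the
    other aspects' GRU2 hidden states are normalised as beta = softmax(z_a)."""
    q = np.tanh(h_target @ W_q + b_q)  # query vector of the target aspect
    z_a = H_aspects @ q                # one score per non-target aspect (assumed form)
    return q, softmax(z_a)

rng = np.random.default_rng(1)
q, beta = inter_aspect_attention(rng.normal(size=6),
                                 rng.normal(size=(4, 6)),
                                 rng.normal(size=(6, 6)), np.zeros(6))
```

Aspects with stronger correlation to the target thus receive larger β_i and contribute more to the target aspect's emotion judgment.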
the inter-aspect attention module based on the target aspect comprises the following specific processes:
will be provided with
Figure BDA0003585115820000057
Input to another hidden layer of size D0Gated cyclic unit GRU2In, and through GRU in other respects2The resulting hidden representation is
Figure BDA0003585115820000058
For object aspect representation
Figure BDA0003585115820000059
Converting the query vector q into a query vector q by using a full connection layer, and calculating attention between a target aspect and other aspects; the query vector is obtained as follows:
Figure BDA00035851158200000510
wherein
Figure BDA00035851158200000511
To obtain the correlation between the query vector and other vectors, the following method is adopted for calculation:
Figure BDA0003585115820000061
β=so ftm ax(za)
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0003585115820000062
βian attention score between the target aspect and the other aspects is represented.
Preferably, the sentence representation of step S4 is obtained as follows: the inter-aspect association matrix is obtained, and then the association matrix and the sentence representation with aspect features are used as the input of a graph convolution network, through which the sentence representation of the inter-aspect associations is obtained.
Preferably, the sentence representation of the inter-aspect associations is computed as follows:
First, an inter-aspect adjacency matrix is constructed to acquire the contextual correlation between aspects: an entry is 1 when one of its indices falls in the word set of the target aspect (whose starting position is p_t) and the other falls in the word set of another aspect, and 0 otherwise. The adjacency matrix is made undirected (symmetric). [The construction formula is rendered as an image in the original publication.]
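The indicator-style construction above can be sketched directly. This is a minimal illustration of the symmetric inter-aspect adjacency matrix; the word indices and example sentence are invented for illustration.

```python
def inter_aspect_adjacency(n_words, target_words, other_words):
    """Inter-aspect adjacency sketch: entry (i, j) is 1 when one index belongs
    to the target aspect's word set and the other to another aspect's word set;
    the matrix is kept undirected (symmetric), as in the text."""
    A = [[0] * n_words for _ in range(n_words)]
    for i in target_words:
        for j in other_words:
            A[i][j] = A[j][i] = 1
    return A

# "great food but the service was dreadful": target aspect "food" (index 1),
# other aspect "service" (index 4)
A = inter_aspect_adjacency(7, [1], [4])
```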
After the inter-aspect adjacency matrix is acquired, it is used, together with the hidden representation with aspect information obtained through GRU2, as the input of a graph convolution network to acquire the mutual emotional features of the specific aspect and the other aspects. The hidden representation of the i-th node at layer l of the graph convolution network is updated as:

h_i^(l) = ReLU( Σ_j A_ij · W^(l) h_j^(l-1) + b^(l) )

where A is the inter-aspect adjacency matrix, ReLU is the activation function, h_i^(l-1) is the hidden representation of the i-th node at layer l-1, and W^(l) and b^(l) are the trainable parameters of the l-th graph-convolution layer. The hidden representation finally obtained through the graph convolution network, carrying the inter-aspect association information, is denoted Q_Inter.
Preferably, the sentence representation of step S5, containing both the aspect features and the inter-aspect associations, is obtained as follows: the sentence representation containing the aspect features and the sentence representation of the inter-aspect associations are merged, and an inter-aspect association coefficient is introduced to control how much inter-aspect association information is introduced, finally yielding a sentence representation that contains both the aspect features and the inter-aspect associations;
meanwhile, the sentence expression containing the aspect characteristics and the association between the aspects is as follows:
the sentence representation with rich aspect information can be obtained by the above process
Figure BDA0003585115820000075
And sentence representation Q with inter-aspect associationsInter(ii) a To make full use of
Figure BDA0003585115820000076
And QInterThe two representations are combined to discover the mutual relationship between the two representations, and the combined sentence expression is shown as the following formula:
Figure BDA0003585115820000077
wherein the coefficient gamma e [0,1] represents how much of the characteristics between the introduction aspects.
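The fusion step can be sketched as below. The exact combination formula is an image in the original; a convex combination controlled by γ is assumed here, which matches the stated role of γ ∈ [0, 1].

```python
import numpy as np

def fuse(H_aspect, Q_inter, gamma):
    """Fusion sketch: merge the aspect-feature sentence representation with the
    inter-aspect representation Q_inter; gamma in [0, 1] controls how much
    inter-aspect information is introduced (convex combination assumed)."""
    assert 0.0 <= gamma <= 1.0
    return gamma * Q_inter + (1.0 - gamma) * H_aspect

H = np.ones((3, 4)); Q = np.zeros((3, 4))
F = fuse(H, Q, 0.25)   # keep 75% aspect features, add 25% inter-aspect features
```

Setting γ = 0 recovers the pure aspect-feature representation, while γ = 1 keeps only the inter-aspect associations; intermediate values trade the two off.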
Preferably, the final sentence representation of step S6 is obtained as follows: the sentence representation containing the aspect features and the inter-aspect associations is multiplied by the inter-aspect attention to control the influence of the different non-target aspects on the target aspect, yielding the final sentence representation; this is added to the query vector of the target aspect and input to a softmax layer to obtain the final representation for emotion analysis.
Preferably, the final representation for emotion analysis is obtained as follows:
First, the sentence representation Q_Inter with the inter-aspect associations is multiplied by the inter-aspect attention β to obtain the final output o of the sentence representation. The query vector q of the target aspect is then added to this output to generate a target-aspect sentence representation with rich inter-aspect association information, which is classified over the C classes by a softmax layer to obtain the final representation ρ for emotion analysis:

ρ = softmax((q + o) W_ρ + b_ρ)

where the weight matrix W_ρ and the bias b_ρ are trainable. After the final representation is obtained, the emotion polarity is predicted as:

ŷ = argmax(ρ)

where ŷ is the predicted emotion polarity.
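The classification head can be sketched directly from the formula above; the vector sizes and random parameters are illustrative.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(q, o, W_rho, b_rho):
    """Classification step: rho = softmax((q + o) W_rho + b_rho), then predict
    the polarity as the argmax over the C classes."""
    rho = softmax((q + o) @ W_rho + b_rho)
    return rho, int(np.argmax(rho))

rng = np.random.default_rng(2)
q = rng.normal(size=8)                    # target-aspect query vector
o = rng.normal(size=8)                    # attention-weighted sentence output
rho, y_hat = classify(q, o, rng.normal(size=(8, 3)), np.zeros(3))  # C = 3 polarities
```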
The technical principle and the beneficial effects of the invention are as follows:
(1) Through the grammar dependency types, the method gives more weight to the more important grammatical dependency relations, so that the model can focus on the grammatical features beneficial to the emotion analysis of the target aspect, enriching the target aspect's grammatical features.
(2) The sentence representations of the non-target aspects are further passed through a Bi-GRU so that, after the aspect information has been further propagated through the sentence representation, the target aspect is used as a query vector to compute the inter-aspect attention with the non-target aspects; in this way, the sentence representations of the non-target aspects with stronger emotional association to the target aspect can play a more important role.
(3) The invention introduces an inter-aspect association matrix and, after obtaining the sentence representation of the inter-aspect associations through graph convolution, introduces an inter-aspect association coefficient to control how much inter-aspect association information is introduced, so that the introduced inter-aspect information benefits the emotion analysis of the target aspect.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is an emotion analysis model based on associations between grammatical dependency types and aspects;
FIG. 2 is a specific aspect sentence representation with aspect information and grammar dependency types;
FIG. 3 is a construction of adjacency matrix and syntax-dependent type matrix.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are merely preferred embodiments of the present invention, rather than all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
As shown in FIG. 1, the present disclosure provides an emotion analysis method based on association between syntax dependency type and aspect, including:
S1, obtaining the text to be analyzed and converting it into word-vector representations through a pre-training model; specifically, the GloVe pre-training model is used to convert the text into word vectors that a computer can process;
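Step S1 can be sketched as below. This is a minimal illustration of loading GloVe-format vectors (one token followed by its floats per line) and mapping a tokenised sentence to a matrix of word vectors; the tiny inline "file" stands in for a real GloVe download, and out-of-vocabulary words get zero vectors, both illustrative choices.

```python
import io
import numpy as np

GLOVE_TXT = "the 0.1 0.2\nfood 0.3 0.4\ngreat 0.5 0.6\n"  # toy stand-in for glove.*.txt

def load_glove(fh):
    """Parse GloVe-format lines into a {token: vector} dict."""
    vecs = {}
    for line in fh:
        tok, *nums = line.split()
        vecs[tok] = np.array([float(x) for x in nums])
    return vecs

def embed(tokens, vecs, dim=2):
    """Map tokens to a (len(tokens), dim) matrix; OOV words get zeros."""
    return np.stack([vecs.get(t, np.zeros(dim)) for t in tokens])

vecs = load_glove(io.StringIO(GLOVE_TXT))
X = embed(["the", "food", "was", "great"], vecs)
```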
S2, concatenating the specific aspect with each word of the sentence representation so that the sentence representation focuses on the emotion of only one specific aspect at a time; introducing grammar dependency types and giving greater weight to the important dependency types in the grammar dependency tree; obtaining a sentence representation containing more grammatical information through a graph convolution network; and then refining the aspect-aware, dependency-type-aware sentence representation with an attention layer;
S3, using the sentence representation as the input of the model, with the target aspect as the query vector; for the non-target aspects, the aspect features are further propagated through the sentence representation by a Bi-GRU, and inter-aspect attention is then computed between the resulting sentence representation containing more aspect features and the query vector;
S4, introducing an inter-aspect association matrix and acquiring, through a graph convolution network, a sentence representation containing the inter-aspect associations;
S5, merging the sentence representation containing the inter-aspect associations with the sentence representation containing the aspect features, and using an inter-aspect association coefficient to control how much inter-aspect association information is introduced;
S6, after the sentence representation containing both the aspect information and the inter-aspect associations is obtained, using the inter-aspect attention to control the influence of the non-target aspects on the target aspect to obtain the final sentence representation, and judging the emotion polarity of the target aspect in combination with the query vector.
Further, the grammar dependency types are used to give greater weight to the important dependency types in the grammar dependency tree, specifically: a sentence representation containing aspect information is obtained with a Bi-GRU; grammar dependency types are then introduced and a sentence representation containing grammatical features is obtained through a graph convolution network; and an attention layer is adopted to expand the influence of the words that play an important role in judging the emotion of the specific aspect.
The inter-aspect attention module based on the target aspect is specifically as follows: the sentence representation that contains both the aspect awareness and the grammar dependency types is used as input and passed through a Bi-GRU, so that the aspect information is further propagated through the sentence representation to obtain a sentence representation containing more aspect features; the target aspect is used as a query vector to compute the attention between the target aspect and the non-target aspects.
The sentence representation of the inter-aspect associations is embodied as follows: the inter-aspect association matrix is obtained, and then the association matrix and the sentence representation with aspect features are used as the input of a graph convolution network, through which the sentence representation of the inter-aspect associations is obtained.
The sentence representation containing both the aspect features and the inter-aspect associations is specifically as follows: the sentence representation containing the aspect features and the sentence representation of the inter-aspect associations are merged, and an inter-aspect association coefficient is introduced to control how much inter-aspect association information is introduced, finally obtaining a sentence representation that contains both the aspect features and the inter-aspect associations.
The final representation used for emotion analysis is embodied as follows: the sentence representation containing the aspect features and the inter-aspect associations is multiplied by the inter-aspect attention to control the influence of the different non-target aspects on the target aspect, yielding the final sentence representation; this is added to the query vector of the target aspect and input to a softmax layer to obtain the final representation for emotion analysis.
Further, the specific process of obtaining the sentence representation containing aspect information is as follows: the aspect word a_i is embedded into the word-vector representation of the sentence, resulting in a word-vector representation that carries the aspect word; this representation is then passed through a Bi-GRU, denoted GRU1, to obtain its hidden representation.
Then, the graph convolution network focusing on grammar dependency types is constructed as follows. First, SpaCy is used to obtain the syntactic dependency information, which can be represented as a list of dependency triples (w_i, w_j, c_ij), where c_ij denotes the dependency type between word w_i and another word w_j. The dependency information is then represented by the adjacency matrix A, whose element a_ij denotes the relationship between w_i and w_j: a_ij = 1 if an edge connects them, and a_ij = 0 otherwise.
To make full use of the dependency types, each element c_ij of the dependency-type matrix is mapped to its word embedding. Since each edge in the graph contributes differently to the emotion, this embodiment combines the dependency type to give each edge a different weight, computed from the hidden states of the two connected words together with the dependency-type embedding. [The three weighting formulas are rendered as images in the original publication.]
Here h_i^(l-1) and h_j^(l-1) denote the hidden states of words w_i and w_j at layer (l-1); the initial hidden states h_i^(0) and h_j^(0) are the outputs of GRU1.
Finally, the output of word w_i at layer l of the graph convolution network is:

h_i^(l) = σ( Σ_j g_ij · W^(l) h_j^(l-1) + b^(l) )

where g_ij is the edge weight defined above, W^(l) and b^(l) are the trainable parameters of the l-th graph-convolution layer, and σ denotes the ReLU activation function. The above process incorporates the dependency types into the graph convolution network and uses them to give each edge a different weight, enabling the model to focus on the contextual information that is more important to the emotion analysis of the aspect word. The sentence representation with aspect information is obtained after this type-focused graph convolution network (TFGCN).
Then, to expand the influence of the words that play an important role in judging the emotion of a specific aspect a_i, an attention layer is introduced when forming the aspect-aware sentence representation: an attention score z is computed for each hidden state with trainable parameters W_s and a scalar bias b_s, the scores are normalized as α = softmax(z), and the hidden states are aggregated with the weights α. [The score formulas are rendered as images in the original publication.]
Further, the inter-aspect attention module based on the target aspect specifically includes:
First, the aspect-aware sentence representation is fed into another gated recurrent unit, GRU2, with hidden size D, so that the aspect information can propagate fully through the sentence representation and a representation richer in aspect information is obtained; the other aspects passed through GRU2 likewise yield hidden representations.
For the target-aspect representation, this embodiment uses a fully connected layer to convert it into a query vector q, which is used to compute the attention between the target aspect and the other aspects; the query-vector formula and its trainable parameters are given as images in the original filing.
To obtain the correlation between the query vector and the other vectors, relevance scores z_a are computed and normalized as

β = softmax(z_a)

where β_i represents the attention score between the target aspect and the other aspects, obtained through the attention mechanism; the other aspects that are strongly correlated with the target aspect receive larger weights and therefore play a more important role in judging the sentiment of the target aspect.
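The inter-aspect attention module can be sketched as below. The fully connected layer mapping the target aspect to the query vector q follows the text; the relevance score between q and each other aspect is taken here as a plain dot product, standing in for the formula that appears only as an image in the filing.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    return [e / sum(es) for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def inter_aspect_attention(target_rep, other_reps, Wq, bq):
    """Map the target-aspect representation to a query vector q with a
    fully connected layer, then attend over the other aspects."""
    # q = FC(target_rep): one output per row of Wq
    q = [dot(row, target_rep) + b for row, b in zip(Wq, bq)]
    z_a = [dot(q, r) for r in other_reps]  # relevance of each other aspect
    beta = softmax(z_a)                    # beta = softmax(z_a)
    return q, beta
```

Aspects more correlated with the target receive larger β_i, matching the weighting behaviour described above.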
Further, the sentence representation with inter-aspect associations is calculated as follows:
first, an adjacency matrix between aspects is constructed to obtain the contextual correlation between aspects. In its construction (given as a formula image in the original filing), one word set contains the words of the target aspect, p_t is the starting position of the target aspect, and another word set contains the words of the other aspects. To obtain richer dependency information from the input sentence, this embodiment constructs an undirected adjacency matrix, i.e. a symmetric one.
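The adjacency construction just described can be sketched as follows; representing each aspect by a (start, end) word-index span is an assumption of this sketch, and the matrix is made symmetric to match the undirected construction.

```python
def inter_aspect_adjacency(n, aspect_spans):
    """Link every word of each aspect to every word of the other
    aspects; words inside the same aspect are not linked.  Returns an
    undirected (symmetric) n x n 0/1 matrix.
    n: sentence length; aspect_spans: list of (start, end) word spans."""
    adj = [[0] * n for _ in range(n)]
    for si, s1 in enumerate(aspect_spans):
        for sj, s2 in enumerate(aspect_spans):
            if si == sj:
                continue                       # skip the aspect itself
            for i in range(s1[0], s1[1]):
                for j in range(s2[0], s2[1]):
                    adj[i][j] = adj[j][i] = 1  # undirected edge
    return adj
```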
After the inter-aspect adjacency matrix has been obtained, it is fed, together with the hidden representation with aspect information produced by GRU2, into a graph convolutional network to obtain the mutual sentiment features of the specific aspect and the other aspects. The hidden representation of the i-th node at layer l of the graph convolutional network is updated by the formula given as an image in the original filing, where ReLU is the activation function, the previous-layer term is the hidden representation of the i-th node at layer l−1, and W^(l) and b^(l) are the trainable parameters of the l-th graph-convolution layer.
The hidden representation with inter-aspect association information finally obtained through the graph convolutional network is denoted Q_Inter.
Further, the sentence representation containing both the aspect features and the inter-aspect associations is obtained as follows:
the process above yields a sentence representation rich in aspect information and the sentence representation Q_Inter with inter-aspect associations. To make full use of both, this section combines the two representations so as to exploit the correlation between them; the combined sentence representation is given by the formula shown in the original filing, in which the coefficient γ ∈ [0,1] indicates how much of the inter-aspect features is introduced.
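The combination step can be illustrated with a convex combination, which matches the stated role of γ ∈ [0, 1] as the amount of inter-aspect features introduced; the exact formula in the filing is given only as an image, so this particular weighting is an assumption of the sketch.

```python
def fuse_representations(q_aspect, q_inter, gamma):
    """Combine the aspect-feature representation with the inter-aspect
    representation Q_Inter; gamma in [0, 1] controls how much of the
    inter-aspect features is introduced (convex-combination sketch)."""
    assert 0.0 <= gamma <= 1.0
    return [(1.0 - gamma) * a + gamma * b
            for a, b in zip(q_aspect, q_inter)]
```

At γ = 0 the fused vector reduces to the pure aspect-feature representation; at γ = 1 it is entirely the inter-aspect representation.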
Further, the final representation for sentiment analysis is obtained as follows:
to introduce the inter-aspect associations into the subsequent sentiment analysis and model the sentence representation of the target aspect more accurately, the sentence representation Q_Inter with inter-aspect associations is first multiplied by the inter-aspect attention β to obtain the final output o of the sentence representation. The query vector q of the target aspect is then added to this output to generate a target-aspect sentence representation rich in inter-aspect association information, and a softmax classifier with C classes (C is generally 2 or 3 in sentiment analysis tasks; 3 in this embodiment) is used to obtain the final representation ρ for sentiment analysis, as shown in the following formula:

ρ = softmax((q + o)W_ρ + b_ρ)

where the weight matrix W_ρ and the bias b_ρ are trainable.
After the final representation is obtained, the sentiment polarity is predicted as

ŷ = argmax(ρ)

where ŷ is the predicted sentiment polarity.
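The classification step ρ = softmax((q + o)W_ρ + b_ρ) and the argmax prediction can be sketched directly; storing W_ρ as one column per class is an implementation choice of this sketch, not something the filing specifies.

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    return [e / sum(es) for e in es]

def predict_polarity(q, o, W_rho, b_rho):
    """rho = softmax((q + o) W_rho + b_rho); the predicted polarity is
    the argmax class (C = 3: positive / neutral / negative)."""
    x = [qi + oi for qi, oi in zip(q, o)]             # q + o
    logits = [sum(xi * wi for xi, wi in zip(x, col)) + b
              for col, b in zip(W_rho, b_rho)]         # one column per class
    rho = softmax(logits)
    return rho, max(range(len(rho)), key=rho.__getitem__)
```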
The proposed method solves the problem that traditional approaches cannot take the sentiment associations between different aspects into account. At the same time, by introducing grammatical dependency types, it can assign higher weights to the dependency types that matter most for judging the sentiment of the target aspect, effectively combining the inter-aspect sentiment information with the grammatical information and improving the accuracy and stability of aspect-level sentiment analysis across different data sets.
To verify the effectiveness of the model provided by this embodiment of the invention on the aspect-level sentiment analysis task, experiments were carried out on three public data sets: the REST14 and LAP14 data sets, together with the TWITTER data set built from a social networking site. Each data set is labeled in advance and split into a training set and a test set; the training data are used to fit an appropriate model, and the test set is used to judge the generalization ability of the trained model. Each data set contains three categories: positive, neutral, and negative.
Experimental parameter settings: the initial representations of sentences and aspect words are obtained with GloVe and BERT word vectors, whose input dimensions are 300 and 768 respectively. An early-stopping mechanism terminates training if the loss of the model does not decrease for 5 consecutive epochs. The weight and bias matrices are randomly initialized from the uniform distribution U(−0.01, 0.01), and the number of graph-convolution layers is set to 2. The remaining hyper-parameter settings are listed in Table 1.
TABLE 1 setting of hyper-parameters
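The early-stopping rule described in the parameter settings (terminate when the loss has not decreased for 5 consecutive epochs) can be sketched as:

```python
class EarlyStopper:
    """Stop training when the loss has not decreased for `patience`
    consecutive epochs (5 in the experiments above)."""
    def __init__(self, patience=5):
        self.patience = patience
        self.best = float("inf")
        self.bad_epochs = 0

    def should_stop(self, loss):
        """Feed one epoch's loss; returns True when training should end."""
        if loss < self.best:
            self.best = loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```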
Accuracy (Acc) and Macro-F1 (F1 for short), both widely used in aspect-level sentiment analysis, are adopted as the evaluation metrics of the model. Accuracy is the proportion of correctly predicted samples in the total number of samples; Macro-F1 takes the average of the per-class F1 values as the F1 value of the whole sample. The larger these metrics, the better the classification.
The accuracy Acc and the F1 value are calculated as follows:

Acc = (TP + TN) / (TP + TN + FP + FN)
P = TP / (TP + FP)
R = TP / (TP + FN)
F1 = 2 × P × R / (P + R)

where TN is the number of true negative samples, FN the number of false negatives, FP the number of false positives, and TP the number of true positives; P denotes precision and R denotes recall.
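The two metrics can be computed as follows (sketch; the class labels 0/1/2 stand for the three sentiment categories):

```python
def accuracy(y_true, y_pred):
    """Acc = correctly predicted samples / total samples."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_f1(y_true, y_pred, classes=(0, 1, 2)):
    """Macro-F1: per-class F1 = 2PR / (P + R), averaged over classes."""
    f1s = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)
```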
First, to verify the superiority of the proposed model, comparative experiments were performed on the three data sets; the results are shown in Tables 2 to 4.
TABLE 2 accuracy and F1 values for each model on the TWITTER dataset
TABLE 3 Accuracy (%) of the different models on REST14 and LAP14
TABLE 4 F1 values (%) of the different models on REST14 and LAP14
Note: TF-IAGCN is the model provided by the invention; bold figures indicate the best results.
Secondly, to further verify the superiority of the proposed model for sentiment analysis of sentences containing multiple aspects, the REST14 and LAP14 data sets were further divided according to whether a sentence contains a single aspect (SA) or multiple aspects (MA), as shown in Table 5. The accuracy of each model on the single-aspect and multi-aspect portions of the REST14 and LAP14 data sets is shown in Table 6.
TABLE 5 distribution of single and multiple aspects in LAP14 and REST14 for individual emotion categories
TABLE 6 accuracy (%) -of each model in different aspect numbers
Thirdly, to verify the effectiveness of the modules proposed in the invention, the following groups of ablation experiments were performed.
w/o TFGCN: TFGCN denotes the graph convolutional network focusing on syntax types; w/o TFGCN means that, when constructing the aspect-embedded sentence representation, the hidden representation obtained from the GRU is input directly to the attention layer to obtain the sentence representation embedding the given aspect.
w/o IAatt: IAatt denotes the inter-aspect attention; w/o IAatt means that the inter-aspect attention is not calculated, and the sentence representation containing the specific aspect and the sentence representation with inter-aspect associations are fused directly and concatenated with the query vector of the target aspect to predict the sentiment.
w/o IAGCN: IAGCN denotes the graph convolutional network with inter-aspect associations; w/o IAGCN means that the sentence representation with inter-aspect associations is not extracted through that network; only the sentence representation containing the specific aspect is used, and the final representation is generated with the inter-aspect attention.
The final results of the ablation experiments are shown in table 7.
Table 7 ablation experimental results (%)
This embodiment provides a sentiment analysis model based on grammatical dependency types and inter-aspect associations, and its superiority has been verified by the experiments above.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. An emotion analysis method based on association between a grammatical dependency type and an aspect, the method specifically comprising the steps of:
s1, acquiring a text to be analyzed, and converting the text into word vector representation through a pre-training model;
s2, the specific aspect is concatenated with each word of the sentence representation, so that the sentence representation focuses on the sentiment of only one specific aspect at a time; grammatical dependency types are introduced, and the important dependency types in the grammatical dependency tree are given larger weights; a sentence representation containing more grammatical information is obtained through a graph convolutional network; and an attention layer is then used to refine the aspect-aware sentence representation with dependency-type information;
s3, the sentence representation with the aspect perception and grammar dependency type is used as the input of the model, the target aspect is used as the query vector, the non-target aspect enables the aspect characteristics to be further propagated in the sentence representation through the Bi-GRU, and then the inter-aspect attention is calculated between the sentence representation containing more aspect characteristics and the query vector;
s4, introducing an inter-aspect correlation matrix, and acquiring a sentence representation containing inter-aspect correlation through a graph convolution network;
s5, merging the sentence expression containing the inter-aspect association and the sentence expression containing the aspect characteristics, and controlling the introduction amount of the inter-aspect association characteristics by using the inter-aspect association coefficient;
s6, after the sentence representation containing the aspect information and the association between the aspects is obtained, the influence of the non-target aspect on the target aspect is controlled by the attention between the aspects to obtain the final sentence representation, and the emotion polarity of the target aspect is judged by combining the query vector.
2. The method of claim 1, wherein step S2 gives larger weights to the important dependency types in the syntactic dependency tree as follows: a sentence representation containing aspect information is obtained with the Bi-GRU; grammatical dependency types are then introduced, and a sentence representation containing grammatical features is obtained through a graph convolutional network; and an attention layer is adopted to amplify the influence of the words that play an important role in judging the sentiment of the specific aspect.
3. The method of claim 2, wherein the emotion analysis method comprises the following steps: the specific process of sentence representation containing aspect information is as follows:
the aspect word a_i is embedded into the word vector representation of the sentence to obtain a word vector representation carrying the aspect word, which is passed through a Bi-GRU, denoted GRU1, to obtain its hidden representation;
the graph convolutional network focusing on grammatical dependency types is constructed as follows: first, SpaCy is used to obtain the syntactic dependency information, which can be represented as a list of dependency tuples (w_i, w_j, c_ij), where c_ij denotes the type of the dependency between the word w_i and another word w_j; the dependency information is then represented by an adjacency matrix A, in which the element a_ij describes the relationship between w_i and w_j: a_ij = 1 if they are connected by an edge, and a_ij = 0 otherwise;
each element c_ij of the dependency-type matrix is mapped to its word embedding, and each edge is given a different weight by the following formulas;
the weights are calculated by the formulas given as images in the original filing, in which the two hidden-state terms denote the hidden states of the words w_i and w_j at layer (l−1), and their initial values are the outputs of GRU1;
the final output of word w_i at layer l of the graph convolutional network is computed accordingly, where W^(l) and b^(l) are the trainable parameters of the l-th graph-convolution layer and σ is the ReLU activation function; a sentence representation with aspect information is obtained after the TFGCN;
when constructing the aspect-aware sentence representation, an attention layer is introduced to refine it; its normalization step is
α = softmax(z),
where the score z and the weighted representation are computed with trainable parameters and a scalar bias b_s.
4. The method of claim 2, wherein step S3 calculates the inter-aspect attention as follows: the sentence representation with aspect perception and grammatical dependency types is taken as input and passed through a Bi-GRU, which further propagates the aspect information within it to obtain a sentence representation containing more aspect features; the target aspect is taken as the query vector, and the attention between the target aspect and the non-target aspects is calculated.
5. The method of claim 4, wherein the inter-aspect attention is calculated as follows: first, the sentence representation is input into another gated recurrent unit, GRU2, with hidden size D; the other aspects passed through GRU2 yield hidden representations; the target-aspect representation is converted into a query vector q by a fully connected layer; to obtain the correlation between the query vector and the other vectors, relevance scores z_a are computed and normalized as
β = softmax(z_a)
where β_i represents the attention score between the target aspect and the other aspects.
6. The method of claim 4, wherein the sentence representation of step S4 is obtained as follows: the inter-aspect correlation matrix is obtained, and then the correlation matrix and the sentence representation with aspect features are used as the input of a graph convolutional network, through which the sentence representation with inter-aspect associations is obtained.
7. The method of claim 6, wherein the sentence representation with inter-aspect associations is calculated as follows:
an adjacency matrix between aspects is first constructed to obtain the contextual correlation between aspects; in its construction, one word set contains the words of the target aspect, p_t is the starting position of the target aspect, and another word set contains the words of the other aspects; the adjacency matrix is undirected, i.e. symmetric;
after the inter-aspect adjacency matrix has been obtained, it is fed, together with the hidden representation with aspect information produced by GRU2, into a graph convolutional network to obtain the mutual sentiment features of the specific aspect and the other aspects; the hidden representation of the i-th node at layer l of the graph convolutional network is updated from the hidden representation of the i-th node at layer l−1 using the ReLU activation function, with W^(l) and b^(l) the trainable parameters of the l-th graph-convolution layer;
the hidden representation with inter-aspect association information is finally obtained through the graph convolutional network.
8. The method of claim 6, wherein step S5 obtains the sentence representation containing both the aspect features and the inter-aspect associations as follows: the sentence representation containing the aspect features and the sentence representation with inter-aspect associations are combined, and an inter-aspect association coefficient is introduced to control how much inter-aspect association information is introduced, finally yielding a sentence representation that contains both;
specifically, the above process yields a sentence representation rich in aspect information and the sentence representation Q_Inter with inter-aspect associations; to make full use of both, the two representations are combined so as to exploit the correlation between them, with the coefficient γ ∈ [0,1] indicating how much of the inter-aspect features is introduced.
9. The method of claim 8, wherein the final sentence representation of step S6 is obtained as follows: the sentence representation containing the aspect features and the inter-aspect associations is multiplied by the inter-aspect attention to control the influence of the different non-target aspects on the target aspect, giving the final sentence representation; this is added to the query vector of the target aspect and input to a softmax layer to obtain the final representation for sentiment analysis.
10. The method of claim 9, wherein the final representation for sentiment analysis is obtained as follows:
first, the sentence representation Q_Inter with inter-aspect associations is multiplied by the inter-aspect attention β to obtain the final output o of the sentence representation; the query vector q of the target aspect is then added to this output to produce a target-aspect sentence representation rich in inter-aspect association information, and a softmax classifier with C classes is used to obtain the final representation ρ for sentiment analysis, as shown in the following formula:
ρ = softmax((q + o)W_ρ + b_ρ)
where the weight matrix W_ρ and the bias b_ρ are trainable; after the final representation is obtained, the sentiment polarity is predicted as
ŷ = argmax(ρ)
where ŷ is the predicted sentiment polarity.
CN202210373785.1A 2022-04-07 Emotion analysis method based on grammar dependence type and inter-aspect association Active CN114781352B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210373785.1A CN114781352B (en) 2022-04-07 Emotion analysis method based on grammar dependence type and inter-aspect association

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210373785.1A CN114781352B (en) 2022-04-07 Emotion analysis method based on grammar dependence type and inter-aspect association

Publications (2)

Publication Number Publication Date
CN114781352A true CN114781352A (en) 2022-07-22
CN114781352B CN114781352B (en) 2024-06-28


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115936077A (en) * 2022-12-30 2023-04-07 湖北工业大学 Dependency tree based aspect level emotion analysis interactive convolution network

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110866405A (en) * 2019-11-14 2020-03-06 电子科技大学 Statement information-based aspect level emotion classification method
US20200356724A1 (en) * 2019-05-06 2020-11-12 University Of Electronic Science And Technology Of China Multi-hop attention and depth model, method, storage medium and terminal for classification of target sentiments
CN111985205A (en) * 2020-08-05 2020-11-24 重庆大学 Aspect level emotion classification model
CN111985245A (en) * 2020-08-21 2020-11-24 江南大学 Attention cycle gating graph convolution network-based relation extraction method and system
CN112417157A (en) * 2020-12-15 2021-02-26 华南师范大学 Emotion classification method of text attribute words based on deep learning network
CN112528672A (en) * 2020-12-14 2021-03-19 北京邮电大学 Aspect-level emotion analysis method and device based on graph convolution neural network
US20210256355A1 (en) * 2020-02-13 2021-08-19 International Business Machines Corporation Evolving graph convolutional networks for dynamic graphs
CN113723075A (en) * 2021-08-28 2021-11-30 重庆理工大学 Specific target emotion analysis method for enhancing and counterlearning of fused word shielding data
KR20210156152A (en) * 2020-06-17 2021-12-24 주식회사 엔씨소프트 Method and apparatus for relation extraction between entities
US20220092267A1 (en) * 2020-09-23 2022-03-24 Jingdong Digits Technology Holding Co., Ltd. Method and system for aspect-level sentiment classification by graph diffusion transformer

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200356724A1 (en) * 2019-05-06 2020-11-12 University Of Electronic Science And Technology Of China Multi-hop attention and depth model, method, storage medium and terminal for classification of target sentiments
CN110866405A (en) * 2019-11-14 2020-03-06 电子科技大学 Statement information-based aspect level emotion classification method
US20210256355A1 (en) * 2020-02-13 2021-08-19 International Business Machines Corporation Evolving graph convolutional networks for dynamic graphs
KR20210156152A (en) * 2020-06-17 2021-12-24 주식회사 엔씨소프트 Method and apparatus for relation extraction between entities
CN111985205A (en) * 2020-08-05 2020-11-24 重庆大学 Aspect level emotion classification model
CN111985245A (en) * 2020-08-21 2020-11-24 江南大学 Attention cycle gating graph convolution network-based relation extraction method and system
US20220092267A1 (en) * 2020-09-23 2022-03-24 Jingdong Digits Technology Holding Co., Ltd. Method and system for aspect-level sentiment classification by graph diffusion transformer
CN112528672A (en) * 2020-12-14 2021-03-19 北京邮电大学 Aspect-level emotion analysis method and device based on graph convolution neural network
CN112417157A (en) * 2020-12-15 2021-02-26 华南师范大学 Emotion classification method of text attribute words based on deep learning network
CN113723075A (en) * 2021-08-28 2021-11-30 重庆理工大学 Specific target emotion analysis method for enhancing and counterlearning of fused word shielding data

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
WAQAR ALI et al.: "Aspect-level sentiment analysis based on bidirectional-GRU in SIoT", IEEE ACCESS, 6 May 2021 (2021-05-06), pages 69938 - 69950, XP011854365, DOI: 10.1109/ACCESS.2021.3078114 *
YING YANG et al.: "A joint model for aspect-category sentiment analysis with TextGCN and Bi-GRU", 2020 IEEE Fifth International Conference on Data Science in Cyberspace, 30 July 2020 (2020-07-30), pages 1 - 10 *
XIA Hongbin et al.: "Graph convolution over attention (ASGCN-AOA) model for aspect-specific sentiment analysis", Journal of Chinese Information Processing, vol. 36, no. 3, 15 March 2022 (2022-03-15), pages 1 - 8 *
WANG Jianing; HE Yi; ZHU Renyu; LIU Tingting; GAO Ming: "Relation extraction techniques based on distant supervision", Journal of East China Normal University (Natural Science), no. 05, 25 September 2020 (2020-09-25), pages 113 - 130 *
HU Chaoju; ZHAO Xiaowei: "Sentiment analysis based on word vectors and hybrid neural networks", Application Research of Computers, vol. 35, no. 12, 12 December 2017 (2017-12-12), pages 3556 - 3559 *
MA Xiang: "Research on aspect-level sentiment analysis of review texts based on deep learning", Wanfang Data, 31 May 2023 (2023-05-31), pages 1 - 30 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115936077A (en) * 2022-12-30 2023-04-07 湖北工业大学 Dependency tree based aspect level emotion analysis interactive convolution network
CN115936077B (en) * 2022-12-30 2023-09-15 湖北工业大学 Dependency tree-based aspect-level emotion analysis interactive convolution network

Similar Documents

Publication Publication Date Title
CN108363753B (en) Comment text emotion classification model training and emotion classification method, device and equipment
CN108319686B (en) Antagonism cross-media retrieval method based on limited text space
CN108536681B (en) Intelligent question-answering method, device, equipment and storage medium based on emotion analysis
CN109933664B (en) Fine-grained emotion analysis improvement method based on emotion word embedding
CN106503192B (en) Name entity recognition method and device based on artificial intelligence
Feng et al. Incorporating commonsense knowledge into abstractive dialogue summarization via heterogeneous graph networks
CN108984530A (en) A kind of detection method and detection system of network sensitive content
US11645479B1 (en) Method for AI language self-improvement agent using language modeling and tree search techniques
Abro et al. Natural language understanding for argumentative dialogue systems in the opinion building domain
CN108052625B (en) Entity fine classification method
CN109726745A (en) A kind of sensibility classification method based on target incorporating description knowledge
CN111538841B (en) Comment emotion analysis method, device and system based on knowledge mutual distillation
CN115860006A (en) Aspect level emotion prediction method and device based on semantic syntax
CN109284389A (en) A kind of information processing method of text data, device
CN105468731A (en) Preprocessing method of text sentiment analysis characteristic verification
CN114443846A (en) Classification method and device based on multi-level text abnormal composition and electronic equipment
CN111858919A (en) Text classification method and device and computer readable storage medium
Wu et al. Detecting malicious social robots with generative adversarial networks
CN116821307A (en) Content interaction method, device, electronic equipment and storage medium
WO2023245523A1 (en) Method and apparatus for generating training data
CN116757195A (en) Implicit emotion recognition method based on prompt learning
CN111368524A (en) Microblog viewpoint sentence recognition method based on self-attention bidirectional GRU and SVM
CN116257616A (en) Entity relation extraction method and system for music field
CN114781352A (en) Emotion analysis method based on association between grammar dependency type and aspect
Sriram et al. An enhanced approach for classifying emotions using customized decision tree algorithm

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination