CN115906861B - Sentence emotion analysis method and device based on interaction aspect information fusion - Google Patents

Sentence emotion analysis method and device based on interaction aspect information fusion

Info

Publication number
CN115906861B
Authority
CN
China
Prior art keywords: tested, sentence, statement, layer, feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211296582.3A
Other languages
Chinese (zh)
Other versions
CN115906861A (en)
Inventor
蔡倩华
陈秉良
张良均
薛云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Teddy Intelligent Technology Co ltd
South China Normal University
Original Assignee
Guangdong Teddy Intelligent Technology Co ltd
South China Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Teddy Intelligent Technology Co ltd and South China Normal University
Priority to CN202211296582.3A
Publication of CN115906861A
Application granted
Publication of CN115906861B
Legal status: Active

Landscapes

  • Machine Translation (AREA)

Abstract

The invention relates to the field of emotion analysis, in particular to a sentence emotion analysis method based on interaction aspect information fusion.

Description

Sentence emotion analysis method and device based on interaction aspect information fusion
Technical Field
The invention relates to the field of emotion analysis, in particular to a sentence emotion analysis method, device and equipment based on interaction aspect information fusion and a storage medium.
Background
In aspect-level sentiment classification tasks, the presence of multiple aspect words in a single sentence is a key difficulty.
At present, existing models treat the emotion classification of the different aspect words in a sentence as independent tasks: the influences among different aspect words are not considered and the importance of multi-aspect information is ignored, so that accurate emotion analysis of the sentence cannot be carried out.
Disclosure of Invention
Based on the above, the application aims to provide a sentence emotion analysis method, device, equipment and storage medium based on interaction aspect information fusion. By extending the neural network model to the relations among different aspect words and combining the information of the other aspect words in the sentence when predicting the emotion tendency of the current aspect word, the extraction of global information is improved and emotion analysis is performed on the sentence more comprehensively, thereby improving the accuracy and stability of sentence emotion analysis.
In a first aspect, an embodiment of the present application provides a sentence emotion analysis method based on interaction aspect information fusion, including the following steps:
obtaining a sentence to be tested, wherein the sentence to be tested comprises a plurality of words, the sentence to be tested is sent to a coding layer in a preset neural network model, the bidirectional features of the plurality of words of the sentence to be tested are obtained, and the bidirectional features of the plurality of words of the sentence to be tested are combined to obtain sentence bidirectional features of the sentence to be tested;
constructing a dependency syntax tree of the sentence to be tested, and constructing a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; inputting the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested;
obtaining syntactic distance data between different nodes of the dependency syntax tree of the sentence to be tested, constructing a second adjacency matrix of the sentence to be tested according to the syntactic distance data, and inputting the graph convolution features of the sentence to be tested and the second adjacency matrix into a hierarchical selection attention network layer in the neural network model to obtain the interaction aspect features of the sentence to be tested;
inputting the graph convolution features and the interaction aspect features of the sentence to be tested into an information fusion layer of the neural network model to obtain the fusion features of the sentence to be tested;
and inputting the fusion features of the sentence to be tested into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the sentence to be tested.
In a second aspect, an embodiment of the present application provides a sentence emotion analysis device based on interaction aspect information fusion, including:
the sentence bidirectional feature calculation module is used for obtaining a sentence to be tested, wherein the sentence to be tested comprises a plurality of words, sending the sentence to be tested to a coding layer in a preset neural network model to obtain the bidirectional features of the plurality of words of the sentence to be tested, and combining the bidirectional features of the plurality of words of the sentence to be tested to obtain the sentence bidirectional features of the sentence to be tested;
the graph convolution feature calculation module is used for constructing a dependency syntax tree of the sentence to be tested, and constructing a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; and inputting the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested;
the interaction aspect feature calculation module is used for acquiring the syntactic distance data between different nodes of the dependency syntax tree of the sentence to be tested, constructing a second adjacency matrix of the sentence to be tested according to the syntactic distance data, and inputting the graph convolution features of the sentence to be tested and the second adjacency matrix into the hierarchical selection attention network layer in the neural network model to acquire the interaction aspect features of the sentence to be tested;
the feature fusion module is used for inputting the graph convolution features and the interaction aspect features of the sentence to be tested into the information fusion layer of the neural network model to obtain the fusion features of the sentence to be tested;
and the emotion analysis module is used for inputting the fusion features of the sentence to be tested into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the sentence to be tested.
In a third aspect, an embodiment of the present application provides a computer apparatus, including: a processor, a memory, and a computer program stored on the memory and executable on the processor; the computer program when executed by the processor implements the steps of the sentence emotion analysis method based on interaction aspect information fusion as described in the first aspect.
In a fourth aspect, an embodiment of the present application provides a storage medium, where a computer program is stored, where the computer program implements the steps of the sentence emotion analysis method based on interaction aspect information fusion according to the first aspect when the computer program is executed by a processor.
According to the sentence emotion analysis method, device, equipment and storage medium based on interaction aspect information fusion, the neural network model is extended to the relations among different aspect words and the information of the other aspect words in the sentence is combined when predicting the emotion tendency of the current aspect word, so that the extraction of global information is improved, emotion analysis is performed on the sentence more comprehensively, and the accuracy and stability of sentence emotion analysis are thereby improved.
For a better understanding and implementation, the present application is described in detail below with reference to the drawings.
Drawings
FIG. 1 is a schematic flow chart of a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of S1 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of S2 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 4 is a further schematic flow chart of S2 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of S3 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of S4 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 7 is a schematic flow chart of S5 in a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a sentence emotion analysis device based on interaction aspect information fusion according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, these information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to a determination", depending on the context.
Referring to fig. 1, fig. 1 is a schematic flow chart of a sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, where the method includes the following steps:
S1: obtaining a sentence to be tested, sending the sentence to be tested to a coding layer in a preset neural network model to obtain the bidirectional features of a plurality of words of the sentence to be tested, and combining the bidirectional features of the plurality of words of the sentence to be tested to obtain the sentence bidirectional features of the sentence to be tested.
The execution body of the sentence emotion analysis method based on interaction aspect information fusion is an analysis device for this method (hereinafter referred to as the analysis device); in an optional embodiment, the analysis device may be a computer device such as a server, or a server cluster formed by combining multiple computer devices.
The sentence to be tested comprises a plurality of words, where a word describes some entity in the sentence and may be a noun, an adjective, or the like; the words include context words and aspect words.
In this embodiment, the analysis device may obtain a sentence to be tested input by a user, send the sentence to be tested to a coding layer in a preset neural network model, obtain bidirectional features of a plurality of words of the sentence to be tested, and combine the bidirectional features of the plurality of words of the sentence to be tested to obtain the sentence bidirectional features of the sentence to be tested.
Referring to fig. 2, fig. 2 is a schematic flow chart of step S1 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, including steps S11 to S12, specifically as follows:
s11: inputting the statement to be tested into a word embedding module in the coding layer to obtain a word embedding matrix of the statement to be tested, and splicing the word embedding matrix with a preset tag matrix to obtain an input matrix of the statement to be tested.
The word embedding matrix comprises the word embedding vectors of all words in the sentence to be tested, and the tag matrix comprises the tag embedding vectors corresponding to all words in the sentence to be tested;
the sentence representation of the sentence to be tested is expressed as:
$S = \{w^c_1, w^c_2, \ldots, w^c_n, w^a_1, w^a_2, \ldots, w^a_m\}$
where $S$ is the sentence representation of the sentence to be tested, consisting of the vectors $w^c_i$ corresponding to the $n$ context words and the vectors $w^a_j$ corresponding to the $m$ aspect words.
The word embedding module can adopt a BERT (Bidirectional Encoder Representation from Transformers) model or a GloVe model. In this embodiment, the analysis device inputs the sentence to be tested into the word embedding module in the coding layer, so as to obtain the word embedding matrix of the sentence to be tested.
To supplement syntactic information, the analysis device randomly initializes part-of-speech (POS) tag embeddings of a preset dimension as the preset tag matrix, and splices them with the word embedding matrix to obtain the input matrix of the sentence to be tested.
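The splicing of word embeddings with randomly initialized POS tag embeddings can be sketched as follows; this is an illustrative example only, not the patent's reference implementation, and the vocabulary size, tag set size and embedding dimensions are hypothetical placeholders.

```python
import torch
import torch.nn as nn

class InputEncoder(nn.Module):
    """Builds the input matrix by splicing word embeddings with POS tag embeddings."""
    def __init__(self, vocab_size=30000, word_dim=300, num_pos_tags=45, pos_dim=30):
        super().__init__()
        # Word embedding module (a pretrained GloVe or BERT table could back this layer).
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Randomly initialized POS tag embeddings supply the supplementary syntactic information.
        self.pos_emb = nn.Embedding(num_pos_tags, pos_dim)

    def forward(self, word_ids, pos_ids):
        # word_ids, pos_ids: (batch, sentence_length) integer index tensors
        words = self.word_emb(word_ids)          # (batch, len, word_dim)
        tags = self.pos_emb(pos_ids)             # (batch, len, pos_dim)
        # Splice (concatenate) along the feature dimension to form the input matrix.
        return torch.cat([words, tags], dim=-1)  # (batch, len, word_dim + pos_dim)
```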
S12: inputting the input matrix of the sentence to be tested into a semantic information coding module in the coding layer for coding, obtaining the forward features and the backward features of each word in the sentence to be tested, and splicing to obtain the bidirectional features of a plurality of words in the sentence to be tested.
In order to acquire the contextual information of the sentence to be tested, in this embodiment the analysis device inputs the input matrix of the sentence to be tested into the semantic information coding module in the coding layer, encodes it through a preset coding function, and respectively acquires the forward feature and the backward feature of each word in the sentence to be tested, so as to capture the feature information of the sentence in different reading directions, specifically as follows:
$\overrightarrow{h_i} = \overrightarrow{f}(x_i), \quad \overleftarrow{h_i} = \overleftarrow{f}(x_i)$
where $\overrightarrow{h_i}$ is the forward feature, $\overleftarrow{h_i}$ is the backward feature, and $f(\cdot)$ is the coding function;
splicing the forward feature and the backward feature of each word in the sentence to be tested to obtain the bidirectional features of the plurality of words of the sentence to be tested, as follows:
$h_i = [\,\overrightarrow{h_i}\,;\,\overleftarrow{h_i}\,]$
where $h_i$ is the bidirectional feature.
Combining the bidirectional features of the plurality of words of the sentence to be tested to obtain the sentence bidirectional features of the sentence to be tested, as follows:
$H = [h_1, h_2, \ldots, h_{n+m}]$
where $H$ is the sentence bidirectional feature of the sentence to be tested.
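A minimal sketch of the semantic information coding module, assuming a bidirectional LSTM as the coding function (the patent only specifies a generic coding function applied in both reading directions); the hidden size is a placeholder.

```python
import torch
import torch.nn as nn

class SemanticEncoder(nn.Module):
    """Encodes the input matrix into forward/backward features and splices them per word."""
    def __init__(self, input_dim=330, hidden_dim=150):
        super().__init__()
        # bidirectional=True runs one pass forward and one pass backward over the words.
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True, bidirectional=True)

    def forward(self, x):
        # x: (batch, sentence_length, input_dim) input matrix from the word embedding module
        H, _ = self.rnn(x)
        # H splices the forward and backward feature of every word, i.e. the sentence
        # bidirectional feature of shape (batch, sentence_length, 2 * hidden_dim).
        return H
```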
S2: constructing a dependency syntax tree of the sentence to be tested, and constructing a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; and inputting the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested.
In this embodiment, the analysis device constructs a dependency syntax tree of the sentence to be tested, and constructs a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; it then inputs the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested.
Referring to fig. 3, fig. 3 is a schematic flow chart of step S2 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, including steps S21 to S24, specifically as follows:
S21: acquiring an initial dependency syntax tree, respectively setting each word in the statement to be tested at each node of the initial dependency syntax tree, acquiring dependency relationship types among different nodes in the dependency syntax tree of the statement to be tested, and constructing the dependency syntax tree of the statement to be tested;
the initial dependency syntax tree includes a plurality of nodes, and in this embodiment, the analysis device acquires the initial dependency syntax tree, and sets each word in the statement to be tested at each node of the initial dependency syntax tree.
In order to obtain the dependency relationship types among different nodes in the dependency syntax tree of the sentence to be tested, the analysis device establishes a dependency relationship embedding table $E_{dep} \in \mathbb{R}^{q \times d_{dep}}$ according to all the dependency relationship types in a preset corpus, and constructs the dependency syntax tree of the sentence to be tested, where $q$ is the number of dependency relationship types and $d_{dep}$ is the embedding dimension of the dependency relationship types;
s22: obtaining dependency relation embedded vectors among different nodes in the dependency syntax tree of the statement to be tested according to the dependency relation type, and obtaining bidirectional features of words corresponding to all nodes in the dependency syntax tree of the statement to be tested according to the sentence bidirectional features of the statement to be tested;
In this embodiment, according to the dependency relationship types, the analysis device obtains the dependency relation embedding vectors between different nodes in the dependency syntax tree of the sentence to be tested by randomly initializing the dependency relationship embedding table and learning it in subsequent training, and obtains the bidirectional features of the word corresponding to each node in the dependency syntax tree of the sentence to be tested from the sentence bidirectional features of the sentence to be tested.
S23: according to the dependency relation embedding vectors among different nodes in the dependency syntax tree of the statement to be tested, the bidirectional features of words corresponding to the nodes and a preset attention score calculation algorithm, attention scores among different nodes of the dependency syntax tree of the statement to be tested are calculated.
The attention score calculation algorithm is defined in terms of the following quantities:
$\beta_{ij}$ is the attention score between the $i$-th node and the $j$-th node of the dependency syntax tree, $\mathrm{LeakyReLU}(\cdot)$ is the LeakyReLU function, $h_i$ is the bidirectional feature of the word corresponding to the $i$-th node, $W_1$ is the first training parameter, $\cdot$ is the dot product symbol, and $r_{ij}$ is the dependency relation embedding vector between the $i$-th node and the $j$-th node;
Since there is no self-loop dependency relationship in an actual dependency syntax tree, in this embodiment the analysis device, using the dependency relation embedding vectors between different nodes in the dependency syntax tree of the sentence to be tested and the bidirectional features of the words corresponding to the respective nodes, considers both the information of the two nodes connected by each edge of the dependency syntax tree and the relation information of that edge when computing attention, calculates the attention score between different nodes of the dependency syntax tree of the sentence to be tested according to the preset attention score calculation algorithm, and obtains the scalar attention score of an edge through the nonlinear function and dimension-wise summation. By setting a special dependency relationship, "self-correlation", as the dependency relationship of self-connected (self-loop) nodes, the attention scores among different nodes of the dependency syntax tree of the sentence to be tested are calculated, and the syntactic information of the sentence to be tested is better extracted, so that emotion analysis of the sentence to be tested can be carried out more accurately.
S24: and obtaining the attention probability among the different nodes of the dependency syntax tree of the statement to be tested according to the attention scores among the different nodes of the dependency syntax tree of the statement to be tested and a preset normalization algorithm, and constructing a first adjacency matrix of the statement to be tested.
The normalization algorithm is as follows:
$\alpha_{ij} = \dfrac{\exp(\beta_{ij})}{\sum_{x \in N_i} \exp(\beta_{ix})}$
where $\alpha_{ij}$ is the attention probability between the $i$-th node and the $j$-th node of the dependency syntax tree, $N_i$ is the set of nodes connected to the $i$-th node, and $x$ denotes the $x$-th node connected to the $i$-th node.
In this embodiment, the analysis device obtains the attention probability between different nodes of the dependency syntax tree of the statement to be tested according to the attention score between different nodes of the dependency syntax tree of the statement to be tested and a preset normalization algorithm, and constructs a first adjacency matrix of the statement to be tested.
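The scoring expression itself is only given as an image in the source, so the sketch below is a hypothetical instantiation of steps S23–S24: it assumes the score is a LeakyReLU of the dot product between a linearly transformed node feature and the relation embedding, followed by the softmax normalization described above. The names first_adjacency, W1 and rel_emb are illustrative only.

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    return np.where(x > 0, x, slope * x)

def softmax(scores):
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

def first_adjacency(H, rel_emb, neighbours, W1):
    """H: (n, d) bidirectional word features; rel_emb[i][j]: relation embedding between
    nodes i and j (a special self-loop relation is used when i == j); neighbours[i]: nodes
    linked to node i in the dependency syntax tree; W1: (d_rel, d) training parameter."""
    n = H.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        # Assumed scoring: LeakyReLU of the dot product between W1·h_i and r_ij.
        scores = np.array([float(leaky_relu(np.dot(W1 @ H[i], rel_emb[i][j])))
                           for j in neighbours[i]])
        probs = softmax(scores)            # normalize over the neighbours of node i
        for p, j in zip(probs, neighbours[i]):
            A[i, j] = p                    # attention probability alpha_ij
    return A
```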
In an optional embodiment, the graph convolutional network layer is a multi-layer graph convolutional network layer. Referring to fig. 4, fig. 4 is a schematic flow chart of S2 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, and further includes step S25, specifically as follows:
S25: and obtaining the graph convolution characteristics of the sentence to be tested, which are output by each layer of the multi-layer graph convolution network layer, according to the sentence bidirectional characteristics of the sentence to be tested, the first adjacency matrix and a preset first graph convolution characteristic calculation algorithm.
The number of layers of the graph convolutional network plays a role similar to that of the convolution window size combined with the network depth in a traditional convolutional neural network, and is a determining factor of the size of the receptive field over the observed entity in the neural network. In the model training process, as the receptive field increases, information distortion can be caused by the nonlinear superposition of information while stacking the layers of the graph convolutional network.
In order to reduce the influence of such information distortion, in this embodiment the analysis device obtains the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer according to the sentence bidirectional features of the sentence to be tested, the first adjacency matrix and a preset first graph convolution feature calculation algorithm, where the first graph convolution feature calculation algorithm is:
$H^{(k)} = f\big(A, H^{(k-1)}\big), \quad H^{(0)} = H$
where $H^{(k)}$ is the graph convolution feature of the sentence to be tested output by the $k$-th layer of the multi-layer graph convolutional network layer, $H$ is the sentence bidirectional feature of the sentence to be tested, $A$ is the first adjacency matrix of the sentence to be tested, $f(\cdot)$ is the first graph convolution feature calculation function, and $k$ indicates the layer index of the multi-layer graph convolutional network layer.
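The first graph convolution feature calculation function f is only specified by an image; the sketch below assumes the common propagation rule ReLU(A · H · W) per layer purely for illustration, with the per-layer weight matrices treated as hypothetical parameters.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def multi_layer_gcn(H, A, weights):
    """H: (n, d) sentence bidirectional feature; A: (n, n) first adjacency matrix;
    weights: list of (d, d) matrices, one per graph convolutional layer.
    Returns the graph convolution features output by every layer."""
    layer_outputs = []
    H_k = H
    for W_k in weights:
        # Assumed propagation: aggregate neighbour features through A, then transform.
        H_k = relu(A @ H_k @ W_k)
        layer_outputs.append(H_k)
    return layer_outputs
```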
S3: obtaining syntactic distance data between different nodes of the dependency syntax tree of the sentence to be tested, constructing a second adjacency matrix of the sentence to be tested according to the syntactic distance data, inputting the graph convolution features of the sentence to be tested and the second adjacency matrix into a hierarchical selection attention network layer in the neural network model, and obtaining the interaction aspect features of the sentence to be tested;
Based on people's habit of reading sequentially, the degree of influence of other aspect words on the current aspect word is determined by the positional relation among the aspect words; the closer the positions of two aspect words are, the greater the influence on the current aspect word should be. Therefore, in this embodiment, the analysis device obtains the syntactic distance data between different nodes of the dependency syntax tree of the sentence to be tested, and constructs a second adjacency matrix of the sentence to be tested according to the syntactic distance data, where the second adjacency matrix is specifically defined as follows:
the entry for the $i$-th node and the $j$-th node of the dependency syntax tree is the syntactic distance data $D_{ij}$, where $p$ is the position index of the aspect word and $w_j$ is the word of the $j$-th node; the value of the entry depends on whether $w_j$ is the current aspect word, another aspect word or a context word, and $len$ is the sentence length of the sentence to be tested.
And inputting the graph convolution characteristics of the statement to be tested and the second adjacency matrix into a hierarchical selection attention network layer in the neural network model to obtain the interactive aspect characteristics of the statement to be tested.
Referring to fig. 5, fig. 5 is a schematic flow chart of step S3 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, including steps S31 to S34, specifically as follows:
s31: inputting the graph convolution characteristics of the to-be-tested sentences output by each layer of the multi-layer graph convolution network layer into a hierarchical selection attention network layer in the neural network model, and carrying out mask processing and average pooling processing on the graph convolution characteristics of the to-be-tested sentences output by each layer of the multi-layer graph convolution network layer to obtain first average pooling characteristics corresponding to a plurality of layers of the to-be-tested sentences.
In order to select information of different levels in the multi-layer graph convolutional network layer and select aspect level information of interaction level, mask processing and average pooling processing are required to be carried out on graph convolution characteristics of the to-be-tested statement output by each layer of the multi-layer graph convolutional network layer.
In this embodiment, the analysis device inputs the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer into the hierarchical selection attention network layer in the neural network model, and performs mask processing and average pooling processing on the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer to obtain the first average pooling features corresponding to the plurality of layers of the sentence to be tested, specifically as follows:
$M_t = \begin{cases} 1, & \tau \le t < \tau + m \\ 0, & \text{otherwise} \end{cases}, \qquad H_a^{(k)} = \mathrm{AveragePooling}\big(M \odot H^{(k)}\big)$
where $M_t$ is the mask parameter, $\tau$ is the starting position of the aspect word, $m$ is the number of aspect words, $n$ is the number of words, $t$ is the position index of a word and $h_t^{(k)}$ denotes the feature of the $t$-th word, $H_a^{(k)}$ is the first average pooling feature corresponding to the $k$-th layer for the sentence to be tested, and $\mathrm{AveragePooling}(\cdot)$ is the average pooling function.
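A minimal numeric sketch of this masking and average-pooling step for the output of one layer, assuming the average is taken over the aspect-word positions only.

```python
import numpy as np

def aspect_average_pool(H_k, tau, m):
    """H_k: (n, d) graph convolution features of one GCN layer; tau: starting position of
    the aspect word; m: number of aspect words. Returns the first average pooling feature."""
    n = H_k.shape[0]
    mask = np.zeros(n)
    mask[tau:tau + m] = 1.0              # M_t = 1 only inside the aspect span
    masked = H_k * mask[:, None]         # zero out the context-word positions
    return masked.sum(axis=0) / m        # average pooling over the aspect words
```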
S32: and according to a preset splicing algorithm, carrying out splicing processing on the first average pooling characteristics corresponding to the layers of the statement to be detected, and obtaining the hierarchical splicing characteristics of the statement to be detected.
The splicing algorithm is as follows:
$H_{cat} = \mathrm{Concat}\big(H_a^{(1)}, H_a^{(2)}, \ldots, H_a^{(K)}\big)$
where $H_{cat}$ is the hierarchical splicing feature of the sentence to be tested, $K$ is the number of layers of the multi-layer graph convolutional network layer, and $\mathrm{Concat}(\cdot)$ is the splicing function;
in this embodiment, the analysis device performs a stitching process on the first average pooling feature corresponding to the plurality of layers of the statement to be tested according to a preset stitching algorithm, so as to obtain a hierarchical stitching feature of the statement to be tested.
S33: multiplying the hierarchical splicing feature of the sentence to be tested by a preset second training parameter to obtain the attention parameter of the sentence to be tested, and obtaining the hierarchy selection feature of the sentence to be tested according to the attention parameter of the sentence to be tested, the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer, and a preset hierarchy selection algorithm.
In the hierarchy selection algorithm, $H_{sel}$ is the hierarchy selection feature of the sentence to be tested, $a$ is the attention parameter of the sentence to be tested, and $W_3$ is the preset third training parameter;
in this embodiment, the analysis device multiplies the hierarchical concatenation feature of the to-be-detected sentence by a preset second training parameter to obtain an attention parameter of the to-be-detected sentence, and obtains the hierarchical selection feature of the to-be-detected sentence according to the attention parameter of the to-be-detected sentence, the graph convolution feature of the to-be-detected sentence output by each layer of the multi-layer graph convolution network layer, and a preset hierarchical selection algorithm.
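The selection expression itself is only available as an image, so the following is one possible reading of step S33, stated as an assumption: the attention parameter is turned into one weight per layer with a softmax, and the per-layer graph convolution features are combined as a weighted sum after a linear transform; W2, W3 and the shapes are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def hierarchy_select(pooled, layer_outputs, W2, W3):
    """pooled: list of K first average pooling features, each (d,); layer_outputs: list of K
    per-layer graph convolution features, each (n, d); W2: (K, K*d) second training parameter;
    W3: (d, d) third training parameter. Returns the hierarchy selection feature (n, d)."""
    H_cat = np.concatenate(pooled)             # hierarchical splicing feature, shape (K*d,)
    a = softmax(W2 @ H_cat)                    # attention parameter -> one weight per layer
    # Assumed combination: attention-weighted sum of the transformed per-layer outputs.
    return sum(w * (H_k @ W3) for w, H_k in zip(a, layer_outputs))
```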
S34: and obtaining the interaction aspect features of the sentence to be tested according to the hierarchy selection feature of the sentence to be tested, the second adjacency matrix and a preset second graph convolution feature calculation algorithm.
The second graph convolution feature calculation algorithm is as follows:
$H_{int} = f_2\big(A^{(2)}, H_{sel}\big)$
where $H_{int}$ is the interaction aspect feature of the sentence to be tested output by the multi-layer graph convolutional network layer, $A^{(2)}$ is the second adjacency matrix of the sentence to be tested, and $f_2(\cdot)$ is the second graph convolution feature calculation function.
In this embodiment, the analysis device obtains the interaction aspect features of the sentence to be tested according to the hierarchy selection feature of the sentence to be tested, the second adjacency matrix and the preset second graph convolution feature calculation algorithm; the current aspect word only needs to exchange information with the other aspect words once, which prevents the receptive field of the graph convolutional network from expanding further and avoids interfering with the interaction of aspect information, thereby improving the accuracy of emotion analysis of the sentence to be tested.
S4: and inputting the graph convolution characteristics and the interaction aspect characteristics of the statement to be tested into an information fusion layer of the neural network model to obtain fusion characteristics of the statement to be tested.
In this embodiment, the analysis device inputs the graph convolution feature and the interaction feature of the statement to be tested into the information fusion layer of the neural network model, so as to obtain the fusion feature of the statement to be tested.
Referring to fig. 6, fig. 6 is a schematic flow chart of step S4 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, including steps S41 to S42, specifically as follows:
s41: and obtaining initial fusion characteristics of the statement to be tested according to the graph convolution characteristics, the interaction aspect characteristics and a preset characteristic fusion algorithm of the statement to be tested.
In order to retain the syntactic information and the interaction aspect information of the sentence to be tested to the greatest extent, in this embodiment the analysis device obtains the initial fusion feature of the sentence to be tested according to the graph convolution features and the interaction aspect features of the sentence to be tested and a preset feature fusion algorithm; in the feature fusion algorithm, $H_{fus}$ is the initial fusion feature of the sentence to be tested and $\gamma$ is a preset hyperparameter;
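The fusion formula is only available as an image; a simple γ-weighted combination is sketched below purely as an assumed placeholder for the feature fusion algorithm, chosen so that both the syntactic information and the interaction aspect information are retained.

```python
import numpy as np

def fuse_features(H_gcn, H_int, gamma=0.5):
    """H_gcn: graph convolution features of the last GCN layer; H_int: interaction aspect
    features; gamma: preset hyperparameter. Assumed fusion: a gamma-weighted sum."""
    return gamma * H_gcn + (1.0 - gamma) * H_int
```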
s5: and inputting the fusion characteristics of the statement to be tested into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the statement to be tested.
In this embodiment, the analysis device inputs the fusion feature of the statement to be detected into the emotion analysis layer of the neural network model, and obtains an emotion analysis result of the statement to be detected.
In an alternative embodiment, the emotion analysis layer includes a pooling layer and an activation layer. Referring to fig. 7, fig. 7 is a schematic flow chart of step S5 in the sentence emotion analysis method based on interaction aspect information fusion according to an embodiment of the present application, including steps S51 to S52, specifically as follows:
s51: and inputting the fusion characteristics of the statement to be tested into the pooling layer for mask processing and average pooling processing, and obtaining second average pooling characteristics of the statement to be tested.
In order to further improve the accuracy of emotion analysis, in this embodiment the analysis device performs mask processing and average pooling processing on the fusion features of the sentence to be tested to obtain the second average pooling feature of the sentence to be tested, specifically as follows:
$H_p = \mathrm{AveragePooling}\big(\mathrm{mask}(H_{fus})\big)$
where $H_p$ is the second average pooling feature of the sentence to be tested.
S52: inputting the second average pooling feature of the sentence to be tested into the activation layer, obtaining emotion classification polarity probability distribution vectors according to the second average pooling feature of the sentence to be tested and a preset emotion analysis algorithm, obtaining emotion polarities corresponding to the dimension with the maximum probability according to the emotion classification polarity probability distribution vectors, and taking the emotion polarities as emotion analysis results of the sentence to be tested.
In this embodiment, the analysis device inputs the second average pooling feature of the sentence to be tested into the activation layer, and obtains the emotion classification polarity probability distribution vector according to the second average pooling feature of the sentence to be tested and a preset emotion analysis algorithm, where the emotion analysis algorithm is:
$y = \mathrm{softmax}\big(W_4 H_p + b\big)$
where $y$ is the emotion classification polarity probability distribution vector, $W_4$ is the preset fourth trainable parameter, and $b$ is the preset bias parameter.
According to the emotion classification polarity probability distribution vector, the emotion polarity corresponding to the dimension with the largest probability is obtained as the emotion analysis result of the sentence to be tested, where the emotion polarities include positive, neutral and negative. Specifically, when $y = [\,y_{\text{positive}}, y_{\text{negative}}, y_{\text{neutral}}\,] = [0.1, 0.7, 0.2]$ is calculated, the maximum probability is $y_{\text{negative}}$, so the emotion polarity corresponding to the dimension with the largest probability is negative, which is taken as the emotion analysis result of the sentence to be tested.
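A sketch of the emotion analysis layer, assuming the standard linear-plus-softmax form for the preset emotion analysis algorithm and reusing the aspect-span average pooling from above; W4, b and the label order are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def classify_sentiment(H_fus, tau, m, W4, b,
                       labels=("positive", "negative", "neutral")):
    """H_fus: (n, d) fusion features; tau, m: aspect span; W4: (num_classes, d) trainable
    parameter; b: (num_classes,) bias. Returns the probability vector and the polarity."""
    H_p = H_fus[tau:tau + m].mean(axis=0)   # second average pooling over the aspect span
    y = softmax(W4 @ H_p + b)               # emotion classification probability distribution
    return y, labels[int(np.argmax(y))]     # polarity of the dimension with maximum probability
```

With y = [0.1, 0.7, 0.2] this returns "negative", matching the example in the paragraph above.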
Referring to fig. 8, fig. 8 is a schematic structural diagram of a sentence emotion analysis device based on interaction aspect information fusion according to an embodiment of the present application, where all or part of the sentence emotion analysis device based on interaction aspect information fusion may be implemented through software, hardware or a combination of both, and the device 8 includes:
A sentence bidirectional feature calculation module 81, configured to obtain a sentence to be tested, where the sentence to be tested includes a plurality of words, send the sentence to be tested to a coding layer in a preset neural network model, obtain bidirectional features of the plurality of words of the sentence to be tested, and combine the bidirectional features of the plurality of words of the sentence to be tested to obtain sentence bidirectional features of the sentence to be tested;
a graph convolution feature calculation module 82, configured to construct a dependency syntax tree of the sentence to be tested, and construct a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; and to input the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested;
the interaction aspect feature calculation module 83 is configured to obtain syntax distance data between different nodes of the dependency syntax tree of the statement to be tested, construct a second adjacency matrix of the statement to be tested according to the syntax distance data, input the graph convolution feature of the statement to be tested and the second adjacency matrix to a hierarchical selection attention network layer in the neural network model, and obtain the interaction aspect feature of the statement to be tested;
The feature fusion module 84 is configured to input the graph convolution feature and the interaction aspect feature of the to-be-detected sentence into an information fusion layer of the neural network model, so as to obtain a fusion feature of the to-be-detected sentence;
and the emotion analysis module 85 is used for inputting the fusion characteristics of the statement to be detected into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the statement to be detected.
In this embodiment, a sentence to be tested is obtained through the sentence bidirectional feature calculation module; the sentence to be tested includes a plurality of words, the sentence to be tested is sent to a coding layer in a preset neural network model to obtain the bidirectional features of the plurality of words of the sentence to be tested, and the bidirectional features of the plurality of words of the sentence to be tested are combined to obtain the sentence bidirectional features of the sentence to be tested. A dependency syntax tree of the sentence to be tested is constructed through the graph convolution feature calculation module, and a first adjacency matrix of the sentence to be tested is constructed according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; the sentence bidirectional features of the sentence to be tested and the first adjacency matrix are input into a graph convolutional network layer in the neural network model to obtain the graph convolution features of the sentence to be tested. Syntactic distance data between different nodes of the dependency syntax tree of the sentence to be tested are obtained through the interaction aspect feature calculation module, a second adjacency matrix of the sentence to be tested is constructed according to the syntactic distance data, and the graph convolution features of the sentence to be tested and the second adjacency matrix are input into the hierarchical selection attention network layer in the neural network model to obtain the interaction aspect features of the sentence to be tested. The graph convolution features and the interaction aspect features of the sentence to be tested are input into the information fusion layer of the neural network model through the feature fusion module to obtain the fusion features of the sentence to be tested. The fusion features of the sentence to be tested are input into the emotion analysis layer of the neural network model through the emotion analysis module to obtain the emotion analysis result of the sentence to be tested. By extending the neural network model to the relations among different aspect words and combining the information of the other aspect words in the sentence when predicting the emotion tendency of the current aspect word, the extraction of global information is improved and emotion analysis is performed on the sentence more comprehensively, thereby improving the accuracy and stability of sentence emotion analysis.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a computer device according to an embodiment of the present application, where the computer device 9 includes: a processor 91, a memory 92, and a computer program 93 stored on the memory 92 and executable on the processor 91; the computer device may store a plurality of instructions adapted to be loaded and executed by the processor 91 to perform the method steps of fig. 1 to 7, and the specific implementation procedure may be referred to in the specific description of fig. 1 to 7, which is not repeated herein.
The processor 91 may include one or more processing cores. The processor 91 performs various functions of the sentence emotion analysis device 8 based on interaction aspect information fusion and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 92 and calling data in the memory 92, using various interfaces and lines connecting the parts within the server. Optionally, the processor 91 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA) and programmable logic array (Programmable Logic Array, PLA). The processor 91 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem and the like. The CPU mainly handles the operating system, the user interface, application programs and the like; the GPU is used for rendering and drawing the content to be displayed on the touch display screen; the modem is used to handle wireless communications. It will be appreciated that the modem may also not be integrated into the processor 91 and may instead be implemented by a single chip.
The memory 92 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory, ROM). Optionally, the memory 92 includes a non-transitory computer-readable storage medium. The memory 92 may be used to store instructions, programs, code, a code set or an instruction set. The memory 92 may include a stored program area and a stored data area, where the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as touch instructions, etc.), instructions for implementing the various method embodiments described above, and the like; the stored data area may store the data referred to in the above method embodiments, and the like. Optionally, the memory 92 may also be at least one storage device located remotely from the aforementioned processor 91.
The embodiment of the present application further provides a storage medium, where the storage medium may store a plurality of instructions, where the instructions are suitable for being loaded by a processor and executed by the processor, and the specific execution process may refer to the specific description of fig. 1 to 7, and will not be described herein.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code which may be in source code form, object code form, executable file or some intermediate form etc.
The present invention is not limited to the above-described embodiments; if various modifications or variations of the present invention do not depart from the spirit and scope of the present invention, the present invention is intended to include such modifications and variations insofar as they fall within the scope of the claims and their equivalents.

Claims (9)

1. A sentence emotion analysis method based on interaction aspect information fusion is characterized by comprising the following steps:
obtaining a sentence to be tested, wherein the sentence to be tested comprises a plurality of words, the sentence to be tested is sent to a coding layer in a preset neural network model, the bidirectional features of the plurality of words of the sentence to be tested are obtained, the bidirectional features of the plurality of words of the sentence to be tested are combined, and the sentence bidirectional features of the sentence to be tested are obtained, wherein the words comprise aspect words;
constructing a dependency syntax tree of the sentence to be tested, and constructing a first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree; inputting the sentence bidirectional features of the sentence to be tested and the first adjacency matrix into a graph convolutional network layer in the neural network model, wherein the graph convolutional network layer is a multi-layer graph convolutional network layer, and obtaining the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer;
Obtaining the syntactic distance data between different nodes of the dependency syntactic tree of the statement to be tested, and constructing a second adjacency matrix of the statement to be tested according to the syntactic distance data;
inputting the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer into a hierarchical selection attention network layer in the neural network model, and performing mask processing and average pooling processing on the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer to obtain the first average pooling features corresponding to the plurality of layers of the sentence to be tested, specifically as follows:
$M_t = \begin{cases} 1, & \tau \le t < \tau + m \\ 0, & \text{otherwise} \end{cases}, \qquad H_a^{(k)} = \mathrm{AveragePooling}\big(M \odot H^{(k)}\big)$
where $M_t$ is the mask parameter, $\tau$ is the starting position of the aspect word, $m$ is the number of aspect words, $n$ is the number of words, $t$ is the position index of a word and $h_t^{(k)}$ denotes the feature of the $t$-th word, $H_a^{(k)}$ is the first average pooling feature corresponding to the $k$-th layer for the sentence to be tested, and $\mathrm{AveragePooling}(\cdot)$ is the average pooling function;
according to a preset splicing algorithm, performing splicing processing on the first average pooling features corresponding to the plurality of layers of the sentence to be tested to obtain the hierarchical splicing feature of the sentence to be tested, wherein the splicing algorithm is as follows:
$H_{cat} = \mathrm{Concat}\big(H_a^{(1)}, H_a^{(2)}, \ldots, H_a^{(K)}\big)$
where $H_{cat}$ is the hierarchical splicing feature of the sentence to be tested and $\mathrm{Concat}(\cdot)$ is the splicing function;
multiplying the hierarchical splicing feature of the sentence to be tested by a preset second training parameter to obtain the attention parameter of the sentence to be tested, and obtaining the hierarchy selection feature of the sentence to be tested according to the attention parameter of the sentence to be tested, the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolutional network layer, and a preset hierarchy selection algorithm, wherein, in the hierarchy selection algorithm,
$H_{sel}$ is the hierarchy selection feature of the sentence to be tested, $a$ is the attention parameter of the sentence to be tested, $W_3$ is the preset third training parameter, and $H^{(k)}$ is the graph convolution feature of the sentence to be tested output by the $k$-th layer of the multi-layer graph convolutional network layer;
obtaining the interaction aspect features of the sentence to be tested according to the hierarchy selection feature of the sentence to be tested, the second adjacency matrix and a preset second graph convolution feature calculation algorithm, wherein the second graph convolution feature calculation algorithm is:
$H_{int} = f_2\big(A^{(2)}, H_{sel}\big)$
where $H_{int}$ is the interaction aspect feature of the sentence to be tested output by the multi-layer graph convolutional network layer, $A^{(2)}$ is the second adjacency matrix of the sentence to be tested, and $f_2(\cdot)$ is the second graph convolution feature calculation function;
inputting the graph convolution characteristics and the interaction aspect characteristics of the statement to be tested into an information fusion layer of the neural network model to obtain fusion characteristics of the statement to be tested;
and inputting the fusion characteristics of the statement to be tested into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the statement to be tested.
2. The sentence emotion analysis method based on interaction aspect information fusion according to claim 1, characterized in that: the coding layer comprises a word embedding module and a semantic information coding module;
the step of inputting the sentence to be tested into the coding layer in the preset neural network model to obtain the bidirectional features of the plurality of words of the sentence to be tested comprises the following steps (a code sketch follows this claim):
inputting the sentence to be tested into a word embedding module in the coding layer to obtain a word embedding matrix of the sentence to be tested, and splicing the word embedding matrix with a preset tag matrix to obtain an input matrix of the sentence to be tested, wherein the word embedding matrix comprises word embedding vectors of all words in the sentence to be tested, and the tag matrix comprises tag embedding vectors corresponding to all words in the sentence to be tested;
Inputting the input matrix of the sentence to be tested into a semantic information coding module in the coding layer for coding, obtaining the forward features and the backward features of each word in the sentence to be tested, and splicing to obtain the bidirectional features of a plurality of words in the sentence to be tested.
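As an illustration of the coding layer of claim 2, the following is a minimal PyTorch sketch: a word-embedding module, a tag/label embedding spliced onto it, and a BiLSTM as the semantic-information coding module. The class name, the embedding dimensions, and the choice of an LSTM cell are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class EncodingLayer(nn.Module):
    """Word-embedding module followed by a BiLSTM semantic-information coding module."""

    def __init__(self, vocab_size, tag_size, emb_dim, tag_dim, hidden_dim):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)   # word embedding matrix
        self.tag_emb = nn.Embedding(tag_size, tag_dim)      # tag/label embedding matrix
        # the forward and backward features of every word are concatenated by the BiLSTM
        self.bilstm = nn.LSTM(emb_dim + tag_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, word_ids, tag_ids):
        # splice word embeddings with tag embeddings to form the input matrix
        x = torch.cat([self.word_emb(word_ids), self.tag_emb(tag_ids)], dim=-1)
        h, _ = self.bilstm(x)            # (batch, n_words, 2 * hidden_dim)
        return h                         # bidirectional features of the words
```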
3. The sentence emotion analysis method based on interaction aspect information fusion according to claim 2, wherein the step of constructing the dependency syntax tree of the sentence to be tested and constructing the first adjacency matrix of the sentence to be tested according to the sentence bidirectional features of the sentence to be tested and the dependency syntax tree comprises the following steps (a code sketch follows this claim):
acquiring an initial dependency syntax tree, wherein the initial dependency syntax tree comprises a plurality of nodes, each word in the statement to be tested is respectively arranged at each node of the initial dependency syntax tree, dependency relationship types among different nodes in the dependency syntax tree of the statement to be tested are acquired, and the dependency syntax tree of the statement to be tested is constructed;
obtaining dependency relation embedded vectors among different nodes in the dependency syntax tree of the statement to be tested according to the dependency relation type, and obtaining bidirectional features of words corresponding to all nodes in the dependency syntax tree of the statement to be tested according to the sentence bidirectional features of the statement to be tested;
According to dependency relation embedding vectors among different nodes in the dependency syntax tree of the statement to be tested, bidirectional features of words corresponding to the nodes and a preset attention score calculation algorithm, calculating attention scores among the different nodes of the dependency syntax tree of the statement to be tested, wherein the attention score calculation algorithm is as follows:
In the formula, $s_{ij}$ is the attention score between the $i$-th node and the $j$-th node of the dependency syntax tree, $LeakyReLU(\cdot)$ is the LeakyReLU function, $h_i$ is the bidirectional feature of the word corresponding to the $i$-th node, $W_1$ is the first training parameter, $\cdot$ denotes the dot product, and $r_{ij}$ is the dependency-relation embedding vector between the $i$-th node and the $j$-th node of the dependency syntax tree;
obtaining attention probabilities among different nodes of the dependency syntax tree of the statement to be tested according to the attention scores among different nodes of the dependency syntax tree of the statement to be tested and a preset normalization algorithm, and constructing a first adjacency matrix of the statement to be tested, wherein the normalization algorithm is as follows:
In the formula, $\alpha_{ij}=\dfrac{\exp(s_{ij})}{\sum_{x\in N_i}\exp(s_{ix})}$ is the attention probability between the $i$-th node and the $j$-th node of the dependency syntax tree, $N_i$ is the set of nodes connected to the $i$-th node, and $x$ indexes the nodes connected to the $i$-th node in the dependency syntax tree.
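A sketch of how the first adjacency matrix of claim 3 could be assembled is given below. The score form s_ij = LeakyReLU((h_i W1) · r_ij) is an assumption read off the symbol definitions above, and the function name, tensor shapes, and masking of non-neighbours are illustrative choices, not the patented formula.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_first_adjacency(h, rel_emb, neighbor_mask, w1):
    """
    h:             (n_words, d)           bidirectional features of the tree nodes
    rel_emb:       (n_words, n_words, d)  dependency-relation embeddings r_ij
    neighbor_mask: (n_words, n_words)     1 where node j is connected to node i
    w1:            nn.Linear(d, d)        stand-in for the first training parameter W1
    """
    hw = w1(h)                                                    # (n_words, d)
    # assumed score: s_ij = LeakyReLU( (h_i W1) . r_ij )
    scores = F.leaky_relu(torch.einsum('id,ijd->ij', hw, rel_emb))
    scores = scores.masked_fill(neighbor_mask == 0, float('-inf'))
    adj = F.softmax(scores, dim=-1)                               # attention probabilities
    return torch.nan_to_num(adj)                                  # isolated rows -> all zeros


# example: a three-word sentence with an arbitrary dependency tree
d = 8
w1 = nn.Linear(d, d)
h = torch.randn(3, d)
rel_emb = torch.randn(3, 3, d)
neighbor_mask = torch.tensor([[1, 1, 0], [1, 1, 1], [0, 1, 1]])
A1 = build_first_adjacency(h, rel_emb, neighbor_mask, w1)
```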
4. The sentence emotion analysis method based on interaction aspect information fusion according to claim 3, wherein the step of inputting the sentence bidirectional feature of the sentence to be tested and the first adjacency matrix into the graph convolution network layer in the neural network model, the graph convolution network layer being a multi-layer graph convolution network layer, and obtaining the graph convolution feature of the sentence to be tested output by each layer of the multi-layer graph convolution network layer comprises the following steps (a code sketch follows this claim):
obtaining the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolution network layer according to the sentence bidirectional features of the sentence to be tested, the first adjacency matrix and a preset first graph convolution feature calculation algorithm, wherein the first graph convolution feature calculation algorithm is as follows:
In the formula, $H^{(k)}$ is the graph convolution feature of the sentence to be tested output by the $k$-th layer of the multi-layer graph convolution network layer, $H$ is the sentence bidirectional feature of the sentence to be tested, $A$ is the first adjacency matrix of the sentence to be tested, $f_1(\cdot)$ is the first graph convolution feature calculation function, and $k$ indexes the layers of the multi-layer graph convolution network layer.
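The multi-layer graph convolution of claim 4 can be sketched as the standard GCN recursion H^(k) = f1(A · H^(k-1) · W^(k)) with H^(0) = H; the per-layer weight W^(k) and the ReLU activation standing in for the first graph convolution feature calculation function are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiLayerGCN(nn.Module):
    """K graph convolution layers; every layer's output is kept for layer selection."""

    def __init__(self, dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_layers)])

    def forward(self, h, adj):
        # h:   (batch, n_words, dim)      sentence bidirectional features H
        # adj: (batch, n_words, n_words)  first adjacency matrix A
        outputs = []
        for layer in self.layers:
            h = F.relu(torch.bmm(adj, layer(h)))   # H^(k) = ReLU(A H^(k-1) W^(k))
            outputs.append(h)
        return outputs                             # graph convolution feature of every layer
```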
5. The sentence emotion analysis method based on interaction aspect information fusion according to claim 4, characterized in that: the words include context words, and the second adjacency matrix is specifically as follows:
In the formula, $D_{ij}$ is the syntactic distance datum between the $i$-th node and the $j$-th node of the dependency syntax tree, $p$ is the position index of the aspect word, $w_j$ is the word at the $j$-th node, which is either the current aspect word, another aspect word, or otherwise a context word, and $len$ is the sentence length of the sentence to be tested.
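Claim 5 builds the second adjacency matrix from syntactic-distance data between aspect words; since the piecewise formula itself is not reproduced above, the sketch below uses an assumed distance weighting 1 − |i − j| / len between aspect tokens purely for illustration, with context words left unconnected.

```python
import torch


def build_second_adjacency(aspect_positions, sent_len):
    """
    aspect_positions: token indices belonging to aspect words (all aspects in the sentence)
    sent_len:         number of tokens in the sentence (len in the claim)
    Illustrative rule only: connect every pair of aspect tokens with a weight that
    decays with their distance; context words keep weight 0.
    """
    adj = torch.zeros(sent_len, sent_len)
    for i in aspect_positions:
        for j in aspect_positions:
            # assumed weighting 1 - |i - j| / len; the patent's own piecewise
            # definition is given by its (unreproduced) formula
            adj[i, j] = 1.0 - abs(i - j) / sent_len
    return adj


# example: aspect words at positions 1 and 5 of an 8-token sentence
A2 = build_second_adjacency([1, 5], 8)
```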
6. The sentence emotion analysis method based on interaction aspect information fusion of claim 5, wherein the inputting the graph convolution feature and interaction aspect feature of the sentence to be tested into the information fusion layer of the neural network model to obtain the fusion feature of the sentence to be tested includes the steps of:
obtaining the fusion feature of the sentence to be tested according to the graph convolution feature of the sentence to be tested, the interaction-aspect feature of the sentence to be tested and a preset feature fusion algorithm (a code sketch follows this claim), wherein the feature fusion algorithm is as follows:
In the formula, $H_{fuse}$ is the fusion feature of the sentence to be tested, and $\gamma$ is a preset hyper-parameter.
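The feature-fusion algorithm of claim 6 is not reproduced above; a common reading of a single hyper-parameter γ combining two feature tensors is a convex combination, sketched below as an assumption.

```python
def fuse_features(h_gcn, h_interact, gamma=0.5):
    """Assumed fusion: convex combination of the graph convolution feature and the
    interaction-aspect feature, weighted by the preset hyper-parameter gamma."""
    return gamma * h_gcn + (1.0 - gamma) * h_interact
```

Other fusions (for example a gated sum, or concatenation followed by a linear layer) would fit the same claim language; the convex combination is only the simplest reading.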
7. The sentence emotion analysis method based on interaction aspect information fusion according to claim 6, characterized in that the emotion analysis layer comprises a pooling layer and an activation layer, and the step of inputting the fusion feature of the sentence to be tested into the emotion analysis layer of the neural network model to obtain the emotion analysis result of the sentence to be tested comprises the following steps (a code sketch follows this claim):
Inputting the fusion characteristics of the statement to be tested into the pooling layer for mask processing and average pooling processing, and obtaining second average pooling characteristics of the statement to be tested, wherein the second average pooling characteristics are specifically as follows:
In the formula, $h_{avg}$ is the second average pooling feature of the sentence to be tested;
inputting the second average pooling feature of the sentence to be tested into the activation layer, obtaining emotion classification polarity probability distribution vectors according to the second average pooling feature of the sentence to be tested and a preset emotion analysis algorithm, obtaining emotion polarities corresponding to the dimension with the maximum probability according to the emotion classification polarity probability distribution vectors, and taking the emotion polarities as emotion analysis results of the sentence to be tested, wherein the emotion analysis algorithm is as follows:
In the formula, $y=softmax(W_4\,h_{avg}+b_4)$ is the emotion classification polarity probability distribution vector, $W_4$ is the preset fourth trainable parameter, and $b_4$ is the preset bias parameter.
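A minimal sketch of the emotion analysis layer of claim 7 follows: aspect masking and average pooling in the pooling layer, then a linear map (standing in for the fourth trainable parameter and the bias parameter) followed by softmax, with the predicted polarity taken as the dimension of maximum probability. The module name, the tensor shapes, and the three-way polarity output are illustrative assumptions.

```python
import torch
import torch.nn as nn


class EmotionAnalysisLayer(nn.Module):
    """Pooling layer (aspect mask + average pooling) followed by a softmax classifier."""

    def __init__(self, dim, num_polarities=3):
        super().__init__()
        # w4 stands in for the fourth trainable parameter; its bias for the bias parameter
        self.w4 = nn.Linear(dim, num_polarities)

    def forward(self, h_fused, aspect_mask):
        # h_fused:     (batch, n_words, dim)  fusion features of the sentence
        # aspect_mask: (batch, n_words)       1 on aspect tokens, 0 elsewhere
        masked = h_fused * aspect_mask.unsqueeze(-1)
        pooled = masked.sum(dim=1) / aspect_mask.sum(dim=1, keepdim=True).clamp(min=1)
        probs = torch.softmax(self.w4(pooled), dim=-1)   # polarity probability distribution
        return probs.argmax(dim=-1), probs               # predicted polarity and distribution
```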
8. Sentence emotion analysis device based on interaction aspect information fusion, characterized by comprising:
the sentence bidirectional feature calculation module is used for obtaining a sentence to be detected, wherein the sentence to be detected comprises a plurality of words, the sentence to be detected is sent to a coding layer in a preset neural network model, the bidirectional features of the plurality of words of the sentence to be detected are obtained, the bidirectional features of the plurality of words of the sentence to be detected are combined, and the sentence bidirectional features of the sentence to be detected are obtained, wherein the words comprise aspect words;
the graph convolution feature calculation module is used for constructing a dependency syntax tree of the sentence to be tested, and constructing a first adjacency matrix of the sentence to be tested according to the sentence bidirectional feature of the sentence to be tested and the dependency syntax tree; inputting the sentence bidirectional feature of the sentence to be tested and the first adjacency matrix into a graph convolution network layer in the neural network model, wherein the graph convolution network layer is a multi-layer graph convolution network layer, and obtaining the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolution network layer;
the interactive aspect characteristic calculation module is used for acquiring the syntactic distance data between different nodes of the dependency syntactic tree of the statement to be tested and constructing a second adjacency matrix of the statement to be tested according to the syntactic distance data;
inputting the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolution network layer into a layer-selection attention network layer in the neural network model, and performing mask processing and average pooling processing on the graph convolution features output by each layer to obtain the first average pooling features corresponding to the plurality of layers for the sentence to be tested, wherein the first average pooling features are specifically as follows:
In the formula, $M_t$ is the mask parameter for the $t$-th word, $\tau$ is the starting position of the aspect word, $m$ is the number of aspect terms, $n$ is the number of words, and $t$ is the position index of a word; $h_{avg}^{(k)}$ is the first average pooling feature corresponding to the $k$-th layer of the sentence to be tested; $AveragePooling(\cdot)$ is the average pooling function;
according to a preset splicing algorithm, splicing the first average pooling features corresponding to the plurality of layers of the sentence to be tested to obtain the hierarchical splicing feature of the sentence to be tested, wherein the splicing algorithm is as follows:
In the formula, $H_{cat}=Concat\big(h_{avg}^{(1)},h_{avg}^{(2)},\ldots,h_{avg}^{(K)}\big)$ is the hierarchical splicing feature of the sentence to be tested, and $Concat(\cdot)$ is the splicing (concatenation) function;
multiplying the hierarchical splicing feature of the sentence to be tested by a preset second training parameter to obtain the attention parameter of the sentence to be tested, and obtaining the hierarchy-selection feature of the sentence to be tested according to the attention parameter of the sentence to be tested, the graph convolution features of the sentence to be tested output by each layer of the multi-layer graph convolution network layer, and a preset hierarchy-selection algorithm, wherein the hierarchy-selection algorithm is as follows:
In the formula, $H_{sel}$ is the hierarchy-selection feature of the sentence to be tested, $\alpha$ is the attention parameter of the sentence to be tested, and $W_3$ is the preset third training parameter;
obtaining the interaction-aspect feature of the sentence to be tested according to the graph convolution features of the sentence to be tested, the second adjacency matrix and a preset second graph convolution feature calculation algorithm, wherein the second graph convolution feature calculation algorithm is as follows:
In the formula, $H_{int}$ is the interaction-aspect feature of the sentence to be tested output by the multi-layer graph convolution network layer, $A^{(2)}$ is the second adjacency matrix of the sentence to be tested, and $f_2(\cdot)$ is the second graph convolution feature calculation function;
the feature fusion module is used for inputting the graph convolution features and the interaction aspect features of the statement to be tested into the information fusion layer of the neural network model to obtain fusion features of the statement to be tested;
and the emotion analysis module is used for inputting the fusion characteristics of the statement to be detected into an emotion analysis layer of the neural network model to obtain an emotion analysis result of the statement to be detected.
9. A computer device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor; the computer program, when executed by the processor, implements the steps of the interactive aspect information fusion-based sentence emotion analysis method as recited in any one of claims 1 to 7.
CN202211296582.3A 2022-10-21 2022-10-21 Sentence emotion analysis method and device based on interaction aspect information fusion Active CN115906861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211296582.3A CN115906861B (en) 2022-10-21 2022-10-21 Sentence emotion analysis method and device based on interaction aspect information fusion


Publications (2)

Publication Number Publication Date
CN115906861A CN115906861A (en) 2023-04-04
CN115906861B true CN115906861B (en) 2023-09-26

Family

ID=86475359


Country Status (1)

Country Link
CN (1) CN115906861B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116822522B (en) * 2023-06-13 2024-05-28 连连银通电子支付有限公司 Semantic analysis method, semantic analysis device, semantic analysis equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111444338A (en) * 2020-02-28 2020-07-24 腾讯科技(深圳)有限公司 Text processing device, storage medium and equipment
CN112528672A (en) * 2020-12-14 2021-03-19 北京邮电大学 Aspect-level emotion analysis method and device based on graph convolution neural network
CN113535904A (en) * 2021-07-23 2021-10-22 重庆邮电大学 Aspect level emotion analysis method based on graph neural network
CN115048938A (en) * 2022-06-13 2022-09-13 华南师范大学 Statement emotion analysis method and device based on semantic and syntax dual channels
CN115204183A (en) * 2022-09-19 2022-10-18 华南师范大学 Knowledge enhancement based dual-channel emotion analysis method, device and equipment


Also Published As

Publication number Publication date
CN115906861A (en) 2023-04-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant