CN113705243A - Emotion analysis method

Emotion analysis method

Info

Publication number
CN113705243A
CN113705243A (application CN202110997775.0A)
Authority
CN
China
Prior art keywords
training
emotion
text set
training text
neural network
Prior art date
Legal status
Pending
Application number
CN202110997775.0A
Other languages
Chinese (zh)
Inventor
罗瑜 (Luo Yu)
吴晓华 (Wu Xiaohua)
令狐阳 (Linghu Yang)
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202110997775.0A
Publication of CN113705243A
Legal status: Pending (current)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G06F40/20 Natural language analysis
    • G06F40/237 Lexical tools
    • G06F40/242 Dictionaries
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/044 Recurrent networks, e.g. Hopfield networks
    • G06N3/048 Activation functions
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent


Abstract

The invention discloses an emotion analysis method. A training text set is obtained and preprocessed into a set of training emotion score vectors; a preset neural network is trained on the training emotion score vector set together with a preset set of semantic word vectors; and the trained network is then used to perform emotion analysis on the text to be analyzed. The preprocessing consists of segmenting the training texts into words, cutting or padding the segmented texts to a preset length to obtain a standard training text set, and using an emotion dictionary to map the standard training text set to the training emotion score vector set. By combining an emotion dictionary with a neural network, the method captures the specific meaning of the analyzed text, analyzes irregular texts effectively, and achieves high accuracy and high operating efficiency.

Description

Emotion analysis method
Technical Field
The invention belongs to the technical field of artificial intelligence, and particularly relates to an emotion analysis method.
Background
With the rise of social media, a large amount of information expressing people's emotions and tendencies is generated in information interactions such as microblogs and comments. Analyzing this emotional information makes it possible to better understand user preferences and to predict how events will develop. Emotion analysis has become one of the most active research areas in natural language processing and is widely applied in marketing and public opinion analysis.
Current emotion analysis methods fall into two categories: methods based on emotion dictionaries and methods based on machine learning. Dictionary-based methods rely on dictionaries and rules: they locate emotion words in the text and compute an emotional tendency score for each sentence. They apply to a wide range of corpora but are limited by the quality and coverage of the emotion dictionary. Machine learning methods further divide into feature-based methods and deep learning methods. Feature-based methods select features from a large corpus to represent the text and then apply classifiers such as support vector machines (SVM) or decision trees; they are limited by the choice of features. Deep learning methods perform emotion analysis by training a classifier on a training set, avoiding manual feature extraction. They handle contextual dependencies in text effectively, have strong discrimination and feature self-learning capabilities, suit high-dimensional, unlabeled, large-scale data, and are currently the most widely used approach.
However, while the prior art reduces the dependence on feature engineering and linguistic knowledge, it neglects the specific meaning of the text, resulting in low emotion analysis accuracy.
Therefore, how to improve the accuracy of emotion analysis is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The invention aims to solve the technical problem of low emotion analysis accuracy in the prior art, and provides an emotion analysis method.
The technical scheme of the invention is as follows: an emotion analysis method, comprising the steps of:
S1, obtaining a training text set, and preprocessing the training text set to obtain a training emotion score vector set;
S2, training a preset neural network based on the training emotion score vector set and a preset semantic word vector set;
and S3, performing emotion analysis on the text to be analyzed based on the trained neural network.
Further, the step S1 of preprocessing the training text set to obtain the training emotion score vector specifically includes the following sub-steps:
S11, performing word segmentation processing on the training text set;
S12, performing length cutting or correction on the training text set subjected to word segmentation processing based on the preset text length to obtain a standard training text set;
and S13, determining a training emotion score vector set corresponding to the standard training text set based on the emotion dictionary.
Further, the step S12 specifically includes the following sub-steps:
S121, cutting the training texts with the length being larger than the preset length in the training text set after word segmentation;
S122, filling the front ends of the training texts with the length smaller than the preset length in the training text set after word segmentation processing with 0;
and S123, taking the training text set processed in the step S121 and the step S122 as a standard training text set.
Further, the step S13 specifically includes the following sub-steps:
S131, matching and judging each word in the standard training text set with the emotion dictionary, if the matching is successful, executing a step S132, and if the matching is unsuccessful, executing a step S133;
S132, replacing the successfully matched words in the standard training text set with corresponding emotion polarity scores, and then entering the step S134;
S133, replacing the words which are unsuccessfully matched in the standard training text set with 0, and then entering the step S134;
and S134, taking the text set obtained by replacing all the words in the standard training text set as a training emotion score vector set.
Further, the step S2 specifically includes the following sub-steps:
S21, taking the training emotion score vector set as the input of an encoder of the neural network, and obtaining a first feature vector sequence;
S22, taking the preset semantic word vector set as the input of the encoder, and obtaining a second feature vector sequence;
S23, splicing the first feature vector sequence and the second feature vector sequence to obtain a third feature vector sequence, taking the third feature vector sequence as the input of a decoder of the neural network, and ending the decoding process when an ending identifier is decoded;
S24, training the neural network by using a cross entropy loss function on the basis of the steps S21 to S23 to obtain the trained neural network.
Further, the step S3 specifically includes the following sub-steps:
S31, preprocessing the text to be analyzed to obtain an emotion score vector;
S32, taking the emotion score vector and the preset semantic word vector set as input of the trained neural network, and obtaining a corresponding output score;
and S33, obtaining an emotion analysis result based on the output score.
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention obtains a training text set and preprocesses it into a set of training emotion score vectors, trains a preset neural network on the training emotion score vector set together with a preset semantic word vector set, and performs emotion analysis on the text to be analyzed with the trained network. The preprocessing specifically comprises segmenting the training texts into words, cutting or padding the segmented texts to a preset length to obtain a standard training text set, and determining the training emotion score vector set corresponding to the standard training text set based on an emotion dictionary. By combining an emotion dictionary with a neural network, the method captures the specific meaning of the analyzed text, analyzes irregular texts effectively, and achieves both high emotion analysis accuracy and high operating efficiency.
(2) By combining emotion score vectors with preset semantic word vectors, the emotion score vector captures the precise emotional tendency of each word, while the neural network effectively captures long-distance semantics and better learns the semantic dependencies among words, yielding more accurate text emotion analysis results.
Drawings
FIG. 1 is a schematic flow chart of an emotion analysis method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating the determination of the first feature vector sequence and the second feature vector sequence according to the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic flow chart of an emotion analysis method provided in an embodiment of the present application, where the method includes the following steps:
Step S1, acquiring a training text set, and preprocessing the training text set to obtain a training emotion score vector set.
In this embodiment of the application, preprocessing the training text set in step S1 to obtain the training emotion score vectors specifically includes the following sub-steps:
S11, performing word segmentation processing on the training text set;
S12, performing length cutting or correction on the training text set subjected to word segmentation processing based on the preset text length to obtain a standard training text set;
and S13, determining a training emotion score vector set corresponding to the standard training text set based on the emotion dictionary.
In this embodiment, the step S12 specifically includes the following sub-steps:
S121, cutting the training texts with the length being larger than the preset length in the training text set after word segmentation;
S122, filling the front ends of the training texts with the length smaller than the preset length in the training text set after word segmentation processing with 0;
and S123, taking the training text set processed in the step S121 and the step S122 as a standard training text set.
In this embodiment, the step S13 specifically includes the following sub-steps:
S131, matching and judging each word in the standard training text set with the emotion dictionary, if the matching is successful, executing a step S132, and if the matching is unsuccessful, executing a step S133;
S132, replacing the successfully matched words in the standard training text set with corresponding emotion polarity scores, and then entering the step S134;
S133, replacing the words which are unsuccessfully matched in the standard training text set with 0, and then entering the step S134;
and S134, taking the text set obtained by replacing all the words in the standard training text set as a training emotion score vector set.
Step S2, training a preset neural network based on the training emotion score vector set and a preset semantic word vector set.
In a specific application scenario, the simulation experiment uses 39661 samples of Chinese text, of which 15510 are positive and 24151 negative. 3967 items are randomly drawn from the corpus as test data (a 9:1 training:test split), and the rest serve as training data. The test data contains 1586 positive and 2381 negative items.
The jieba word segmentation tool is used to perform Chinese word segmentation on the 39661 texts.
A length analysis of the 39661 texts yields the length distribution of the corpus; overly long texts are removed, and a length value that covers 95% of the texts is selected as the cutting length.
Texts shorter than the cutting length are padded with 0 at the front end, and texts longer than the cutting length are cut to it.
In the embodiment of the invention, the lengths of the 39661 texts are approximately normally distributed, the longest text has 137 words, and a length of 97 covers 95% of the texts, which effectively avoids the influence of noise.
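As a minimal sketch of this preprocessing (assuming jieba for segmentation; the cutting length of 97 follows the analysis above, while the variable names and toy corpus are illustrative only):

```python
import jieba

MAX_LEN = 97  # cutting length that covers 95% of the corpus, per the analysis above

def normalize(text):
    """Segment a Chinese text, truncate to MAX_LEN, and front-pad with 0."""
    words = list(jieba.cut(text))  # word segmentation
    words = words[:MAX_LEN]        # cut texts longer than the preset length
    return [0] * (MAX_LEN - len(words)) + words  # fill the front end with 0

raw_texts = ["今天天气真好", "服务太差了"]  # hypothetical stand-in for the 39661 texts
standard_texts = [normalize(t) for t in raw_texts]
```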
Each word is matched against the emotion dictionary: if the word is present, it is replaced with its emotion polarity score from the dictionary; otherwise it is replaced with 0. The conversion is:

S = (v1, v2, …, vn)
Sout = (g1, g2, …, gn)

where S is the original sentence, vn is the n-th word of the sentence after word segmentation, Sout is the converted sentence, and gn is the emotion score in the emotion dictionary corresponding to each vn.
In the embodiment of the invention, because the corpus consists of microblog text, the emotion dictionary used is a microblog emotion dictionary obtained by applying the SO-PMI algorithm to a large amount of microblog data.
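A sketch of the score replacement described above; the emotion_dict entries here are illustrative stand-ins for the SO-PMI microblog dictionary, which is not reproduced in the patent:

```python
# hypothetical excerpt of an SO-PMI-style emotion dictionary: word -> polarity score
emotion_dict = {"好": 1.0, "开心": 1.5, "差": -1.2}

def to_score_vector(words):
    """S = (v1, ..., vn) -> Sout = (g1, ..., gn): dictionary score, or 0 if absent."""
    return [emotion_dict.get(w, 0.0) for w in words]

s_out = to_score_vector(["今天", "天气", "真", "好"])  # -> [0.0, 0.0, 0.0, 1.0]
```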
In this embodiment, the step S2 specifically includes the following sub-steps:
S21, taking the training emotion score vector set as the input of an encoder of the neural network, and obtaining a first feature vector sequence;
S22, taking the preset semantic word vector set as the input of the encoder, and obtaining a second feature vector sequence;
S23, splicing the first feature vector sequence and the second feature vector sequence to obtain a third feature vector sequence, taking the third feature vector sequence as the input of a decoder of the neural network, and ending the decoding process when an ending identifier is decoded;
S24, training the neural network by using a cross entropy loss function on the basis of the steps S21 to S23 to obtain the trained neural network.
In a specific application scenario, the encoder of the neural network model comprises a multilayer perceptron (MLP) and a bidirectional long short-term memory network (BiLSTM), and the decoder adopts a multilayer perceptron.
The derivation of the first and second feature vector sequences is illustrated in FIG. 2, where the word vector representation is the preset semantic word vector, the emotional feature is the first feature vector sequence, and the semantic feature is the second feature vector sequence. The first feature vector sequence may be obtained as follows:
1. Construct a multilayer perceptron from three fully connected layers with 256, 128 and 16 units respectively, each using the ReLU activation function.
2. Apply a Dropout algorithm with parameter p = 0.3 to each fully connected layer to avoid overfitting:

r = Bernoulli(p)
x'i = xi × r
z = wx'i + b
x = f(z)

where the Bernoulli(p) function generates a masking factor r with value 0 or 1 so that a given neuron stops working with probability p, xi is the activation value of the i-th unit, x'i is the output value of the i-th unit after masking by r, z is the output of the i-th unit, w is the weight, b is the bias, f(z) denotes the ReLU activation function, and x is its output. The training loss is then back-propagated through the thinned network; after each mini-batch of training samples, the parameters (w, b) of the neurons that were not stopped are updated by stochastic gradient descent.
3. The 39661 emotion score vectors are fed to the multilayer perceptron encoder, which outputs the encoded 16-dimensional feature vector sequence:

hout = f(wx + b)
hmlp = (h1, h2, …, h16)

where w is the weight of the perceptron, b is the bias, f is the activation function, and hmlp is the 16-dimensional hidden-layer feature vector sequence obtained after the emotion score vector is learned by the multilayer perceptron.
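A sketch of this MLP encoder in Keras (the layer widths 256/128/16, ReLU activations and Dropout p = 0.3 follow the description above; the input length and all names are assumptions):

```python
from tensorflow import keras
from tensorflow.keras import layers

MAX_LEN = 97  # assumed preset text length from the preprocessing step

# emotion score vector -> 16-dimensional emotion feature vector (hmlp)
score_input = keras.Input(shape=(MAX_LEN,), name="emotion_scores")
h = score_input
for units in (256, 128, 16):
    h = layers.Dense(units, activation="relu")(h)  # fully connected ReLU layer
    h = layers.Dropout(0.3)(h)                     # Dropout with p = 0.3
h_mlp = h
```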
The second feature vector sequence may be obtained as follows:
1. Train skip-gram on the microblog text corpus to obtain 300-dimensional pre-trained word vectors, i.e. the preset semantic word vector set (see the sketch after this list).
2. Construct a bidirectional LSTM network with 64 cells so that the whole model can learn the context information of the text. The output vectors of the forward and backward LSTM networks are concatenated as the final output of the BiLSTM network:

h→ = LSTM→(x, C)
h← = LSTM←(x, C)
hB = [h→, h←]

where h→ and h← denote the outputs of the forward LSTM and the backward LSTM respectively, hB denotes the output of the BiLSTM, h denotes the output of the hidden layer in the LSTM, x is the input vector, and C denotes the state of the cell in the LSTM.
3. The pre-trained 300-dimensional word vectors and the preprocessed text are fed to the bidirectional LSTM encoder, which outputs the encoded 16-dimensional feature vector sequence:

hBiLSTM = (h1, h2, …, h16)

where hBiLSTM is the 16-dimensional hidden-layer feature vector sequence obtained by bidirectional LSTM learning of the text with pre-trained word vectors, and hi denotes the output in the i-th dimension.
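A sketch of steps 1 and 2 of the list above: training skip-gram word vectors with gensim and building the BiLSTM branch in Keras. The patent does not state how the 128-dimensional output of a 64-cell bidirectional LSTM is reduced to 16 dimensions, so the final Dense projection is an assumption:

```python
from gensim.models import Word2Vec
from tensorflow import keras
from tensorflow.keras import layers

# step 1: skip-gram (sg=1) word vectors, 300-dimensional (toy corpus for illustration)
corpus = [["今天", "天气", "真", "好"], ["服务", "太", "差", "了"]]
w2v = Word2Vec(sentences=corpus, vector_size=300, sg=1, window=5, min_count=1)

# step 2: bidirectional LSTM with 64 cells; forward and backward outputs are concatenated
MAX_LEN, EMBED_DIM = 97, 300
word_input = keras.Input(shape=(MAX_LEN, EMBED_DIM), name="word_vectors")
h_b = layers.Bidirectional(layers.LSTM(64))(word_input)  # hB = [h_fwd, h_bwd]
h_bilstm = layers.Dense(16, activation="relu")(h_b)      # assumed projection to 16 dims
```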
Splicing the first feature vector sequence and the second feature vector sequence may specifically be as follows:
1. Construct a binary classification layer using a sigmoid activation function.
2. Connect the output of the MLP and the output of the BiLSTM in Concat fashion as input, combining the emotion score attribute of the emotion dictionary with the semantic representation learned from word vectors:

xinput = [hBiLSTM, hmlp]

where xinput is the 32-dimensional feature vector sequence obtained by concat splicing, hBiLSTM is the 16-dimensional semantic feature vector sequence obtained by bidirectional LSTM learning of the text with pre-trained word vectors, and hmlp is the 16-dimensional emotion feature vector sequence obtained by the multilayer perceptron from the emotion score vector that the emotion dictionary produces for the text.
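Continuing the sketches above (h_mlp, h_bilstm, score_input and word_input come from the previous blocks), the splicing and binary classification layer might look like this; the SGD optimizer follows the stochastic gradient descent mentioned earlier, everything else is an assumption:

```python
# xinput = [hBiLSTM, hmlp]: 32-dimensional spliced feature vector
x_input = layers.Concatenate()([h_bilstm, h_mlp])

# binary classification layer with sigmoid activation
output = layers.Dense(1, activation="sigmoid")(x_input)

model = keras.Model(inputs=[score_input, word_input], outputs=output)
model.compile(optimizer="sgd", loss="binary_crossentropy", metrics=["accuracy"])
```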
Step S3, performing emotion analysis on the text to be analyzed based on the trained neural network.
In this embodiment, the step S3 specifically includes the following sub-steps:
S31, preprocessing the text to be analyzed to obtain an emotion score vector;
S32, taking the emotion score vector and the preset semantic word vector set as input of the trained neural network, and obtaining a corresponding output score;
and S33, obtaining an emotion analysis result based on the output score.
It should be noted that the emotion analysis result takes one of two values, positive or negative, so this is in fact a binary classification problem. The output score is the probability that the text leans toward positive rather than negative emotion.
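Interpreting the output score could then look as follows (a continuation of the model sketch above; score_vecs and word_vecs stand for a preprocessed input pair, and the 0.5 decision threshold is an assumption, since the patent does not specify one):

```python
# score_vecs: (1, 97) emotion score vector; word_vecs: (1, 97, 300) word vectors (hypothetical)
score = float(model.predict([score_vecs, word_vecs])[0][0])  # probability of the positive class
label = "positive" if score >= 0.5 else "negative"
```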
In a specific application scenario, the text to be analyzed is first converted into an emotion score vector, which is then fed to the trained neural network to obtain the encoded 16-dimensional feature vector sequence, i.e. the first feature vector sequence:

hout = f(wx + b)
hmlp = (h1, h2, …, h16)

where w is the weight, b is the bias, f is the activation function, and hmlp is the 16-dimensional hidden-layer feature vector sequence of the emotion score vector after learning by the multilayer perceptron.
Then the preset semantic word vectors are fed to the trained bidirectional LSTM encoder, which outputs the encoded 16-dimensional feature vector sequence, i.e. the second feature vector sequence:

h→ = LSTM→(x, C)
h← = LSTM←(x, C)
hB = [h→, h←]
hBiLSTM = (h1, h2, …, h16)

where hBiLSTM is the 16-dimensional hidden-layer feature vector sequence obtained by bidirectional LSTM learning of the text with pre-trained word vectors, h→ and h← denote the outputs of the forward and backward LSTMs respectively, hB denotes the output of the BiLSTM, h denotes the output of the hidden layer in the LSTM, x is the input vector, and C denotes the state of the cell in the LSTM.
The first feature vector sequence and the second feature vector sequence are then spliced. In a specific application scenario:
1. Connect the output of the MLP with the 16-dimensional feature vector sequence output by the BiLSTM in Concat fashion to obtain a 32-dimensional feature vector sequence as input, combining the emotion score attribute of the emotion dictionary with the semantic representation learned from word vectors:

xinput = [hBiLSTM, hmlp]

where xinput is the 32-dimensional feature vector sequence obtained by concat splicing.
2. Measure the difference between the true distribution and the predicted distribution with a cross-entropy loss function:

L = -(1/n) Σi [xi log(zi) + (1 - xi) log(1 - zi)]

where x is the sequence of sample labels, z is the sequence of probabilities of the samples belonging to the positive class, xi is the label of the i-th sample, zi is the probability that the i-th sample belongs to the positive class, and n is the number of samples.
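A numeric sketch of this loss in NumPy (the example labels and probabilities are made up):

```python
import numpy as np

def cross_entropy(x, z, eps=1e-12):
    """-(1/n) * sum(xi*log(zi) + (1-xi)*log(1-zi)) over n samples."""
    x = np.asarray(x, dtype=float)
    z = np.clip(np.asarray(z, dtype=float), eps, 1 - eps)  # guard against log(0)
    return -np.mean(x * np.log(z) + (1 - x) * np.log(1 - z))

loss = cross_entropy([1, 0, 1], [0.9, 0.2, 0.7])  # ~0.228
```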
The accuracy, recall and F1 value of the emotion analysis method based on the combination of an emotion dictionary and a neural network provided by the invention are further illustrated by two specific experimental examples.
(1) Emotion score vector effects
In this experimental example, compared with the emotion dictionary method, the MLP method based on emotion score vectors improves the positive accuracy, the negative recall and the F value by 12.71%, 17.89% and 7.17% respectively, while its F value and recall on positive data are lower than those of the emotion dictionary method. This result is mainly due to the influence of partial double negation and degree adverbs: the emotion dictionary method performs emotion analysis with an emotion dictionary, a negation dictionary and a degree adverb dictionary simultaneously, whereas the MLP method uses only the emotion dictionary, so it misjudges some double-negation sentences but distinguishes ordinary negative sentences well. The recall and F value of the MLP method are therefore markedly improved on negative data, which shows that the invention can effectively learn the emotion score features of the emotion dictionary for emotion analysis. The comparison results are shown in Table 1:
TABLE 1 Emotion score vector Effect
(2) Overall model effect
Compared with the BiLSTM method using word vectors, the emotion analysis method based on the combination of an emotion dictionary and a neural network improves the overall accuracy of emotion analysis. On positive data, the accuracy and F value of the method improve on the BiLSTM method by 3.59% and 0.82%; on negative data, the recall and F value improve by 2.64% and 1.16%; and the overall accuracy improves by 1.16% over the BiLSTM method. The gains on positive and negative data coincide exactly with the areas where the emotion-score-vector MLP method improved on the emotion dictionary method, which shows that taking the emotion scores of the emotion dictionary as a vector representation of the text, learning it with the MLP, and adding the resulting hidden-layer representation to the hidden-layer representation of the pre-trained word vectors learned by the BiLSTM improves emotion analysis capability as expected. The comparison results are shown in Table 2:
TABLE 2 Overall model Emotion analysis results
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, which should not be construed as limited to these specific embodiments and examples. Those skilled in the art can make various other specific modifications and combinations based on the teachings of the present invention without departing from its essence, and such modifications and combinations remain within the scope of the invention.

Claims (6)

1. An emotion analysis method, comprising the steps of:
S1, obtaining a training text set, and preprocessing the training text set to obtain a training emotion score vector set;
S2, training a preset neural network based on the training emotion score vector set and a preset semantic word vector set;
and S3, performing emotion analysis on the text to be analyzed based on the trained neural network.
2. An emotion analysis method as claimed in claim 1, wherein, in said step S1, the preprocessing of said training text set to obtain a training emotion score vector specifically includes the following sub-steps:
S11, performing word segmentation processing on the training text set;
S12, performing length cutting or correction on the training text set subjected to word segmentation processing based on the preset text length to obtain a standard training text set;
and S13, determining a training emotion score vector set corresponding to the standard training text set based on the emotion dictionary.
3. An emotion analysis method as claimed in claim 2, wherein said step S12 includes the following sub-steps:
S121, cutting the training texts with the length being larger than the preset length in the training text set after word segmentation;
S122, filling the front ends of the training texts with the length smaller than the preset length in the training text set after word segmentation processing with 0;
and S123, taking the training text set processed in the step S121 and the step S122 as a standard training text set.
4. An emotion analysis method as claimed in claim 2, wherein said step S13 includes the following sub-steps:
S131, matching and judging each word in the standard training text set with the emotion dictionary, if the matching is successful, executing a step S132, and if the matching is unsuccessful, executing a step S133;
S132, replacing the successfully matched words in the standard training text set with corresponding emotion polarity scores, and then entering the step S134;
S133, replacing the words which are unsuccessfully matched in the standard training text set with 0, and then entering the step S134;
and S134, taking the text set obtained by replacing all the words in the standard training text set as a training emotion score vector set.
5. An emotion analysis method as claimed in claim 1, wherein said step S2 includes the following sub-steps:
S21, taking the training emotion score vector set as the input of an encoder of the neural network, and obtaining a first feature vector sequence;
S22, taking the preset semantic word vector set as the input of the encoder, and obtaining a second feature vector sequence;
S23, splicing the first feature vector sequence and the second feature vector sequence to obtain a third feature vector sequence, taking the third feature vector sequence as the input of a decoder of the neural network, and ending the decoding process when an ending identifier is decoded;
S24, training the neural network by using a cross entropy loss function on the basis of the steps S21 to S23 to obtain the trained neural network.
6. An emotion analysis method as claimed in claim 1, wherein said step S3 includes the following sub-steps:
S31, preprocessing the text to be analyzed to obtain an emotion score vector;
S32, taking the emotion score vector and the preset semantic word vector set as input of the trained neural network, and obtaining a corresponding output score;
and S33, obtaining an emotion analysis result based on the output score.
CN202110997775.0A 2021-08-27 2021-08-27 Emotion analysis method Pending CN113705243A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110997775.0A CN113705243A (en) 2021-08-27 2021-08-27 Emotion analysis method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110997775.0A CN113705243A (en) 2021-08-27 2021-08-27 Emotion analysis method

Publications (1)

Publication Number Publication Date
CN113705243A true CN113705243A (en) 2021-11-26

Family

ID=78656223

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110997775.0A Pending CN113705243A (en) 2021-08-27 2021-08-27 Emotion analysis method

Country Status (1)

Country Link
CN (1) CN113705243A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092596A (en) * 2017-04-24 2017-08-25 重庆邮电大学 Text emotion analysis method based on attention CNNs and CCR
CN108647219A (en) * 2018-03-15 2018-10-12 中山大学 A kind of convolutional neural networks text emotion analysis method of combination sentiment dictionary
CN108614875A (en) * 2018-04-26 2018-10-02 北京邮电大学 Chinese emotion tendency sorting technique based on global average pond convolutional neural networks
CN108763326A (en) * 2018-05-04 2018-11-06 南京邮电大学 A kind of sentiment analysis model building method of the diversified convolutional neural networks of feature based
CN108984523A (en) * 2018-06-29 2018-12-11 重庆邮电大学 A kind of comment on commodity sentiment analysis method based on deep learning model
CN109902177A (en) * 2019-02-28 2019-06-18 上海理工大学 Text emotion analysis method based on binary channels convolution Memory Neural Networks
CN111460146A (en) * 2020-03-23 2020-07-28 南京邮电大学 Short text classification method and system based on multi-feature fusion
CN111930940A (en) * 2020-07-30 2020-11-13 腾讯科技(深圳)有限公司 Text emotion classification method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIAOHUA WU ET AL.: "Sentiment Analysis of Weak-Rule Text Based on the Combination of Sentiment Lexicon and Neural Network", 《2021 IEEE 6TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYTICS》 *
刘树春 等著 [LIU Shuchun et al.]: "《深度实践OCR 基于深度学习的文字识别》" [Deep Practice of OCR: Text Recognition Based on Deep Learning], 31 May 2020 *

Similar Documents

Publication Publication Date Title
CN110609897B (en) Multi-category Chinese text classification method integrating global and local features
CN109933664B (en) Fine-grained emotion analysis improvement method based on emotion word embedding
CN111339305B (en) Text classification method and device, electronic equipment and storage medium
CN112199956B (en) Entity emotion analysis method based on deep representation learning
CN111401061A (en) Method for identifying news opinion involved in case based on BERT and BiLSTM-Attention
CN110362819B (en) Text emotion analysis method based on convolutional neural network
CN108170848B (en) Chinese mobile intelligent customer service-oriented conversation scene classification method
CN111078833A (en) Text classification method based on neural network
CN113505200B (en) Sentence-level Chinese event detection method combined with document key information
CN112667818A (en) GCN and multi-granularity attention fused user comment sentiment analysis method and system
CN112818698B (en) Fine-grained user comment sentiment analysis method based on dual-channel model
Shi et al. Chatgraph: Interpretable text classification by converting chatgpt knowledge to graphs
CN112749274A (en) Chinese text classification method based on attention mechanism and interference word deletion
CN113254637B (en) Grammar-fused aspect-level text emotion classification method and system
CN113987167A (en) Dependency perception graph convolutional network-based aspect-level emotion classification method and system
CN112287106A (en) Online comment emotion classification method based on dual-channel hybrid neural network
CN112988970A (en) Text matching algorithm serving intelligent question-answering system
CN115759119A (en) Financial text emotion analysis method, system, medium and equipment
CN114692623A (en) Emotion analysis method for environment network public sentiment
CN113486174B (en) Model training, reading understanding method and device, electronic equipment and storage medium
CN114547303A (en) Text multi-feature classification method and device based on Bert-LSTM
CN113486143A (en) User portrait generation method based on multi-level text representation and model fusion
CN115204143B (en) Method and system for calculating text similarity based on prompt
WO2023159759A1 (en) Model training method and apparatus, emotion message generation method and apparatus, device and medium
CN115906824A (en) Text fine-grained emotion analysis method, system, medium and computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211126