CN109992780B - Specific target emotion classification method based on deep neural network - Google Patents

Specific target emotion classification method based on deep neural network

Info

Publication number
CN109992780B
CN109992780B (application CN201910249992.4A)
Authority
CN
China
Prior art keywords
target
specific
specific target
word
word vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201910249992.4A
Other languages
Chinese (zh)
Other versions
CN109992780A (en
Inventor
谢金宝
王振东
马骏杰
战岭
吕世伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201910249992.4A priority Critical patent/CN109992780B/en
Publication of CN109992780A publication Critical patent/CN109992780A/en
Application granted granted Critical
Publication of CN109992780B publication Critical patent/CN109992780B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/279 Recognition of textual entities
    • G06F40/289 Phrasal analysis, e.g. finite state techniques or chunking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Abstract

The invention provides a specific target emotion classification method based on a deep neural network, belonging to the field of text emotion classification within natural language processing. First, the data set undergoes Chinese word segmentation, stop-word removal and punctuation removal; the word2vec algorithm is then used to train word vectors on the processed corpus. The training set is next input into a long short-term memory (LSTM) network model built on a target attention mechanism, in which specific-target and specific-aspect embeddings participate in the attention-weight training and the specific target is represented as a weighted sum of the specific-aspect embeddings. The model thereby gives more correct attention to the specific target and the specific aspect, better captures the target's true semantics, and ultimately improves the accuracy of specific-target emotion classification.

Description

Specific target emotion classification method based on deep neural network
Technical Field
The invention relates to comment text emotion classification, in particular to a deep neural network-based specific target emotion classification method, and belongs to the technical field of natural language processing.
Background
Emotion analysis methods fall mainly into three categories: rule-based methods, machine-learning methods and deep-neural-network methods. A rule-based method usually requires constructing an emotion dictionary or emotion collocation templates, and then estimates the emotional tendency of a text by matching the emotion words or fixed collocations contained in the comment text; however, building a reasonably complete emotion dictionary or set of collocation rules remains a major difficulty at present. Machine-learning methods extract features from and model a labelled training corpus, so that a learning algorithm can judge emotion polarity automatically; typical algorithms include support vector machines, naive Bayes, maximum entropy and conditional random fields. The classification quality of such methods, however, is largely determined by feature selection: manual feature selection introduces great uncertainty, the functions used in modelling are generally simple, deep features are hard to capture, and both the modelling capacity and the generalisation capacity are therefore greatly limited. With the development of deep learning and the growing freedom and diversity of language expression, deep neural network techniques have gradually become prominent and now constitute the mainstream technology in natural language processing. Compared with rule-based and machine-learning emotion analysis, a deep neural network, owing to the complexity of its model and functions, can capture more comprehensive and deeper text features in the face of today's complex and variable language phenomena; that is, it has a better comprehension of the text and achieves better results in the emotion analysis field.
The LSTM neural network model, also called the long short-term memory network model, is a variant of the RNN model. LSTM alleviates the vanishing- or exploding-information problem that arises when an RNN propagates information over long distances: on the basis of the RNN model, it adds gate structures to the neural network nodes to control how information flows at different moments. To control this flow, a memory cell is specially designed inside each internal node of the LSTM network, and the deletion or addition of information is governed by gates, a gate being a mechanism for letting information pass selectively. Each LSTM node carries three gate structures that protect and control the node's state: an input gate, a forgetting gate and an output gate. The attention mechanism derives from the way the human brain allocates more attention to the key parts of whatever it attends to; it was first applied in the field of visual images and later to natural language processing tasks, where it performs well.
Specific target emotion analysis is an important subtask of emotion analysis and a deeper form of it. Unlike ordinary emotion analysis, judging the emotion polarity of a specific target depends not only on the contextual information of the text but also on the feature information of the specific target itself. For example, in the sentence "the food at this restaurant is delicious but expensive, and the service is attentive", the "taste" aspect of the target "food" carries a positive emotion, the "price" aspect of the same target carries a negative emotion, and the "service" aspect of the target "restaurant" carries a positive emotion. Thus, even within a single sentence, the same target may exhibit completely opposite emotion polarities across aspects, and different targets may have different polarities. However, most neural-network-based text emotion classification models fail to attend correctly to the emotion of a specific aspect of a specific target, and their classification results suffer. Capturing the target's true semantics more faithfully, enriching the semantic information in the text and improving the accuracy of specific-target emotion classification are the main research directions of this method.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a specific target emotion classification method based on a deep neural network and a target attention mechanism, which is used for analyzing the emotion colors of specific targets and specific aspects contained in text data in a social network.
The invention can be realized by adopting the following system:
a specific target emotion classification method based on a deep neural network is characterized by comprising the following steps:
step one, acquiring a Chinese emotion classification data set and preprocessing a text, and dividing the emotion classification data set into a training set and a test set;
secondly, training a word vector model for the preprocessed data set by using a word2vec tool and mapping texts in the data set into a word vector set;
step three, inputting the word vector set of the training set into the LSTM, discarding or transferring information by using three gates in the LSTM with trainable parameters, and outputting a series of hidden vectors h = {h_1, h_2, …, h_n};
Step four, putting the word vector matrix of the training set, the word vector matrix of the specific target and the word vector matrix of the specific aspect into a target attention mechanism to obtain a positive weight p_i for each h_i, and subsequently obtaining a sentence representation Z_S;
Step five, according to the generated sentence representation Z_S, judging the emotion polarity of the specific target by using the fully connected layer and the softmax function.
Further, the text preprocessing specifically comprises:
carrying out Chinese word segmentation, stop-word removal and punctuation removal on the sentences annotated with emotion polarity; 80% of the data set is randomly selected as the training set and 20% as the test set.
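A minimal Python sketch of this preprocessing and split follows. It is an illustration only: the toy stop-word list is an assumption, the input is assumed to be pre-segmented, and a real pipeline would use a Chinese segmenter such as jieba.

```python
# Illustrative sketch of step one (not the patent's code): stop-word and
# punctuation removal, then a random 80%/20% train/test split.
import random
import string

STOPWORDS = {"的", "了", "是"}                     # toy stop-word list (assumption)
PUNCT = set(string.punctuation + "，。！？；：")    # ASCII + common Chinese punctuation

def preprocess(sentence_tokens):
    """Remove stop words and punctuation from an already-segmented sentence."""
    return [t for t in sentence_tokens if t not in STOPWORDS and t not in PUNCT]

def split_dataset(samples, train_ratio=0.8, seed=42):
    """Randomly split labelled samples into 80% training and 20% test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

# Ten copies of one labelled, pre-segmented sentence stand in for a data set.
data = [("这家 餐厅 的 菜 很 好吃 。".split(), "positive")] * 10
train, test = split_dataset(data)
```

With ten samples this yields eight training and two test samples, matching the 80/20 split described above.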
Further, the training a word vector model on the preprocessed data set using a word2vec tool comprises:
after the word2vec model training is completed, the word2vec model may be used to map each word ω to a continuous feature vector e_ω ∈ R^d, where d represents the dimension of the word vector, finally generating a word vector matrix E ∈ R^{V×d}, where V represents the size of the vocabulary in the data set.
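To make the mapping concrete, here is a minimal NumPy sketch. The trained word2vec weights are simulated with a random matrix, and the toy vocabulary and dimension are assumptions, not values from the patent.

```python
# Sketch of the word-to-vector lookup: each word ω maps to a row of the
# word vector matrix E ∈ R^{V×d}. Random weights stand in for word2vec output.
import numpy as np

vocab = ["餐厅", "菜", "好吃", "贵", "服务"]     # toy vocabulary (assumption)
V, d = len(vocab), 8                            # vocabulary size and vector dimension
word_to_idx = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(0)
E = rng.standard_normal((V, d))                 # word vector matrix E ∈ R^{V×d}

def embed(tokens):
    """Map each word ω to its continuous feature vector e_ω ∈ R^d."""
    return np.stack([E[word_to_idx[t]] for t in tokens])

X = embed(["菜", "好吃"])                        # word vector set, shape (2, d)
```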
Further, inputting the word vector set of the training set into the LSTM, discarding or transferring information by using three gates in the LSTM with trainable parameters, and outputting a series of hidden vectors h = {h_1, h_2, …, h_n} specifically comprises the following steps:
three gates in the LSTM include an input gate, a forgetting gate and an output gate. Let x_t be the input of a node of the LSTM neural network at time t, h_t the output at time t, W_x the weight applied to the input and W_h the weight applied to the output. The flow of updating the LSTM neural network model through gate-structured information control is divided into the following steps:
calculating the value i_t of the input gate at time t; the input gate controls the influence of the current input on the state value of the memory cell, computed as:

i_t = sigmoid(W_xi·x_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)    (1)

calculating the value f_t of the forgetting gate at time t; the forgetting gate controls the influence of historical information on the state value of the memory cell, computed as:

f_t = sigmoid(W_xf·x_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)    (2)

calculating the value of the candidate memory cell at the current time:

c̃_t = tanh(W_xc·x_t + W_hc·h_{t-1} + b_c)    (3)

and updating the value of the memory cell at the current moment, computed as:

c_t = f_t·c_{t-1} + i_t·c̃_t    (4)

finally calculating the output information h_t at time t; this information is determined by the output gate, computed as:

o_t = sigmoid(W_xo·x_t + W_ho·h_{t-1} + W_co·c_{t-1} + b_o)    (5)

h_t = o_t·tanh(c_t)    (6)
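As an illustration only, the gate equations above can be sketched in NumPy as a single peephole-style LSTM step. All weight matrices here are random stand-ins for trained parameters, and the dimension d = 4 is an arbitrary assumption.

```python
# Minimal NumPy sketch of one LSTM step with the peephole terms W_c*·c_{t-1}
# that appear in the gate formulas. Not the patent's implementation.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d = 4                                             # hidden/input size (assumption)
rng = np.random.default_rng(1)
W = {k: rng.standard_normal((d, d)) * 0.1 for k in
     ("xi", "hi", "ci", "xf", "hf", "cf", "xc", "hc", "xo", "ho", "co")}
b = {k: np.zeros(d) for k in ("i", "f", "c", "o")}

def lstm_step(x_t, h_prev, c_prev):
    """One time step following the input, forgetting and output gates."""
    i_t = sigmoid(W["xi"] @ x_t + W["hi"] @ h_prev + W["ci"] @ c_prev + b["i"])  # input gate
    f_t = sigmoid(W["xf"] @ x_t + W["hf"] @ h_prev + W["cf"] @ c_prev + b["f"])  # forgetting gate
    c_tilde = np.tanh(W["xc"] @ x_t + W["hc"] @ h_prev + b["c"])                 # candidate cell
    c_t = f_t * c_prev + i_t * c_tilde                                           # cell update
    o_t = sigmoid(W["xo"] @ x_t + W["ho"] @ h_prev + W["co"] @ c_prev + b["o"])  # output gate
    h_t = o_t * np.tanh(c_t)                                                     # hidden output
    return h_t, c_t

h, c = np.zeros(d), np.zeros(d)
for x in rng.standard_normal((3, d)):             # run three time steps
    h, c = lstm_step(x, h, c)
```

Running the step over a word vector sequence produces exactly the series of hidden vectors h_1, …, h_n that the following attention stage consumes.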
putting the word vector matrix of the training set, the word vector matrix of the specific target and the word vector matrix of the specific aspect into a target attention mechanism specifically comprises the following steps:
the word vector matrix of the training set and the word vector matrix of the specific target are averaged as follows:

c_S = Average(E_t ; E_s)    (7)

where Average returns the average of the input vectors, E_t is the word vector matrix of the specific target, E_s is the word vector matrix of the training set, and c_S serves to capture target information and context information simultaneously.
Computing the weight vector q_t over all k specific-aspect embeddings, as follows:

q_t = softmax(W_t·c_S + b_t)    (8)

where q_t represents the vector of weights over all k specific-aspect embeddings, each weight in q_t indicating the likelihood that the specific target belongs to the relevant aspect, and W_t and b_t represent the weight matrix and the bias vector respectively.
Computing the specific-target vector t_s, as follows:

t_s = T·q_t    (9)

where t_s represents the vector of the specific target and T represents the word vector matrix of the specific aspects, T ∈ R^{K×d}, where K represents the number of specific aspects.
Computing the positive weight p_i, as follows:

p_i = exp(d_i) / Σ_{j=1}^{n} exp(d_j)    (10)

where d_i = h_i^T·W_a·t_s and W_a ∈ R^{d×d} is a trainable weight matrix.
Computing the sentence representation Z_S, as follows:

Z_S = Σ_{i=1}^{n} p_i·h_i    (11)

Each hidden vector h_i corresponds to a positive weight p_i, where the value p_i is calculated by the target attention model; p_i can be interpreted as the probability that the model correctly focuses on the word ω_i when judging the emotion polarity of a specific target a. Z_S represents the sentence representation used for emotion classification.
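The target attention computation can be sketched in NumPy as follows. This is a hedged illustration: the original equation images are not fully recoverable, so the attention score is assumed to take the common bilinear form h_i^T·W_a·t_s, and all matrices are random placeholders for trained parameters.

```python
# Sketch of the target attention mechanism: the LSTM hidden vectors h_i are
# reweighted by a target vector t_s built as a weighted sum of K aspect
# embeddings. Dimensions and weights are illustrative assumptions.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(2)
n, d, K = 5, 8, 3                        # sentence length, dimension, aspect count
H = rng.standard_normal((n, d))          # hidden vectors h_1..h_n from the LSTM
E_target = rng.standard_normal((2, d))   # word vectors of the specific target
E_sent = rng.standard_normal((n, d))     # word vectors of the training sentence
T = rng.standard_normal((K, d))          # specific-aspect embedding matrix T ∈ R^{K×d}
W_t, b_t = rng.standard_normal((K, d)), np.zeros(K)
W_a = rng.standard_normal((d, d))        # trainable attention matrix W_a ∈ R^{d×d}

c_s = np.vstack([E_target, E_sent]).mean(axis=0)   # average of target and context vectors
q_t = softmax(W_t @ c_s + b_t)                     # aspect weights over the K embeddings
t_s = T.T @ q_t                                    # target vector t_s ∈ R^d (weighted sum)
scores = H @ W_a @ t_s                             # assumed bilinear score h_i^T W_a t_s
p = softmax(scores)                                # positive weights p_i (sum to 1)
Z_s = p @ H                                        # sentence representation Σ p_i h_i
```

Note that t_s is computed as T transposed times q_t so that the (K×d) aspect matrix and the K-dimensional weight vector yield a d-dimensional target vector.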
In summary, the invention provides a specific target emotion classification method based on a deep neural network, belonging to the field of text emotion classification within natural language processing. First, the data set undergoes Chinese word segmentation, stop-word removal and punctuation removal; the word2vec algorithm is then used to train word vectors on the processed corpus. The training set is next input into a long short-term memory network model built on a target attention mechanism, in which specific-target and specific-aspect embeddings participate in the attention-weight training and the specific target is represented as a weighted sum of the specific-aspect embeddings. The model thereby gives more correct attention to the specific target and the specific aspect, better captures the target's true semantics, and ultimately improves the accuracy of specific-target emotion classification.
Compared with the prior art, the invention has the following beneficial effects:
On the basis of a long short-term memory network, a target attention mechanism is introduced and the specific target is represented as a weighted sum of the specific-aspect embeddings. The model can thus give more correct attention to the specific target and the specific aspect while ignoring or weakening the influence of secondary information in the text, better capture the target's true semantics, and ultimately improve the accuracy of specific-target emotion classification.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of the emotion classification model training of the present invention;
FIG. 2 is a view of the internal structure of the LSTM of the present invention;
FIG. 3 is the overall architecture of the emotion classification model of the present invention.
Detailed Description
In order to make the technical scheme in the embodiments of the present invention better understood and make the above objects, features and advantages of the present invention more obvious and understandable, the present invention provides an embodiment of a method for classifying specific target emotions based on a deep neural network, and the technical scheme of the present invention is further described in detail below with reference to the accompanying drawings:
the invention firstly provides a specific target emotion classification method based on a deep neural network, which comprises the following steps of:
s101, step 1: acquiring a Chinese emotion classification data set and preprocessing a text, and dividing the data set into a training set and a test set;
s102, step 2: training a word vector model for the preprocessed data set by using a word2vec tool and mapping texts in the data set into a word vector set;
S103, step 3: the word vector set of the training set is input into the LSTM, information is discarded or passed using three gates in the LSTM with trainable parameters, and a series of hidden vectors h = {h_1, h_2, …, h_n} is output;
S104, step 4: the word vector matrix of the training set, the word vector matrix of the specific target and the word vector matrix of the specific aspect are put into a target attention mechanism to obtain a positive weight p_i for each h_i, and then the sentence representation Z_S is obtained;
S105, step 5: according to the generated sentence representation Z_S, the emotion polarity of the specific target is judged by using the fully connected layer and the softmax function.
The preprocessing in step 1 mainly comprises carrying out Chinese word segmentation, stop-word removal and punctuation removal on the sentences annotated with emotion polarity; 80% of the data set is randomly selected as the training set and 20% as the test set.
After the word2vec model training is completed in step 2, the word2vec model may be used to map each word ω to a continuous feature vector e_ω ∈ R^d, where d represents the dimension of the word vector, finally generating a word vector matrix E ∈ R^{V×d}, where V represents the size of the vocabulary in the data set.
The three gates in the LSTM in step 3, as shown in fig. 2, include an input gate, a forgetting gate and an output gate. Let x_t be the input of a node of the LSTM neural network at time t, h_t the output at time t, W_x the weight applied to the input and W_h the weight applied to the output. In the flow of updating the LSTM neural network model through gate-structured information control, the value i_t of the input gate at time t is first calculated; the input gate controls the influence of the current input on the state value of the memory cell, computed as:

i_t = sigmoid(W_xi·x_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)    (1)

Then the value f_t of the forgetting gate at time t is calculated; the forgetting gate controls the influence of historical information on the state value of the memory cell, computed as:

f_t = sigmoid(W_xf·x_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)    (2)

The value of the candidate memory cell at the current time is then calculated:

c̃_t = tanh(W_xc·x_t + W_hc·h_{t-1} + b_c)    (3)

and the value of the memory cell at the current moment is updated, computed as:

c_t = f_t·c_{t-1} + i_t·c̃_t    (4)

Finally the output information h_t at time t is calculated; this information is determined by the output gate, computed as:

o_t = sigmoid(W_xo·x_t + W_ho·h_{t-1} + W_co·c_{t-1} + b_o)    (5)

h_t = o_t·tanh(c_t)    (6)
In step 4, the word vector matrix of the training set and the word vector matrix of the specific target are first averaged, as shown in the following formula:

c_S = Average(E_t ; E_s)    (7)

where Average returns the average of the input vectors, E_t is the word vector matrix of the specific target, E_s is the word vector matrix of the training set, and c_S serves to capture target information and context information simultaneously.
Next, c_S is put into the softmax function to calculate the weight vector q_t over all k specific-aspect embeddings, as follows:

q_t = softmax(W_t·c_S + b_t)    (8)

where q_t represents the vector of weights over all k specific-aspect embeddings, each weight in q_t indicating the likelihood that the specific target belongs to the relevant aspect, and W_t and b_t represent the weight matrix and the bias vector respectively.
q_t is then multiplied with the word vector matrix of the specific aspects to calculate the vector t_s of the specific target, as follows:

t_s = T·q_t    (9)

where t_s represents the vector of the specific target and T represents the word vector matrix of the specific aspects, T ∈ R^{K×d}, where K represents the number of specific aspects, much smaller than V.
Next, the positive weight p_i is calculated, as follows:

p_i = exp(d_i) / Σ_{j=1}^{n} exp(d_j)    (10)

where d_i = h_i^T·W_a·t_s and W_a ∈ R^{d×d} is a trainable weight matrix.
Then the sentence representation Z_S is calculated, as follows:

Z_S = Σ_{i=1}^{n} p_i·h_i    (11)

Each hidden vector h_i corresponds to a positive weight p_i, where p_i is calculated by the target attention model and can be interpreted as the probability that the model correctly focuses on the word ω_i when judging the emotion polarity of a specific target a. Z_S represents the sentence representation used for emotion classification.
Finally, according to the generated sentence representation Z_S, the emotion polarity of the specific target is judged using the fully connected layer and the softmax function; the specific calculation flow is shown in fig. 3.
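As a sketch of this final step, a fully connected layer followed by softmax maps Z_S to emotion-polarity probabilities. The weights and the three-class label set here are illustrative assumptions, not the patent's trained model.

```python
# Sketch of the classification head: Z_S -> fully connected layer -> softmax.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

d, n_classes = 8, 3                        # e.g. positive / neutral / negative (assumption)
rng = np.random.default_rng(3)
W_fc = rng.standard_normal((n_classes, d)) # fully connected layer weights
b_fc = np.zeros(n_classes)                 # fully connected layer bias

Z_s = rng.standard_normal(d)               # sentence representation from step 4 (placeholder)
probs = softmax(W_fc @ Z_s + b_fc)         # class probabilities, sum to 1
polarity = int(np.argmax(probs))           # predicted emotion-polarity index
```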
The above examples are intended to illustrate but not to limit the technical solutions of the present invention. Any modification or partial replacement without departing from the spirit and scope of the present invention should be covered in the claims of the present invention.

Claims (4)

1. A specific target emotion classification method based on a deep neural network is characterized by comprising the following steps:
step one, acquiring a Chinese emotion classification data set and preprocessing a text, and dividing the emotion classification data set into a training set and a test set;
secondly, training a word vector model for the preprocessed data set by using a word2vec tool and mapping texts in the data set into a word vector set;
step three, inputting the word vector set of the training set into the LSTM, discarding or transferring information by using three gates in the LSTM with trainable parameters, and outputting a series of hidden vectors h = {h_1, h_2, …, h_n};
Step four, putting the word vector matrix of the training set, the word vector matrix of the specific target and the word vector matrix of the specific aspect into a target attention mechanism to obtain a positive weight p_i for each h_i, and subsequently obtaining a sentence representation Z_S;
Step five, according to the generated sentence representation Z_S, judging the emotion polarity of the specific target by using the fully connected layer and the softmax function;
putting the word vector matrix of the training set, the word vector matrix of the specific target and the word vector matrix of the specific aspect into a target attention mechanism specifically comprises the following steps:
(5.1) calculating the average of the word vector matrix of the training set and the word vector matrix of the specific target as follows:

c_S = Average(E_t ; E_s)    (7)

where Average returns the average of the input vectors, E_t is the word vector matrix of the specific target, E_s is the word vector matrix of the training set, and c_S is used for capturing target information and context information simultaneously;
(5.2) computing the weight vector q_t over all k specific-aspect embeddings, as follows:

q_t = softmax(W_t·c_S + b_t)    (8)

where q_t represents the vector of weights over all k specific-aspect embeddings, each weight in q_t indicating the likelihood that the specific target belongs to the relevant aspect, and W_t and b_t respectively represent the weight matrix and the bias vector;
(5.3) calculating the specific-target vector t_s, as follows:

t_s = T·q_t    (9)

where t_s is the specific-target vector and T represents the aspect-specific word vector matrix, T ∈ R^{K×d}, where K represents the number of specific aspects;
(5.4) calculating the positive weight p_i, as follows:

p_i = exp(d_i) / Σ_{j=1}^{n} exp(d_j)    (10)

where d_i = h_i^T·W_a·t_s and W_a ∈ R^{d×d} is a trainable weight matrix;
(5.5) calculating the sentence representation Z_S, as follows:

Z_S = Σ_{i=1}^{n} p_i·h_i    (11)

each hidden vector h_i corresponds to a positive weight p_i, where p_i is calculated by the target attention model and is interpreted as the probability that the model correctly focuses on the relevant word when judging the emotion polarity of a specific target a; Z_S represents the sentence used for emotion classification;
in the process of realizing attention weight training, the specific target and the specific aspect are embedded, and the specific target is represented by the weighted sum of the embedding of the specific aspect, so that the emotion classification accuracy of the specific target is finally improved.
2. The method for classifying specific target emotion based on deep neural network as claimed in claim 1, wherein: the text preprocessing specifically comprises: carrying out Chinese word segmentation, stop-word removal and punctuation removal on the sentences annotated with emotion polarity; 80% of the data set is randomly selected as the training set and 20% as the test set.
3. The method for classifying specific target emotion based on deep neural network as claimed in claim 1, wherein: training a word vector model using a word2vec tool on the preprocessed data set comprises:
after the word2vec model training is completed, the word2vec model is used to map each word ω to a continuous feature vector e_ω ∈ R^d, where d represents the dimension of the word vector, finally generating a word vector matrix E ∈ R^{V×d}, where V represents the size of the vocabulary in the data set.
4. The method for classifying specific target emotion based on deep neural network as claimed in claim 1, wherein: inputting the word vector set of the training set into the LSTM, discarding or transferring information by using three gates in the LSTM with trainable parameters, and outputting a series of hidden vectors h = {h_1, h_2, …, h_n} specifically comprises:
the three gates in the LSTM include an input gate, a forgetting gate and an output gate; let x_t be the input of a node of the LSTM neural network at time t, h_t the output at time t, W_x the weight applied to the input and W_h the weight applied to the output; the flow of updating the LSTM neural network model through gate-structured information control is divided into four steps:
(4.1) calculating the value i_t of the input gate at time t; the input gate controls the influence of the current input on the state value of the memory cell, computed as:

i_t = sigmoid(W_xi·x_t + W_hi·h_{t-1} + W_ci·c_{t-1} + b_i)    (1)

(4.2) calculating the value f_t of the forgetting gate at time t; the forgetting gate controls the influence of historical information on the state value of the memory cell, computed as:

f_t = sigmoid(W_xf·x_t + W_hf·h_{t-1} + W_cf·c_{t-1} + b_f)    (2)

(4.3) calculating the value of the candidate memory cell at the current time:

c̃_t = tanh(W_xc·x_t + W_hc·h_{t-1} + b_c)    (3)

and updating the value of the memory cell at the current moment, computed as:

c_t = f_t·c_{t-1} + i_t·c̃_t    (4)

(4.4) finally calculating the output information h_t at time t; this information is determined by the output gate, computed as:

o_t = sigmoid(W_xo·x_t + W_ho·h_{t-1} + W_co·c_{t-1} + b_o)    (5)

h_t = o_t·tanh(c_t)    (6).
CN201910249992.4A 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network Expired - Fee Related CN109992780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910249992.4A CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910249992.4A CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Publications (2)

Publication Number Publication Date
CN109992780A CN109992780A (en) 2019-07-09
CN109992780B true CN109992780B (en) 2022-07-01

Family

ID=67131875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910249992.4A Expired - Fee Related CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Country Status (1)

Country Link
CN (1) CN109992780B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390017B (en) * 2019-07-25 2022-12-27 中国民航大学 Target emotion analysis method and system based on attention gating convolutional network
CN110728298A (en) * 2019-09-05 2020-01-24 北京三快在线科技有限公司 Multi-task classification model training method, multi-task classification method and device
CN110517121A (en) * 2019-09-23 2019-11-29 重庆邮电大学 Method of Commodity Recommendation and the device for recommending the commodity based on comment text sentiment analysis
CN110704622A (en) * 2019-09-27 2020-01-17 北京明略软件系统有限公司 Text emotion classification method and device and electronic equipment
CN111191026A (en) * 2019-12-10 2020-05-22 央视国际网络无锡有限公司 Text classification method capable of calibrating specific segments
CN111291189B (en) * 2020-03-10 2020-12-04 北京芯盾时代科技有限公司 Text processing method and device and computer readable storage medium
CN111444728A (en) * 2020-04-20 2020-07-24 复旦大学 End-to-end aspect-based emotion analysis method
CN112115243B * 2020-08-11 2023-06-16 南京理工大学 Session representation learning method based on modeling of temporal correlation
CN112434161B (en) * 2020-11-24 2023-01-03 哈尔滨工程大学 Aspect-level emotion analysis method adopting bidirectional long-short term memory network
CN112464281B (en) * 2020-11-29 2022-11-18 深圳市索迪统计科技有限公司 Network information analysis method based on privacy grouping and emotion recognition
CN112699237B (en) * 2020-12-24 2021-10-15 百度在线网络技术(北京)有限公司 Label determination method, device and storage medium
CN114357166A (en) * 2021-12-31 2022-04-15 北京工业大学 Text classification method based on deep learning

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107491490A * 2017-07-19 2017-12-19 华东师范大学 Text sentiment classification method based on emotion center
CN108363753A * 2018-01-30 2018-08-03 南京邮电大学 Comment text sentiment classification model training and sentiment classification method, device and equipment
CN108460009A * 2017-12-14 2018-08-28 中山大学 Attention mechanism recurrent neural network text sentiment analysis method with embedded sentiment dictionary
CN108717439A * 2018-05-16 2018-10-30 哈尔滨理工大学 Chinese text classification method based on fusion of attention mechanism and feature enhancement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11205103B2 (en) * 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Attention-based LSTM for Aspect-level Sentiment Classification; Yequan Wang et al.; Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing; 2016-11-05; 606-615 *
Improving sentiment analysis via sentence type classification using BiLSTM-CRF and CNN; Tao Chen et al.; Expert Systems With Applications; 2017-04-15; 221-230 *
Specific target sentiment analysis based on multi-attention convolutional neural network; Liang Bin et al.; Journal of Computer Research and Development; 2017-08-15 (No. 08); 1724-1735 *
Chinese fine-grained sentiment analysis based on an improved attention mechanism; Wang Zhendong; China Master's Theses Full-text Database (Information Science & Technology); 2021-02-15; I138-2800 *

Also Published As

Publication number Publication date
CN109992780A (en) 2019-07-09

Similar Documents

Publication Publication Date Title
CN109992780B (en) Specific target emotion classification method based on deep neural network
CN110502749B (en) Text relation extraction method based on double-layer attention mechanism and bidirectional GRU
CN108628823B (en) Named entity recognition method combining attention mechanism and multi-task collaborative training
Yao et al. An improved LSTM structure for natural language processing
CN107273355B (en) Chinese word vector generation method based on word and phrase joint training
Cui et al. Consensus attention-based neural networks for Chinese reading comprehension
WO2022007823A1 (en) Text data processing method and device
CN110609891A (en) Visual dialog generation method based on context awareness graph neural network
Ghorbanali et al. Ensemble transfer learning-based multimodal sentiment analysis using weighted convolutional neural networks
CN109308353B (en) Training method and device for word embedding model
CN111401061A (en) Method for identifying news opinion involved in case based on BERT and BiLSTM-Attention
CN109214006B (en) Natural language reasoning method for image enhanced hierarchical semantic representation
CN109271636B (en) Training method and device for word embedding model
CN112784041B (en) Chinese short text sentiment orientation analysis method
Guo et al. Who is answering whom? Finding “Reply-To” relations in group chats with deep bidirectional LSTM networks
CN113934835B (en) Retrieval type reply dialogue method and system combining keywords and semantic understanding representation
CN111046157B (en) Universal English man-machine conversation generation method and system based on balanced distribution
CN112560440A (en) Deep learning-based syntax dependence method for aspect-level emotion analysis
Ermatita et al. Sentiment Analysis of COVID-19 using Multimodal Fusion Neural Networks.
CN112349294A (en) Voice processing method and device, computer readable medium and electronic equipment
WO2023116572A1 (en) Word or sentence generation method and related device
CN111914084A (en) Deep learning-based emotion label text generation and evaluation system
CN115510230A (en) Mongolian emotion analysis method based on multi-dimensional feature fusion and comparative reinforcement learning mechanism
CN114519353A (en) Model training method, emotion message generation device, emotion message generation equipment and emotion message generation medium
Windiatmoko et al. Mi-Botway: A deep learning-based intelligent university enquiries chatbot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220701