CN114239565A - Deep learning-based emotion reason identification method and system - Google Patents

Deep learning-based emotion reason identification method and system

Info

Publication number
CN114239565A
CN114239565A
Authority
CN
China
Prior art keywords
statement
emotion
text
reason
bert
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111445330.8A
Other languages
Chinese (zh)
Inventor
何婷婷
范瑞
王逾凡
章哲铭
洪婕
戴汝峰
阿布都乃比江·库尔班
Current Assignee
Central China Normal University
Original Assignee
Central China Normal University
Priority date
Filing date
Publication date
Application filed by Central China Normal University filed Critical Central China Normal University
Priority to CN202111445330.8A priority Critical patent/CN114239565A/en
Publication of CN114239565A publication Critical patent/CN114239565A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/284Lexical analysis, e.g. tokenisation or collocates
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Machine Translation (AREA)

Abstract

The invention provides a method and a system for recognizing emotion reasons based on deep learning. First, the emotion sentence d_e in a piece of text D is combined with any sentence in the text to form new text data D_cp. The new text data D_cp is then used to train an emotion reason recognition model, PECR-BERT, which comprises a Transformer-based bidirectional encoder representation (BERT) model and two fully connected neural network layers. Finally, the trained emotion reason recognition model PECR-BERT identifies the reason sentence d_c corresponding to the emotion sentence d_e in the text. The invention can effectively identify the reason sentences corresponding to the emotion sentences in a text; this information can provide decision references for governments, help prevent the spread of negative public sentiment, and enable more personalized services that improve the user experience of products, and thus has great value and research significance.

Description

Deep learning-based emotion reason identification method and system
Technical Field
The invention belongs to the technical field of text emotion analysis, and particularly relates to an emotion reason identification method and system based on deep learning.
Background
Text sentiment analysis processes and analyzes emotionally colored text data using technologies such as data mining and natural language processing, with the aim of extracting the sentiment information contained in the text. According to emotional tendency, text is generally classified into three categories: positive, negative, or neutral; at finer granularity it can be classified into specific emotions such as disgust, sadness, or anger. The emotional information in text can not only influence the choices of others but also help organizations understand how much users like a product, improving the product's competitiveness.
As networks and their user bases grow, text information on the Internet becomes richer and more complex. When processing a piece of emotional data, it is important not only to analyze the subjective emotion it contains but also to understand the reason that emotion was generated. Emotion reason identification technology can accurately capture this information, which provides effective help in many application scenarios, such as providing decision references for governments, preventing the spread of negative public sentiment, or offering users more personalized services that improve the user experience of products.
Emotion reason identification finds, in a text, the reason sentence corresponding to a given emotion sentence. Traditional solutions are mainly rule based: the reason sentence is identified through manually constructed linguistic rules over, for example, part of speech or grammar. Rule-based methods work well in specific application environments, but building the rules consumes a great deal of manpower and material resources. As the application environment expands, rules must be extended manually to meet new requirements, which makes rule consistency hard to maintain, increases the difficulty of program maintenance, and can even produce conflicting rules that paralyze the whole system. Moreover, rules formulated for one scenario are difficult to apply to another: different application scenarios have different forms and characteristics of language expression, so different rules must be constructed for each, and rule construction requires highly specialized background knowledge of the relevant fields. Rule-based methods are therefore inflexible, have low generality, and are difficult to apply widely. With the rise of deep learning algorithms, emotion reason identification based on deep learning has clear advantages over traditional methods when facing large amounts of data: no complex and varied manual rules are needed; given enough data, the model can learn the relationship between emotion sentences and reason sentences and automatically identify the reason sentences corresponding to the emotions in a text.
In conclusion, emotion reason identification is an important part of the sentiment analysis field and has great significance for applications in many scenarios involving society, enterprises, and individuals. Based on deep learning technology and the most advanced pre-trained models in current natural language processing, the invention provides an emotion reason identification method based on deep learning.
Disclosure of Invention
The invention aims to improve the accuracy of recognizing the reason sentences corresponding to the emotion sentences in a text by utilizing deep learning technology.
The technical scheme of the invention provides an emotion reason identification method based on deep learning. First, the emotion sentence d_e in a piece of text D is combined with any sentence in the text to form new text data D_cp. Then the new text data D_cp is used to train an emotion reason recognition model, PECR-BERT, wherein the emotion reason recognition model PECR-BERT comprises a Transformer-based bidirectional encoder representation (BERT) model and two fully connected neural network layers. Finally, the trained emotion reason recognition model PECR-BERT identifies the reason sentence d_c corresponding to the emotion sentence d_e in the text.
Furthermore, combining the emotion sentence d_e in a piece of text D with any sentence in the text to form the new text D_cp includes performing the following processing for each piece of text data.

A piece of text data is D = {d_1, d_2, …, d_i, …, d_n}, where d_i denotes the i-th sentence in the text, i = 1, 2, …, n, and the emotion sentence d_e is annotated in the text in advance. Any sentence d_j within a certain distance of the emotion sentence d_e is selected as a candidate reason sentence for d_e, where |j − e| ≤ r, 0 ≤ r ≤ n − 1, e and j denote the position indexes of sentences in the text, and r denotes the maximum relative distance between the emotion sentence d_e and the sentence d_j. By selecting each d_j within this distance before and after d_e, the text D_cp = {d_e, d_j, d_1, d_2, …, d_i, …, d_n} is constructed, with e − r ≤ j ≤ e + r and 1 ≤ j ≤ n.
Moreover, using the text D_cp to train the emotion reason recognition model PECR-BERT is realized as follows.

1) The text D_cp is processed to obtain an input text I = [[CLS], d_e, [SEP], [CLS], d_j, [SEP], d_1, d_2, …, d_i, …, d_n], where [CLS] is a result-judgment identifier: the first [CLS] is used to judge whether the sentence d_j is the reason sentence corresponding to the emotion sentence d_e, and the second [CLS] is used to judge whether the sentence d_j is a reason sentence. [SEP] is a segment-separation marker that separates the emotion sentence d_e, the sentence to be judged d_j, and the whole text passage.

The input text I is tokenized using the vocabulary provided by the BERT model; the sequence length after tokenization is (M + 2). Word embedding then yields the corresponding word vector sequence X = {x_cls1, x_cls2, x_1, x_2, …, x_M}, where x_cls1 and x_cls2 denote the word vectors corresponding to the two [CLS] identifiers in the input text I and x_m denotes the word vector of the m-th token, m = 1, 2, …, M.

The sequence X is input into the BERT model, and the feature representation of each token is obtained through computation: H = {h_cls1, h_cls2, h_1, h_2, …, h_M}, where h_cls1 and h_cls2 denote the outputs corresponding to the two [CLS] identifiers and h_m denotes the output corresponding to x_m.

2) The output h_cls2 corresponding to the second [CLS] identifier is used to predict whether d_j is a reason sentence, as follows:

y_c = sigmoid(W_c · h_cls2)

where W_c is a matrix of trainable parameters of a fully connected neural network layer and sigmoid is the activation function, which maps a value into (0, 1). The fully connected network layer with sigmoid activation computes the score y_c of the sentence d_j being a reason sentence; if y_c > 0.5, the sentence d_j is a reason sentence.

3) The relative distance s_{e,j} between the emotion sentence d_e and the sentence d_j is obtained, and word embedding maps the scalar relative distance s_{e,j} to a feature vector p_{e,j}. The output h_cls1 corresponding to the first [CLS] identifier is concatenated with the relative-distance feature vector p_{e,j}, and the score of the sentence d_j being the reason sentence corresponding to the emotion sentence d_e is computed as follows:

y_cp = sigmoid(W_cp [h_cls1, p_{e,j}])

where W_cp is a matrix of trainable parameters of a fully connected neural network layer and sigmoid is the activation function. If y_cp > 0.5, the sentence d_j is the reason sentence corresponding to the emotion sentence d_e.
Furthermore, the trained emotion reason recognition model PECR-BERT is used to directly obtain the reason sentence d_c corresponding to the emotion sentence d_e in a text D' to be analyzed.
The method can be used to support the optimization of human-machine dialogue, analysis of trending public opinion, or product-quality analysis and service optimization.
In another aspect, the invention also provides an emotion reason identification system based on deep learning, which is used for implementing the above emotion reason identification method based on deep learning.
Furthermore, the system comprises a processor and a memory, wherein the memory is used for storing program instructions, and the processor is used for calling the stored instructions in the memory to execute the emotion reason identification method based on deep learning.
Alternatively, the system includes a readable storage medium on which a computer program is stored; when the computer program is executed, it implements the emotion reason identification method based on deep learning as described above.
The invention provides a text emotion reason recognition method based on deep learning: first, training corpora are constructed by combining the emotion sentences in a piece of text with any sentences in the text; then the emotion reason recognition model PECR-BERT is trained to recognize whether a selected sentence is the reason sentence corresponding to an emotion sentence; finally, the trained emotion reason recognition model PECR-BERT identifies the reason sentences corresponding to the emotion sentences in the text. By combining the emotion sentences with any sentences in the text, the proposed emotion reason identification method effectively enlarges the scale of the training data and avoids the problem of a deep learning model being insufficiently optimized because of data sparseness. In addition, when computing the score of a sentence being the reason sentence corresponding to an emotion sentence, the proposed deep learning model PECR-BERT integrates the relative-distance information between the two sentences, which effectively captures the distance relationship between emotion sentences and reason sentences and improves identification accuracy. Compared with several internationally advanced emotion reason recognition models on a Chinese emotion reason recognition dataset, the results show that the proposed method performs better on every metric and achieves a remarkable improvement in recognition accuracy. The method can effectively identify the reason sentences corresponding to the emotion sentences in a text; this information can provide decision references for governments, help prevent the spread of negative public sentiment, enable more personalized services for users, and improve the user experience of products, and thus has great market value and research significance.
Drawings
Fig. 1 is a flowchart of an emotion reason identification method in an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is specifically described below with reference to the accompanying drawings and examples.
At present, with the emergence of pre-trained models in deep learning, more and more research attempts to introduce pre-trained models to solve various problems, achieving good results on many tasks. In emotion reason identification, several studies have introduced the pre-trained BERT model into the task. The main idea is to use the BERT model as a sentence encoder to obtain a vector representation of each sentence, since the self-attention mechanism in the BERT model can efficiently exchange information between different sentences. In addition, as a bidirectional language model, BERT better captures the semantic information of preceding and following words and provides dynamic word vector representations. For these two reasons, the clause representations obtained from BERT can serve the emotion reason identification task well. However, these methods rely heavily on the subsequent design of the model to solve the pairing problem between different sentences. Moreover, for a large-scale pre-trained model, a larger model means more parameters to learn, and data scarcity may leave the model insufficiently trained.
To address these problems, the invention proposes a new emotion reason identification method based on the pre-trained model BERT: the emotion reason identification task is treated as relation prediction between sentence pairs, and the BERT model learns whether an emotion-reason relation exists between two sentences. In addition, by imitating the training form of the next-sentence-prediction task in BERT pre-training, a new corpus dataset is reconstructed from the original corpus, effectively enlarging the training data. Compared with other methods using BERT, the proposed PECR-BERT model is simpler and needs no additional pairing-module design. Experiments show that the proposed PECR-BERT model achieves better results on a Chinese emotion reason identification dataset.
The emotion reason identification method based on deep learning provided by the embodiment of the invention proceeds as follows. First, the emotion sentence d_e in a piece of text D is combined with any sentence in the text to form new text data D_cp. Then the new text data D_cp is used to train an emotion reason recognition model PECR-BERT, wherein the emotion reason recognition model PECR-BERT comprises a Transformer-based bidirectional encoder representation (BERT) model and two fully connected neural network layers. Finally, the trained emotion reason recognition model PECR-BERT identifies the reason sentence d_c corresponding to the emotion sentence d_e in the text.
BERT is an abbreviation of Bidirectional Encoder Representations from Transformers. For a concrete implementation, BERT is described in the prior art: Devlin J, Chang M W, Lee K, Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding [C]// Proceedings of NAACL-HLT. 2019: 4171-4186. The invention does not repeat it here.
Referring to fig. 1, the embodiment provides an emotion reason identification method based on deep learning, which specifically includes the following implementation processes:
step 1, in a section of text DEmotional statement deThe text D is combined with any sentence in the text to form a new text DcpThe method comprises the following steps of:
one piece of text data D ═ D1,d2,…,di,…,dn},diThe i-th sentence in the text is represented, i is 1,2, …, n is the number of sentences in the text data D; pre-marked emotional sentences d in texteIn emotional statement deSelecting any sentence d within a certain distancejAs deCorresponding reason sentences to be judged, wherein | j-e | is less than or equal to r, r is more than or equal to 0 and less than or equal to n-1, e and j represent position serial numbers of sentences in the text, and r represents emotion sentences deAnd statement djBy selecting d within a certain distancejThe method of (2) constructs to obtain a text Dcp={de,dj,d1,d2,…,di,…,dnJ is more than or equal to r-e and less than or equal to r + e, and j is more than or equal to 1 and less than or equal to n. The embodiment proposes to choose r 10, i.e. d, which is associated with the emotional sentenceeSentences within a distance of 10 are all selected as deAnd the corresponding reason statement to be judged. In specific implementation, the user can preset the value of the maximum distance r.
In the whole training data construction process, a plurality of pieces of training data can be constructed for each section of text, so that the size of the data is effectively expanded. In general, larger data sets are helpful for deep learning models, but not all sentences are selected in the actual construction process. But the selected sentence to be judged is limited by the maximum relative distance r between sentences. Because the expression of emotion and the reason in the text are close in distance, picking data in such a way covers most of data with high scores, and many data with low scores are removed, so that the data volume is effectively reduced, and the model can learn the relationship between emotion and reason more effectively. The embodiment preferably proposes to choose r 10, i.e. the term d associated with the emotioneSentence d within distance 10jAre all selected as deAnd corresponding reason sentences to be judged. In particular, the user can presetThe value of the maximum distance r.
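The candidate-selection rule of step 1 can be sketched in a few lines of Python. This is an illustrative sketch only, not the patent's code: the function name build_pairs and the tuple layout of each sample are assumptions.

```python
def build_pairs(clauses, e, r=10):
    """clauses: list of clause strings; e: 0-based index of the annotated
    emotion clause d_e; r: maximum relative distance (r = 10 in the embodiment)."""
    pairs = []
    # select every clause d_j with |j - e| <= r, clipped to the text boundaries
    for j in range(max(0, e - r), min(len(clauses), e + r + 1)):
        # each training sample pairs (d_e, candidate d_j, whole text, distance j - e)
        pairs.append((clauses[e], clauses[j], clauses, j - e))
    return pairs

text = ["c0", "c1", "c2", "c3", "c4"]
samples = build_pairs(text, e=2, r=1)  # only c1, c2, c3 lie within distance 1 of c2
```

One piece of text thus yields up to 2r + 1 training samples, which is how the construction enlarges the dataset.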
Step 2: based on the training text D_cp obtained in step 1, process the text D_cp to obtain an input text I = [[CLS], d_e, [SEP], [CLS], d_j, [SEP], d_1, d_2, …, d_i, …, d_n], where [CLS] is a result-judgment identifier: the first [CLS] is used to judge whether the sentence d_j is the reason sentence corresponding to the emotion sentence d_e, and the second [CLS] is used to judge whether the sentence d_j is a reason sentence. [SEP] is a segment-separation marker separating the emotion sentence d_e, the sentence to be judged d_j, and the whole text passage. That is, in the input text I there is one [SEP] after the sentence d_e ends, and that [SEP] is followed by a [CLS].
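The layout of the input text I can be illustrated with a small helper. This is a sketch under stated assumptions: the name build_input is hypothetical, and real BERT tokenization would further split each sentence into word pieces.

```python
def build_input(d_e, d_j, clauses):
    # layout from step 2: the first [CLS] will score the (d_e, d_j) pair,
    # the second [CLS] will score d_j alone; [SEP] separates the three segments
    return ["[CLS]", d_e, "[SEP]", "[CLS]", d_j, "[SEP]"] + list(clauses)

seq = build_input("he was happy", "she praised him", ["c1", "c2"])
```

Note that, unlike standard BERT sentence-pair input, the sequence carries two [CLS] identifiers, one per prediction head.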
Step 3: tokenize the input text I using the vocabulary provided by the BERT model; the sequence length after tokenization is (M + 2). Word embedding then yields the corresponding word vector sequence X = {x_cls1, x_cls2, x_1, x_2, …, x_M}, where x_cls1 and x_cls2 denote the word vectors corresponding to the two [CLS] identifiers in the input text I and x_m denotes the word vector of the m-th token, m = 1, 2, …, M.
The word vector sequence X is composed of three parts: a character vector (Token Embedding), a segment vector (Segment Embedding), and a position vector (Position Embedding). The Token Embedding is the vector corresponding to each character. The Segment Embedding uses the vectors corresponding to 0 and 1 to indicate that the input text belongs to different segments, which are delimited by the [SEP] identifiers so that each segment can be distinguished. The Position Embedding uses a position vector to represent the position of each input character in the text, which better ensures that the position information in the text is not lost. Let t_m, s_m, and p_m denote the character vector, segment vector, and position vector of the m-th token; the final vector representation of the m-th token is the sum of these three parts, i.e., x_m = t_m + s_m + p_m. The vectors x_cls1 and x_cls2 are likewise obtained by adding their corresponding three parts.
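The sum x_m = t_m + s_m + p_m can be illustrated with toy lookup tables. Everything here is an assumption for illustration: the table sizes, the random initialization, and the tiny dimension (a real BERT-base uses learned tables of dimension 768).

```python
import random

def embed(token_ids, segment_ids, dim=4, vocab=100, n_seg=2, max_len=32, seed=0):
    rng = random.Random(seed)
    table = lambda rows: [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(rows)]
    tok, seg, pos = table(vocab), table(n_seg), table(max_len)  # toy embedding tables
    # x_m = t_m + s_m + p_m for every position m of the tokenized input
    return [[t + s + p for t, s, p in zip(tok[tid], seg[sid], pos[m])]
            for m, (tid, sid) in enumerate(zip(token_ids, segment_ids))]

X = embed([5, 7, 9], [0, 0, 1])  # three tokens, the last in a different segment
```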
Step 4: input the vector sequence X into the BERT model, which computes the feature representation of each token: H = {h_cls1, h_cls2, h_1, h_2, …, h_M}, where h_cls1 and h_cls2 denote the outputs corresponding to the two [CLS] identifiers and h_m denotes the output corresponding to x_m.
Step 5: use the output h_cls2 corresponding to the second [CLS] identifier to predict whether d_j is a reason sentence. The specific process is as follows:

y_c = sigmoid(W_c · h_cls2)

where W_c is a matrix of trainable parameters of a fully connected neural network layer and sigmoid is the activation function, which maps a value into (0, 1). The fully connected neural network layer with sigmoid activation computes the score y_c of the sentence d_j being a reason sentence; if y_c > 0.5, the sentence d_j is a reason sentence.
A fully connected neural network generally comprises an input layer, an output layer, and hidden layers; each node in a hidden layer receives the input signals of all nodes of the previous layer, which are combined by weight parameters and passed through a nonlinear activation function before being propagated. For an implementation of fully connected neural network layers, see: Svozil D, Kvasnicka V, Pospichal J. Introduction to multi-layer feed-forward neural networks [J]. Chemometrics and Intelligent Laboratory Systems, 1997, 39(1): 43-62.

In this step, the input of the fully connected neural network is h_cls2, obtained from BERT, and its output is y_c.
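A minimal sketch of this classification head, assuming for illustration a single linear layer with a toy weight vector (the real W_c is a learned parameter matrix):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cause_score(h_cls2, W_c):
    # y_c = sigmoid(W_c . h_cls2); d_j is predicted to be a reason sentence if y_c > 0.5
    return sigmoid(sum(w * h for w, h in zip(W_c, h_cls2)))

h = [0.2, -0.1, 0.4]     # toy stand-in for the BERT output h_cls2
W = [1.0, 0.5, -0.25]    # toy stand-in for the trainable parameters W_c
y_c = cause_score(h, W)
```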
Step 6: because a reason sentence usually appears within a short distance of the emotion sentence in the text, the relative-distance information between sentences is a strong reference for judging reason sentences. Although the BERT model has position vectors, they are based on character positions and represent no distance information between sentences. Therefore the relative distance s_{e,j} between the emotion sentence d_e and the sentence d_j is obtained, and a word embedding technique maps the scalar relative distance s_{e,j} to a feature vector p_{e,j}. The output h_cls1 of the first [CLS] identifier is concatenated with the relative-distance feature vector p_{e,j}, and the score of the sentence d_j being the reason sentence corresponding to the emotion sentence d_e is then computed. The specific process is as follows:

y_cp = sigmoid(W_cp [h_cls1, p_{e,j}])

where W_cp is a matrix of trainable parameters of the fully connected neural network layer and sigmoid is the activation function. The fully connected neural network layer with sigmoid activation computes the score y_cp of the sentence d_j being the reason sentence corresponding to the emotion sentence d_e; if y_cp > 0.5, the sentence d_j is the reason sentence corresponding to the emotion sentence d_e.
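The distance-augmented head can be sketched the same way: concatenate h_cls1 with the distance embedding, then apply one sigmoid layer. All vectors below are toy stand-ins chosen for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def pair_score(h_cls1, p_ej, W_cp):
    # y_cp = sigmoid(W_cp [h_cls1, p_{e,j}]): the first [CLS] output is
    # concatenated with the relative-distance embedding before the linear layer
    v = list(h_cls1) + list(p_ej)
    return sigmoid(sum(w * x for w, x in zip(W_cp, v)))

h1 = [0.3, -0.2]           # toy stand-in for the BERT output h_cls1
p = [0.1, 0.0]             # toy embedding of the relative distance s_{e,j}
W = [1.0, 1.0, 1.0, 1.0]   # toy stand-in for the trainable parameters W_cp
y_cp = pair_score(h1, p, W)
```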
Step 7: construct training data for each piece of text according to step 1, then repeat steps 2-6 to obtain the two scores y_c and y_cp of the sentence to be judged. The PECR-BERT model is trained by comparing them with the true scores (in each piece of data, the score of the reason sentence corresponding to the emotion sentence is 1 and is otherwise 0) and continuously iterating to optimize the parameters of the model; the emotion reason recognition model PECR-BERT comprises a bidirectional encoder representation (BERT) model and two fully connected neural network layers. The trained emotion reason recognition model PECR-BERT is then used (computing in the manner corresponding to steps 1-6) to obtain the reason sentence d_c corresponding to the emotion sentence d_e in a text D' to be analyzed.
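Comparing the predicted scores y_c and y_cp with their 0/1 true scores corresponds to a binary cross-entropy loss; the patent does not name the loss function, so treating it as binary cross-entropy, and summing the two terms per sample, are assumptions in this sketch.

```python
import math

def bce(y_pred, y_true, eps=1e-12):
    # binary cross-entropy between a predicted score in (0, 1) and its 0/1 label;
    # eps clipping guards against log(0)
    y_pred = min(max(y_pred, eps), 1 - eps)
    return -(y_true * math.log(y_pred) + (1 - y_true) * math.log(1 - y_pred))

# each training sample contributes both of its scores to the training objective
loss = bce(0.9, 1.0) + bce(0.2, 0.0)
```

Iterating gradient updates of this loss over the D_cp samples is what optimizes the BERT parameters and the two fully connected layers together.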
By adopting the method, the emotion reason recognition of the text can be realized based on the deep learning model PECR-BERT.
In a specific implementation, a person skilled in the art can realize the automatic operation of the above process using computer software technology. Accordingly, a deep learning-based emotion reason identification method comprising a computer or server that executes the above process to realize emotion reason identification with the PECR-BERT model also falls within the protection scope of the technical scheme of the invention.
In some possible embodiments, a deep learning based emotion cause recognition system is provided, which includes a processor and a memory, the memory is used for storing program instructions, and the processor is used for calling the stored instructions in the memory to execute a deep learning based emotion cause recognition method as described above.
In some possible embodiments, a deep learning based emotion cause recognition system is provided, which includes a readable storage medium, on which a computer program is stored, and when the computer program is executed, the deep learning based emotion cause recognition method is implemented.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (8)

1. An emotion reason identification method based on deep learning, characterized in that: first, the emotion sentence d_e in a piece of text D is combined with any sentence in the text to form new text data D_cp; then the new text data D_cp is used to train an emotion reason recognition model PECR-BERT, wherein the emotion reason recognition model PECR-BERT comprises a Transformer-based bidirectional encoder representation (BERT) model and two fully connected neural network layers; finally, the trained emotion reason recognition model PECR-BERT identifies the reason sentence d_c corresponding to the emotion sentence d_e in the text.
2. The emotion reason identification method based on deep learning according to claim 1, characterized in that: combining the emotion sentence d_e in a piece of text D with any sentence in the text to form the new text D_cp includes performing the following processing for each piece of text data:

a piece of text data is D = {d_1, d_2, …, d_i, …, d_n}, where d_i denotes the i-th sentence in the text, i = 1, 2, …, n, and the emotion sentence d_e is annotated in the text in advance; any sentence d_j within a certain distance of the emotion sentence d_e is selected as a candidate reason sentence for d_e, where |j − e| ≤ r, 0 ≤ r ≤ n − 1, e and j denote the position indexes of sentences in the text, and r denotes the maximum relative distance between the emotion sentence d_e and the sentence d_j; by selecting each d_j within this distance before and after d_e, the text D_cp = {d_e, d_j, d_1, d_2, …, d_i, …, d_n} is constructed, with e − r ≤ j ≤ e + r and 1 ≤ j ≤ n.
3. The emotion reason recognition method based on deep learning of claim 1, wherein training the emotion reason recognition model PECR-BERT with the text D_cp is realized as follows:
1) the text D_cp is processed to obtain an input text I = [[CLS], d_e, [SEP], [CLS], d_j, [SEP], d_1, d_2, …, d_i, …, d_n], where [CLS] is a result-judging identifier: the first [CLS] is used to judge whether the statement d_j is the reason statement corresponding to the emotion statement d_e, and the second [CLS] is used to judge whether the statement d_j is a reason statement at all; [SEP] is a paragraph separation marker separating the emotion statement d_e, the statement to be judged d_j, and the entire text paragraph;
the input text I is segmented into words using the vocabulary provided by the BERT model, the segmented sequence having length M + 2; word embedding then yields the corresponding word vector sequence
X = [x_cls1, x_cls2, x_1, x_2, …, x_M],
where x_cls1 and x_cls2 represent the word vectors corresponding to the two [CLS] identifiers in the input text I, and x_m represents the word vector of the m-th word, m = 1, 2, …, M;
the sequence X is input into the BERT model, and the feature representation of each word is obtained through calculation:
H = [h_cls1, h_cls2, h_1, h_2, …, h_M] = BERT(X),
where h_cls1 and h_cls2 respectively represent the outputs corresponding to the two [CLS] identifiers, and h_i denotes the output corresponding to x_i;
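The input layout of step 1) can be sketched as follows. This is a toy illustration at sentence granularity rather than BERT wordpiece granularity, and the helper name is an assumption; a real implementation would tokenize with the BERT vocabulary before encoding.

```python
# Sketch of the input text I of step 1):
# I = [CLS] d_e [SEP] [CLS] d_j [SEP] d_1 ... d_n
# The positions of the two [CLS] result-judging identifiers are recorded so
# that their BERT outputs (h_cls1, h_cls2) can be read off after encoding.
def build_input(d_e, d_j, document):
    tokens = ["[CLS]", d_e, "[SEP]", "[CLS]", d_j, "[SEP]"] + list(document)
    cls_positions = [i for i, t in enumerate(tokens) if t == "[CLS]"]
    return tokens, cls_positions
```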
2) the output h_cls2 corresponding to the second [CLS] identifier is used to predict whether d_j is a reason statement, realized as follows:
y_c = sigmoid(W_c · h_cls2),
where W_c is a matrix representing the trainable parameters of a fully-connected neural network layer, and sigmoid is the activation function that maps a value into (0, 1); this fully-connected layer with sigmoid activation yields the score y_c of the statement d_j being a reason statement, and if y_c > 0.5, the statement d_j is a reason statement;
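Step 2) is a single fully-connected layer followed by a sigmoid, thresholded at 0.5. A minimal sketch, assuming h_cls2 is already available as a plain vector; the function name and the weights used in the example are illustrative, not trained parameters.

```python
import math

def sigmoid(z):
    # maps any real value into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

# Sketch of step 2): y_c = sigmoid(W_c . h_cls2), thresholded at 0.5.
# W_c is shown as a single weight vector (one output unit).
def is_reason_statement(h_cls2, W_c):
    y_c = sigmoid(sum(w * h for w, h in zip(W_c, h_cls2)))
    return y_c, y_c > 0.5
```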
3) the relative distance s_{e,j} between the emotion statement d_e and the statement d_j is obtained, and word embedding maps the constant relative distance s_{e,j} to a feature vector p_{e,j}; the output h_cls1 corresponding to the first [CLS] identifier is concatenated with the relative-distance feature vector p_{e,j}, and the score of the statement d_j being the reason statement corresponding to the emotion statement d_e is then computed as follows:
y_cp = sigmoid(W_cp · [h_cls1, p_{e,j}]),
where W_cp is a matrix representing the trainable parameters of a fully-connected neural network layer and sigmoid is the activation function; the calculation yields the score y_cp of the statement d_j being the reason statement corresponding to the emotion statement d_e, and if y_cp > 0.5, the statement d_j is the reason statement corresponding to the emotion statement d_e.
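Step 3) can be sketched likewise: the relative distance s_{e,j} = j - e is looked up in a small embedding table (a simple stand-in for the word-embedding treatment of the claim), concatenated with h_cls1, and scored by one fully-connected layer plus sigmoid. All names, the toy table and the weights below are illustrative assumptions.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Sketch of step 3): y_cp = sigmoid(W_cp . [h_cls1, p_{e,j}]).
# distance_table maps a relative distance j - e to its feature vector p_{e,j};
# W_cp is shown as a single weight vector over the concatenated features.
def score_pair(h_cls1, e, j, distance_table, W_cp):
    p_ej = distance_table[j - e]             # feature vector for s_{e,j}
    features = list(h_cls1) + list(p_ej)     # concatenation [h_cls1, p_{e,j}]
    y_cp = sigmoid(sum(w * f for w, f in zip(W_cp, features)))
    return y_cp, y_cp > 0.5
```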
4. The emotion reason recognition method based on deep learning according to claim 1, 2 or 3, wherein the trained emotion reason recognition model PECR-BERT is used to directly obtain the reason statement d_c corresponding to the emotion statement d_e in a text D' to be analyzed.
5. The emotion reason recognition method based on deep learning according to claim 1, 2 or 3, wherein the method is used to support man-machine dialogue optimization, trending public opinion analysis, or product quality analysis and optimization services.
6. An emotion reason recognition system based on deep learning, characterized in that it is used for implementing the emotion reason recognition method based on deep learning according to any one of claims 1-5.
7. The deep learning based emotion reason recognition system of claim 6, comprising a processor and a memory, the memory being used to store program instructions and the processor being used to call the instructions stored in the memory to execute the emotion reason recognition method based on deep learning according to any one of claims 1-5.
8. The deep learning based emotion reason recognition system of claim 6, wherein: comprising a readable storage medium having stored thereon a computer program which, when executed, implements a method for deep learning based emotional cause recognition according to any of claims 1-5.
CN202111445330.8A 2021-11-30 2021-11-30 Deep learning-based emotion reason identification method and system Pending CN114239565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111445330.8A CN114239565A (en) 2021-11-30 2021-11-30 Deep learning-based emotion reason identification method and system

Publications (1)

Publication Number Publication Date
CN114239565A true CN114239565A (en) 2022-03-25

Family

ID=80752298

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111445330.8A Pending CN114239565A (en) 2021-11-30 2021-11-30 Deep learning-based emotion reason identification method and system

Country Status (1)

Country Link
CN (1) CN114239565A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116521872A (en) * 2023-04-27 2023-08-01 华中师范大学 Combined recognition method and system for cognition and emotion and electronic equipment
CN116521872B (en) * 2023-04-27 2023-12-26 华中师范大学 Combined recognition method and system for cognition and emotion and electronic equipment

Similar Documents

Publication Publication Date Title
CN110245229B (en) Deep learning theme emotion classification method based on data enhancement
CN112199956B (en) Entity emotion analysis method based on deep representation learning
CN108170848B (en) Chinese mobile intelligent customer service-oriented conversation scene classification method
CN113505200B (en) Sentence-level Chinese event detection method combined with document key information
CN111709223B (en) Sentence vector generation method and device based on bert and electronic equipment
CN112434142B (en) Method for marking training sample, server, computing equipment and storage medium
CN112861524A (en) Deep learning-based multilevel Chinese fine-grained emotion analysis method
CN113065344A (en) Cross-corpus emotion recognition method based on transfer learning and attention mechanism
CN112818698B (en) Fine-grained user comment sentiment analysis method based on dual-channel model
CN112115242A (en) Intelligent customer service question-answering system based on naive Bayes classification algorithm
CN112632244A (en) Man-machine conversation optimization method and device, computer equipment and storage medium
CN113987167A (en) Dependency perception graph convolutional network-based aspect-level emotion classification method and system
CN114911932A (en) Heterogeneous graph structure multi-conversation person emotion analysis method based on theme semantic enhancement
CN116010581A (en) Knowledge graph question-answering method and system based on power grid hidden trouble shooting scene
CN112784878A (en) Intelligent correction method and system for Chinese discussion papers
CN112183106A (en) Semantic understanding method and device based on phoneme association and deep learning
CN113486174B (en) Model training, reading understanding method and device, electronic equipment and storage medium
CN113326374B (en) Short text emotion classification method and system based on feature enhancement
CN113627550A (en) Image-text emotion analysis method based on multi-mode fusion
CN114239565A (en) Deep learning-based emotion reason identification method and system
CN112988970A (en) Text matching algorithm serving intelligent question-answering system
CN112257432A (en) Self-adaptive intention identification method and device and electronic equipment
CN115204143B (en) Method and system for calculating text similarity based on prompt
CN111737467A (en) Object-level emotion classification method based on segmented convolutional neural network
CN111368524A (en) Microblog viewpoint sentence recognition method based on self-attention bidirectional GRU and SVM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination