CN107038154A - Text emotion recognition method and device - Google Patents

Text emotion recognition method and device

Info

Publication number
CN107038154A
CN107038154A
Authority
CN
China
Prior art keywords
text
emotion
semantic
data
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611067926.8A
Other languages
Chinese (zh)
Inventor
刘佳
陈建刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201611067926.8A priority Critical patent/CN107038154A/en
Publication of CN107038154A publication Critical patent/CN107038154A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent

Abstract

The present invention provides a text emotion recognition method and device. The method includes: encoding a text to be identified into a semantic vector containing the text's semantics by means of a language model; using a pre-trained emotion model, taking the semantic vector as the input of the emotion model and outputting the probability that the semantic vector belongs to each emotion type; and selecting the emotion type with the largest probability as the recognized emotion of the text. The present invention improves the accuracy of text emotion recognition.

Description

Text emotion recognition method and device
Technical field
The present invention relates to computer technology, and in particular to a text emotion recognition method and device.
Background art
Emotion recognition on text is useful in many scenarios. For example, a public-opinion monitoring system may crawl web content from preset sites, such as microblogs, forums, and news portals, and analyze the emotion of users from these texts. If a text shows negative emotion, it suggests that a user may be dissatisfied with the corresponding business and that the business needs improvement; such texts then need manual handling, so that the problems of the business can be analyzed from the texts with negative emotion and improved. Text emotion recognition may also be applied in other scenarios.
In the related art, the emotion embodied by a text can be analyzed from the keywords the text contains, for example by matching keywords against emotion rules. However, recognizing emotion from keywords alone has a relatively low accuracy: the recognized emotion sometimes does not match the real emotion of the text.
Summary of the invention
In view of this, the present invention provides a text emotion recognition method and device, so as to improve the accuracy of text emotion recognition.
Specifically, the present invention is achieved through the following technical solutions:
In a first aspect, a text emotion recognition method is provided. The method includes:
encoding a text to be identified into a semantic vector containing the text's semantics by means of a language model;
using a pre-trained emotion model, taking the semantic vector as the input of the emotion model and outputting the probability that the semantic vector belongs to each emotion type;
selecting the emotion type with the largest probability as the recognized emotion of the text.
In a second aspect, a text emotion recognition device is provided. The device includes:
a text encoding module, configured to encode a text to be identified into a semantic vector containing the text's semantics by means of a language model;
a prediction output module, configured to use a pre-trained emotion model, take the semantic vector as the input of the emotion model, and output the probability that the semantic vector belongs to each emotion type;
an emotion determining module, configured to select the emotion type with the largest probability as the recognized emotion of the text.
With the text emotion recognition method and device of the present invention, a text is first converted into a semantic vector containing its semantics, and emotion recognition is then performed on that vector, so that the recognized emotion is the emotion embodied by the semantic content of the text. Compared with judging emotion from keywords alone, the accuracy is improved because the semantics of the text are taken into account.
Brief description of the drawings
Fig. 1 is a flowchart of a text emotion recognition method provided by an embodiment of the present invention;
Fig. 2 is a flow of training a language model based on a recurrent neural network provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a recurrent neural network unrolled over time provided by an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a fully connected neural network provided by an embodiment of the present invention;
Fig. 5 is a flow of training an emotion model provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a text emotion recognition device provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of another text emotion recognition device provided by an embodiment of the present invention.
Detailed description of embodiments
Some texts carry a sentiment orientation:
For example, "This app is not very usable; payment has failed more than once!" embodies the negative emotion of the user. Negative emotion generally indicates that the user is dissatisfied with some target object and that defects exist.
As another example, "This app is very good; topping up is more convenient now!" embodies the positive emotion of the user. Positive emotion generally indicates that the user is quite satisfied with some target object.
Many scenarios may require emotion recognition on text. For example, a public-opinion system may collect users' evaluations of some newly launched business over a period of time. If many of the collected texts embody negative user emotion, manual handling is needed: the problems of the business can be analyzed from the texts with negative emotion and then improved. The embodiments of the present application provide a text emotion recognition method for performing emotion recognition on text, and the method can achieve a higher accuracy for the recognized emotion.
A text may contain many keywords with a sentiment orientation, such as "good", "not good", "convenient", or "not usable"; such keywords can usually embody a certain text emotion. However, recognizing emotion from keywords alone can sometimes go wrong. For example, a text may contain words embodying positive emotion, such as "good" and "convenient", yet what the text as a whole expresses is not positive emotion; the user may be using irony and actually expressing dissatisfaction with something. That is, the emotion embodied by the individual words in a text may deviate from the essential emotion of the text.
Part of the reason for this deviation is that when only the keywords in a text are extracted, the connections between words and the order in which they appear are not considered, and an overall semantic understanding of the text is missing. Therefore, the text emotion recognition method of the embodiments of the present application recognizes the emotion of a text in combination with the text's semantics.
Fig. 1 illustrates the flowchart of the text emotion recognition method of the present application, which can include the following steps:
In step 101, a text to be identified is encoded into a semantic vector containing the text's semantics by means of a language model.
In this step, the text can be encoded into a semantic vector that contains the semantic information of the text. The encoding can be realized by a language model: a text is input into the language model, and the model outputs the corresponding semantic vector of the text. The training of the language model will be described in the embodiments below.
By encoding the text into a semantic vector, not only can the vector contain the semantics of the text, but there is also no need to extract keywords from the text; this is equivalent to performing emotion recognition directly on the original text information.
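As a minimal sketch of this encoding step, the toy function below maps a text to a fixed-length vector. It is only a stand-in for the trained language model described later; the function name, the dimension of 8, and the character-based features are illustrative assumptions, not the patent's method.

```python
def encode_text(text, dim=8):
    """Toy stand-in for the language model: map a text to a fixed-length vector."""
    vec = [0.0] * dim
    for i, ch in enumerate(text):
        vec[i % dim] += ord(ch) / 1000.0   # deterministic character features
    n = max(len(text), 1)
    return [v / n for v in vec]            # same dimension for any text length

sem = encode_text("this app keeps failing at payment")
print(len(sem))  # prints 8: a fixed dimension regardless of text length
```

The essential property shown here is that any text, long or short, becomes a vector of a fixed dimension that the downstream emotion model can consume.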
In step 102, a pre-trained emotion model is used: the semantic vector is taken as the input of the emotion model, and the probability that the semantic vector belongs to each emotion type is output.
In this example, the emotion model can be trained in advance; the training process will be described in a subsequent example. In this step, the semantic vector obtained in step 101 is used as the input of the emotion model, and the output of the emotion model can be the probability that the semantic vector belongs to each emotion type. Moreover, the sum of the probability values over all emotion types can be 1.
As an example, suppose the output of the emotion model covers two emotion types, "negative emotion" and "non-negative emotion". After the semantic vector corresponding to some text is input into the model, the output may be that the vector belongs to "negative emotion" with probability 0.8 and to "non-negative emotion" with probability 0.2.
In addition, the emotion types output by the emotion model can be set independently according to the needs of the actual business analysis. For example, if the main purpose of text emotion recognition is to identify texts with negative emotion, the two types "negative emotion" and "non-negative emotion" can be set; or, if the main purpose is to identify texts with positive emotion, the two types "positive emotion" and "non-positive emotion" can be set. Other emotion-type schemes are possible and are not enumerated in detail.
In step 103, the emotion type with the largest probability is selected as the recognized emotion of the text.
For example, in the example of step 102, if the probability that the text belongs to "negative emotion" is 0.8 and the probability that it belongs to "non-negative emotion" is 0.2, the emotion of the text can be determined to be negative.
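Steps 102 and 103 together amount to picking the arg-max over the emotion-type probabilities. A minimal sketch with the example values above (the dictionary layout is an illustrative assumption):

```python
# Example probability output from step 102; the values sum to 1.
probs = {"negative": 0.8, "non-negative": 0.2}

assert abs(sum(probs.values()) - 1.0) < 1e-9   # a probability distribution
emotion = max(probs, key=probs.get)            # step 103: pick the largest
print(emotion)  # prints "negative"
```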
With the text emotion recognition method of this example, a text is converted into a semantic vector containing its semantics, and emotion recognition is then performed on that vector, so that the recognized emotion is the emotion embodied by the semantic content of the text. Compared with judging emotion from keywords alone, the accuracy is improved because the semantics of the text are taken into account. For example, when this method was applied in practice to text emotion recognition in a public-opinion system, experiments found its accuracy to reach above 90%.
The following examples briefly describe the training of the language model and of the emotion model.
Training of the language model:
In the examples of the present application, the language model can be built based on a neural network. Usable neural networks include, for example: recurrent neural networks, long short-term memory (LSTM) networks, multi-layer feedback neural networks (RNN), and deep neural networks (e.g., Paragraph2vec).
Fig. 2 illustrates the flow of training a language model based on a recurrent neural network, which can include:
In step 201, a text corpus to be trained on is obtained.
In this step, a large amount of text can be obtained from the internet as the corpus for model training. This example does not limit the way the text is obtained or its content.
In step 202, the word-level feature words in the text corpus are represented as word vectors.
Fig. 3 illustrates the structure of the recurrent neural network unrolled over time; the network is trained by predicting the (N+1)-th word of a sentence from its first N words. In this example a single Chinese character is used as a "word": taking the sentence [训练语言模型] ("train a language model") as an example, "训" is one word, "练" is one word, "语" is one word, "言" is one word, and so on; each such character can be called a word-level feature word.
As shown in Fig. 3, the words in a sentence can be represented in the form of word vectors, and the sentence is composed of the words in order. A vector can, for example, look like: [-0.02, 0.349, -0.97, 0.633, 0.496, 0.278, 0.803, 0.474, -0.24, -0.16]. In Fig. 3, x(t) is the word-vector representation of the t-th word in the sentence; h(t) is the hidden vector in the network, representing the current context; and y(t) represents the output that combines the current context h(t) and the current word x(t). Here w denotes the parameters of the network: for example, h(t) can be computed from x(t) and h(t-1) according to some formula, and the parameters of that formula can be summarized as w; similarly, y(t) can be computed from x(t) and h(t) according to some formula whose parameters are also denoted w. The specific network parameters involved in the recurrent neural network can be handled according to conventional recurrent neural network training and are not repeated here.
In step 203, the language model is built based on the neural network according to the word vectors.
When model training is performed with the network of the Fig. 3 example, y(t) can be predicted from the network state, y(t) containing the prediction of the word following x(t), and the parameters of the whole network are adjusted according to the word that actually appears. For example, still taking the sentence [训练语言模型] as an example, in the structure of Fig. 3, x(t-1) can be "训" represented as a word vector, and y(t-1) can be predicted from h(t-1), y(t-1) containing the vector representation of the word that follows "训". In the sentence, the word that actually follows "训" is "练"; according to the actually appearing word and the predicted word vector y(t-1), the parameter w can be adjusted, so that after the adjustment the model can accurately predict that the next word after "训" is "练". Moving along the time sequence, x(t) can be the vector representation of "练", and the corresponding h(t) can be the hidden vector synthesized from x(t), h(t-1), and w; y(t) is predicted from this h(t), and, as before, the parameter w can be further adjusted according to the word that actually appears.
During the model training described above, the hidden vector h(t) in the recurrent neural network combines factors such as x(t), h(t-1), and w. That is, this hidden vector h(t) takes into account the words appearing earlier in the sentence and their order, and the context of the preceding words is considered when predicting the following word, so the vector is a synthesis of the surrounding context. The resulting vector therefore contains the semantics of the text. Through continuous training and adjustment, a language model can be obtained; inputting a text into this language model encodes it into a vector containing its semantics.
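The forward recurrence described above can be sketched as follows. This toy shows only the data flow h(t) = f(x(t), h(t-1); w): the weights are random and untrained, and the dimensions and word vectors are illustrative assumptions, since in the actual method w is adjusted by the training procedure of step 203.

```python
import math
import random

random.seed(0)
DIM = 4
# w of the text, split here into input-to-hidden and hidden-to-hidden weights.
W_xh = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]
W_hh = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]

def step(x, h_prev):
    """One step of the recurrence: h(t) = tanh(W_xh * x(t) + W_hh * h(t-1))."""
    return [math.tanh(sum(W_xh[i][j] * x[j] + W_hh[i][j] * h_prev[j]
                          for j in range(DIM)))
            for i in range(DIM)]

# A sentence as an ordered sequence of toy word vectors x(1..T).
sentence = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
h = [0.0] * DIM                # empty context before the first word
for x in sentence:
    h = step(x, h)             # h accumulates the words and their order
print(len(h))  # prints 4: the final h summarizes the whole sentence
```

Because each h(t) feeds into the next step, the final hidden vector depends on both the words and the order in which they appeared, which is exactly why it can serve as a semantic vector.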
Training of the emotion model:
In this example, the training of the emotion model can be based on a fully connected neural network. Fig. 4 illustrates the structure of such a network: as shown in Fig. 4, the neural network can include an input layer, a hidden layer, and an output layer. Taking the recognition of "negative emotion" versus "non-negative emotion" as an example, the two nodes of the output layer can respectively represent the model's output probabilities for negative and non-negative emotion.
Fig. 5 illustrates the flow of training the emotion model, which can include:
In step 501, data with emotion-related parameters is collected.
For example, user evaluation data can be crawled from certain targeted sites, such as users' reviews of software in an application store, reviews of hotels, reviews of films, or reviews of restaurants. These data can contain an emotion-related parameter, i.e. a parameter that can be used to indicate a certain sentiment orientation. For instance, in the evaluation data obtained above, a rating score may accompany the evaluation content; this score can be called the emotion-related parameter, because the user's emotion can be inferred from its level: a higher score usually indicates that the user evaluates the thing favorably, and a lower score indicates an unfavorable evaluation.
In step 502, the emotion type corresponding to the data is obtained according to the emotion-related parameter.
Taking the recognition of negative emotion as an example, this example can determine the emotion according to the emotion-related parameter obtained in step 501. For instance, according to the level of the score, evaluations can be divided into favorable, neutral, and unfavorable, i.e. positive, neutral, and negative emotion.
In step 503, data whose content is inconsistent with the emotion type corresponding to its emotion-related parameter is filtered out by means of a sentiment dictionary, and the data remaining after filtering is used as training data.
For example, in user evaluation data, the sentiment orientations of the content and of the emotion-related parameter are sometimes inconsistent: some evaluation content reads as an unfavorable review while the score is high, indicating a favorable one; or the content reads as favorable while the score is low, indicating an unfavorable review. Therefore, the evaluation data needs to be screened again. With a sentiment dictionary, data scored as unfavorable whose content contains only positive emotion words, and data scored as favorable whose content contains only negative emotion words, can all be filtered out. The data remaining after filtering is of higher quality, contains no samples whose content and score are inconsistent, and can be used as training data for training the model.
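The consistency filter of step 503 can be sketched as below. The dictionary words, labels, and sample texts are illustrative assumptions, not the patent's actual sentiment dictionary or data.

```python
# Tiny illustrative sentiment dictionary.
POSITIVE = {"good", "great", "convenient"}
NEGATIVE = {"bad", "broken", "failed"}

def consistent(text, label):
    """Keep a sample only if its words do not contradict its score-derived label."""
    words = set(text.lower().split())
    if label == "negative" and words & POSITIVE and not words & NEGATIVE:
        return False   # low score, but the content has only positive words
    if label == "positive" and words & NEGATIVE and not words & POSITIVE:
        return False   # high score, but the content has only negative words
    return True

samples = [("payment failed again", "negative"),
           ("great and convenient", "negative"),   # contradictory: filtered out
           ("works great", "positive")]
train = [(t, l) for t, l in samples if consistent(t, l)]
print(len(train))  # prints 2: the contradictory sample is removed
```

This mirrors the text's rule: only samples whose content is wholly opposed to the score are dropped, so mixed-sentiment texts are kept.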
In this example, the training data used for training the emotion model is obtained automatically: as described above, network data can be obtained automatically and filtered through the sentiment dictionary to obtain higher-quality training data. This avoids the manual labeling operations of traditional approaches and reduces the acquisition cost of training data.
In step 504, the training data, treated as texts to be identified, is encoded by the language model into semantic vectors containing the texts' semantics.
In this step, the language model trained as described above can be used to encode the training data obtained in step 503 into semantic vectors; each semantic vector contains the semantics of its text.
In step 505, the emotion model is obtained by training on the semantic vectors corresponding to the training data.
This step can be a fairly standard neural network training process. For example, according to the probability values output by the model of Fig. 4 (e.g. the probabilities of negative and non-negative emotion) and the actual emotion classification of the training data (e.g. for a piece of user evaluation data, the emotion type determined from its rating score), the method of reverse gradient derivation can be used: starting from the last layer, the gradients of each preceding layer are computed in turn, and then the parameters of the network model and the word-vector representations of the input are updated. The parameter update in this step can include both updating the neural network parameters in Fig. 4 and updating the parameters of the language model used to encode the training data.
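A minimal sketch of steps 504 and 505: a toy two-class softmax classifier trained by the gradient updates described above, standing in for the fully connected network of Fig. 4 (and omitting the hidden layer and the joint update of the language model). The data, dimensions, and learning rate are illustrative assumptions.

```python
import math

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

DIM, CLASSES = 3, 2                       # semantic dim; {negative, non-negative}
W = [[0.0] * DIM for _ in range(CLASSES)]

# (semantic vector, emotion label) pairs, as produced by steps 503-504.
data = [([1.0, 0.2, 0.0], 0),
        ([0.0, 0.1, 1.0], 1)]

for _ in range(200):                      # repeated gradient-descent updates
    for x, y in data:
        p = softmax([sum(W[c][j] * x[j] for j in range(DIM))
                     for c in range(CLASSES)])
        for c in range(CLASSES):
            grad = p[c] - (1.0 if c == y else 0.0)   # cross-entropy gradient
            for j in range(DIM):
                W[c][j] -= 0.5 * grad * x[j]         # update the parameters

p = softmax([sum(W[c][j] * data[0][0][j] for j in range(DIM))
             for c in range(CLASSES)])
print(p[0] > 0.9)  # prints True: the true class now gets high probability
```

The gradient `p[c] - target` computed at the output and pushed back into the weights is the same backward flow that, in the full method, would continue into the hidden layer and the language model's parameters.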
In the method for this example, in the neutral net of training emotion model, addition has used language model, will nerve The input layer training data of network, is semantic vector with language model based coding, then predicts the emotion class belonging to the semantic vector The probability of type so that the emotion model that training is obtained is that, according to the emotional semantic classification arrived comprising semantic text identification, accuracy rate is obtained To be lifted.
In addition, in one example, if the texts participating in model training are generally short texts, then in actual use different processing can be applied according to the type of the input text. For example, in a public-opinion system, a microblog text can be used directly as the input, and for a news text the news title is taken as the input text; these are all short texts.
For some longer texts, the text can be divided into multiple sub-texts, and each sub-text is separately encoded by the language model into a semantic vector containing its semantics; then the probabilities of each sub-text belonging to the same emotion type are averaged, and the emotion type with the larger average is selected as the emotion of the text. For example, take a forum text, which may be long: part of its content can be taken, e.g. the title of the post as one sub-text and each of the first five segments of the body as one sub-text, making six sub-texts in total. Each is encoded by the language model into a semantic vector containing its semantics, and the emotion model outputs the classification probabilities for each semantic vector, so each sub-text obtains a probability of belonging to "negative emotion" and a probability of belonging to "non-negative emotion". The "negative emotion" probabilities of the six sub-texts are then averaged, and likewise the "non-negative emotion" probabilities; the emotion type with the larger average is the recognized emotion of the forum text.
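The averaging scheme for long texts can be sketched as follows; the per-sub-text probability values stand in for emotion-model outputs and are illustrative.

```python
# One probability dict per sub-text (e.g. the title plus body segments).
sub_probs = [
    {"negative": 0.7, "non-negative": 0.3},
    {"negative": 0.9, "non-negative": 0.1},
    {"negative": 0.4, "non-negative": 0.6},
]

# Average the probabilities of the same emotion type across sub-texts.
avg = {k: sum(p[k] for p in sub_probs) / len(sub_probs)
       for k in sub_probs[0]}
emotion = max(avg, key=avg.get)   # pick the type with the larger average
print(emotion)  # prints "negative" (mean 0.667 vs 0.333)
```

Averaging lets a single atypical sub-text (the 0.4 entry above) be outvoted by the overall tendency of the document.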
In order to implement the above method, the present invention also provides a text emotion recognition device. As shown in Fig. 6, the device can include: a text encoding module 61, a prediction output module 62, and an emotion determining module 63.
The text encoding module 61 is configured to encode a text to be identified into a semantic vector containing the text's semantics by means of a language model;
the prediction output module 62 is configured to use a pre-trained emotion model, take the semantic vector as the input of the emotion model, and output the probability that the semantic vector belongs to each emotion type;
the emotion determining module 63 is configured to select the emotion type with the largest probability as the recognized emotion of the text.
In one example, as shown in Fig. 7, the device can also include:
a language model training module 64, configured to: obtain a text corpus to be trained on; represent the word-level feature words in the text corpus as word vectors; and build the language model based on a neural network according to the word vectors.
In one example, the neural network includes any of the following: a recurrent neural network, a long short-term memory network, a deep neural network.
In one example, the device also includes:
an emotion model training module 65, configured to: collect data with emotion-related parameters; obtain the emotion type corresponding to the data according to the emotion-related parameter; filter out, by means of a sentiment dictionary, data whose content is inconsistent with the emotion type corresponding to its emotion-related parameter, and use the data remaining after filtering as training data; encode the training data, treated as texts to be identified, into semantic vectors containing text semantics by means of the language model; and obtain the emotion model by training on the semantic vectors.
In one example, the text encoding module 61 is specifically configured to: divide the text to be identified into multiple sub-texts; and separately encode each sub-text into a semantic vector containing its semantics by means of the language model;
the emotion determining module 63 is specifically configured to: average the probabilities of each sub-text belonging to the same emotion type, and select the emotion type with the larger average as the emotion of the text.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (11)

1. A text emotion recognition method, characterized in that the method includes:
encoding a text to be identified into a semantic vector containing the text's semantics by means of a language model;
using a pre-trained emotion model, taking the semantic vector as the input of the emotion model and outputting the probability that the semantic vector belongs to each emotion type;
selecting the emotion type with the largest probability as the recognized emotion of the text.
2. The method according to claim 1, characterized in that the method further includes:
obtaining a text corpus to be trained on;
representing the word-level feature words in the text corpus as word vectors;
building the language model based on a neural network according to the word vectors.
3. The method according to claim 2, characterized in that the neural network includes any of the following: a recurrent neural network, a long short-term memory network, a deep neural network.
4. The method according to claim 1, characterized in that the emotion types include: negative emotion and non-negative emotion.
5. The method according to claim 1, characterized in that the method further includes:
collecting data with emotion-related parameters;
obtaining the emotion type corresponding to the data according to the emotion-related parameter;
filtering out, by means of a sentiment dictionary, data whose content is inconsistent with the emotion type corresponding to its emotion-related parameter, and using the data remaining after filtering as training data;
encoding the training data, treated as texts to be identified, into semantic vectors containing text semantics by means of the language model;
obtaining the emotion model by training on the semantic vectors.
6. The method according to claim 1, characterized in that said encoding a text to be identified into a semantic vector containing the text's semantics by means of a language model includes:
dividing the text to be identified into multiple sub-texts;
separately encoding each sub-text into a semantic vector containing its semantics by means of the language model;
and said selecting the emotion type with the largest probability as the recognized emotion of the text includes: averaging the probabilities of each sub-text belonging to the same emotion type, and selecting the emotion type with the larger average as the emotion of the text.
7. A text emotion recognition device, characterized in that the device includes:
a text encoding module, configured to encode a text to be identified into a semantic vector containing the text's semantics by means of a language model;
a prediction output module, configured to use a pre-trained emotion model, take the semantic vector as the input of the emotion model, and output the probability that the semantic vector belongs to each emotion type;
an emotion determining module, configured to select the emotion type with the largest probability as the recognized emotion of the text.
8. The device according to claim 7, characterized in that the device further includes:
a language model training module, configured to: obtain a text corpus to be trained on; represent the word-level feature words in the text corpus as word vectors; and build the language model based on a neural network according to the word vectors.
9. The device according to claim 8, characterized in that the neural network includes any of the following: a recurrent neural network, a long short-term memory network, a deep neural network.
10. The device according to claim 7, characterized in that the device further includes:
an emotion model training module, configured to: collect data with emotion-related parameters; obtain the emotion type corresponding to the data according to the emotion-related parameter; filter out, by means of a sentiment dictionary, data whose content is inconsistent with the emotion type corresponding to its emotion-related parameter, and use the data remaining after filtering as training data; encode the training data, treated as texts to be identified, into semantic vectors containing text semantics by means of the language model; and obtain the emotion model by training on the semantic vectors.
11. The device according to claim 7, characterized in that
the text encoding module is specifically configured to: divide the text to be identified into multiple sub-texts; and separately encode each sub-text into a semantic vector containing its semantics by means of the language model;
the emotion determining module is specifically configured to: average the probabilities of each sub-text belonging to the same emotion type, and select the emotion type with the larger average as the emotion of the text.
CN201611067926.8A 2016-11-25 2016-11-25 Text emotion recognition method and device Pending CN107038154A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611067926.8A CN107038154A (en) 2016-11-25 2016-11-25 Text emotion recognition method and device

Publications (1)

Publication Number Publication Date
CN107038154A true CN107038154A (en) 2017-08-11

Family

ID=59531125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611067926.8A Pending CN107038154A (en) 2016-11-25 2016-11-25 A kind of text emotion recognition methods and device

Country Status (1)

Country Link
CN (1) CN107038154A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104331506A (en) * 2014-11-20 2015-02-04 北京理工大学 Multi-class sentiment analysis method and system for bilingual microblog text
CN105930503A (en) * 2016-05-09 2016-09-07 清华大学 Sentiment classification method and device based on combined feature vectors and deep learning
CN106126507A (en) * 2016-06-22 2016-11-16 哈尔滨工业大学深圳研究生院 A character-encoding-based deep neural translation method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Longfei et al.: "Sentiment orientation analysis of microblogs based on convolutional neural networks", Journal of Chinese Information Processing *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108039181A (en) * 2017-11-02 2018-05-15 北京捷通华声科技股份有限公司 Emotion information analysis method and device for a voice signal
WO2019174423A1 (en) * 2018-03-16 2019-09-19 北京国双科技有限公司 Entity sentiment analysis method and related apparatus
CN110929026B (en) * 2018-09-19 2023-04-25 阿里巴巴集团控股有限公司 Abnormal text recognition method, device, computing equipment and medium
CN110929026A (en) * 2018-09-19 2020-03-27 阿里巴巴集团控股有限公司 Abnormal text recognition method and device, computing equipment and medium
CN109766440A (en) * 2018-12-17 2019-05-17 航天信息股份有限公司 A method and system for determining default classification information for an object text description
CN109766440B (en) * 2018-12-17 2023-09-01 航天信息股份有限公司 Method and system for determining default classification information for object text description
CN110413745A (en) * 2019-06-21 2019-11-05 阿里巴巴集团控股有限公司 Method for selecting representative text and method and device for determining typical questions
CN110287323A (en) * 2019-06-27 2019-09-27 成都冰鉴信息科技有限公司 An object-oriented sentiment classification method
CN110321562B (en) * 2019-06-28 2023-06-02 广州探迹科技有限公司 Short text matching method and device based on BERT
CN110321562A (en) * 2019-06-28 2019-10-11 广州探迹科技有限公司 A short text matching method and device based on BERT
CN110990531A (en) * 2019-11-28 2020-04-10 北京声智科技有限公司 Text emotion recognition method and device
CN110990531B (en) * 2019-11-28 2024-04-02 北京声智科技有限公司 Text emotion recognition method and device
CN111460101A (en) * 2020-03-30 2020-07-28 广州视源电子科技股份有限公司 Knowledge point type identification method and device and processor
CN111460101B (en) * 2020-03-30 2023-09-15 广州视源电子科技股份有限公司 Knowledge point type identification method, knowledge point type identification device and knowledge point type identification processor
CN111859979A (en) * 2020-06-16 2020-10-30 中国科学院自动化研究所 Ironic text collaborative recognition method, ironic text collaborative recognition device, ironic text collaborative recognition equipment and computer readable medium
CN115101032A (en) * 2022-06-17 2022-09-23 北京有竹居网络技术有限公司 Method, apparatus, electronic device and medium for generating score of text

Similar Documents

Publication Publication Date Title
CN107038154A (en) A kind of text emotion recognition methods and device
Poria et al. Context-dependent sentiment analysis in user-generated videos
CN109933664B (en) Fine-grained emotion analysis improvement method based on emotion word embedding
CN108597541B (en) Speech emotion recognition method and system for enhancing anger and happiness recognition
CN104268160B (en) An opinion target extraction and identification method based on a domain lexicon and semantic roles
CN104050160B (en) A translation method and apparatus blending machine translation with human translation
CN109241255A (en) An intent recognition method based on deep learning
CN107247702A (en) A text sentiment analysis and processing method and system
CN110532912B (en) Sign language translation implementation method and device
CN108986186A (en) Method and system for converting text into video
CN107301170A (en) Method and apparatus for sentence segmentation based on artificial intelligence
CN106055662A (en) Emotion-based intelligent conversation method and system
CN111612103A (en) Image description generation method, system and medium combined with abstract semantic representation
Baur et al. eXplainable cooperative machine learning with NOVA
CN111709242B (en) Chinese punctuation mark adding method based on named entity recognition
CN104731874B (en) An evaluation information generation method and device
CN113392641A (en) Text processing method, device, storage medium and equipment
CN110119443A (en) A sentiment analysis method for recommendation services
CN110956579A (en) Text image rewriting method based on semantic segmentation graph generation
CN112579762B (en) Dialogue emotion analysis method based on semantics, emotion inertia and emotion commonality
CN109325780A (en) An interaction method for an intelligent customer service system in the e-government field
CN112966508B (en) Universal automatic term extraction method
CN110096587A (en) A fine-grained sentiment classification model based on attention-mechanism LSTM-CNN word embeddings
CN110210036A (en) An intent recognition method and device
CN108009297A (en) Text emotion analysis method and system based on natural language processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170811