CN109033089B - Emotion analysis method and device - Google Patents

Emotion analysis method and device

Info

Publication number
CN109033089B
Authority
CN
China
Prior art keywords
model
prediction model
classification result
trained
feature data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811037201.3A
Other languages
Chinese (zh)
Other versions
CN109033089A (en)
Inventor
车天博
高维国
何晓冬
刘晓华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201811037201.3A priority Critical patent/CN109033089B/en
Publication of CN109033089A publication Critical patent/CN109033089A/en
Application granted granted Critical
Publication of CN109033089B publication Critical patent/CN109033089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/01Customer relationship services

Abstract

The present disclosure provides an emotion analysis method and apparatus. The emotion analysis apparatus extracts features from a user session and inputs the extracted features into a preset first prediction model and a preset second prediction model. First feature data from the first prediction model is input into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains an emotion classification result for the user session from the fused feature data. By transferring knowledge from the first prediction model into the second prediction model through transfer learning, the second prediction model can obtain a better classification result.

Description

Emotion analysis method and device
Technical Field
The present disclosure relates to the field of information processing, and in particular, to an emotion analysis method and apparatus.
Background
Human customer service, as a window directly facing users, plays an increasingly important role in the internet industry. The ability of customer service staff to solve problems directly affects the user experience and the user's impression of the company.
Currently, user emotion is scored using text classification techniques. For example, a score of 1 indicates very dissatisfied, and a score of 5 indicates very satisfied.
Disclosure of Invention
The inventors found through research that, in an actual service scenario, merely scoring the user's emotion does not accurately reveal why the user is dissatisfied, so the service cannot be effectively improved.
The present disclosure therefore provides a scheme that classifies user emotion according to the user session, so that the reason for a user's dissatisfaction can be identified conveniently and in a timely manner.
In accordance with an aspect of one or more embodiments of the present disclosure, there is provided an emotion analysis method including: carrying out feature extraction on the user session; inputting the extracted user session features into a preset first prediction model and a preset second prediction model respectively; and inputting the first characteristic data in the first prediction model into the second prediction model so that the second prediction model fuses the first characteristic data and the second characteristic data of the second prediction model, and obtaining an emotion classification result of the user session by using the fused characteristic data.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model.
In some embodiments, each classification result of the first predictive model is associated with at least one classification result of the second predictive model, respectively.
In some embodiments, the classification results of the first prediction model include happy, neutral, and negative; the classification results of the second prediction model include happy, neutral, anxious, angry, fearful, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and lost classification results of the second prediction model.
In some embodiments, the first predictive model and the second predictive model are character-based convolutional neural networks.
In some embodiments, training data is respectively input into the first prediction model and the model to be trained, so that the first prediction model outputs a classification result, wherein the training data comprises user session features; inputting the feature data in the first prediction model into the model to be trained so that the model to be trained can fuse the feature data from the first prediction model and the feature data of the model to be trained, and outputting a classification result by using the fused feature data; and adjusting parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model to obtain the second prediction model.
In some embodiments, if there is no association between the classification result output by the model to be trained and the classification result output by the first prediction model, it is determined that there is a deviation between the classification result output by the model to be trained and the classification result output by the first prediction model.
In accordance with another aspect of one or more embodiments of the present disclosure, there is provided an emotion analyzing apparatus including: a feature extraction module configured to perform feature extraction on the user session; the characteristic input module is configured to input the extracted user session characteristics into a preset first prediction model and a preset second prediction model respectively; and the transfer learning module is configured to input the first feature data in the first prediction model into the second prediction model so that the second prediction model fuses the first feature data and the second feature data of the second prediction model, and obtains an emotion classification result of the user session by using the fused feature data.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model.
In some embodiments, each classification result of the first predictive model is associated with at least one classification result of the second predictive model, respectively.
In some embodiments, the classification results of the first prediction model include happy, neutral, and negative; the classification results of the second prediction model include happy, neutral, anxious, angry, fearful, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and lost classification results of the second prediction model.
In some embodiments, the first predictive model and the second predictive model are character-based convolutional neural networks.
In some embodiments, the apparatus further includes a training module configured to input training data into the first prediction model and the model to be trained, respectively, so that the first prediction model outputs a classification result, where the training data includes user session features; inputting the feature data in the first prediction model into the model to be trained so that the model to be trained can fuse the feature data from the first prediction model and the feature data of the model to be trained, and outputting a classification result by using the fused feature data; and adjusting parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model to obtain the second prediction model.
In some embodiments, the training module is further configured to determine that there is a deviation between the classification result output by the model to be trained and the classification result output by the first prediction model if there is no correlation between the classification result output by the model to be trained and the classification result output by the first prediction model.
According to another aspect of one or more embodiments of the present disclosure, there is provided an emotion analysis apparatus including: a memory configured to store instructions; and a processor coupled to the memory, the processor configured to perform the method according to any of the embodiments described above based on the instructions stored in the memory.
According to another aspect of one or more embodiments of the present disclosure, there is provided a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, which when executed by a processor, implement a method as described above in relation to any one of the embodiments.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
To describe the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description show only some embodiments of the present disclosure; those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is an exemplary flow diagram of a sentiment analysis method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an emotion analysis network model according to an embodiment of the present disclosure;
FIG. 3 is an exemplary flow chart of a sentiment analysis method according to another embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an emotion analysis network model according to another embodiment of the present disclosure;
FIG. 5 is an exemplary block diagram of an emotion analysis apparatus according to an embodiment of the present disclosure;
FIG. 6 is an exemplary block diagram of an emotion analysis apparatus according to another embodiment of the present disclosure;
fig. 7 is an exemplary block diagram of an emotion analyzing apparatus according to still another embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present disclosure. The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the disclosure, its application, or its uses. All other embodiments obtained by a person skilled in the art from the embodiments disclosed herein without creative effort shall fall within the protection scope of the present disclosure.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
FIG. 1 is an exemplary flowchart of an emotion analysis method according to an embodiment of the present disclosure. In some embodiments, the method steps of the present embodiment may be performed by an emotion analysis device.
In step 101, feature extraction is performed on a user session.
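The patent does not specify how session text is turned into model input, but since the models are character-based (see below), a minimal hypothetical sketch of character-level feature extraction might look as follows. The vocabulary, maximum length, and padding scheme are illustrative assumptions, not details from the patent.

```python
# Hypothetical character-level feature extraction for a user session.
# Each character is mapped to an integer index; sequences are padded
# or truncated to a fixed length (index 0 is reserved for padding).

def extract_char_features(session, vocab, max_len=50):
    indices = [vocab.get(ch, 0) for ch in session[:max_len]]
    indices += [0] * (max_len - len(indices))  # right-pad with 0
    return indices

# Example: build a toy vocabulary from a few sessions.
sessions = ["why is my order so slow", "thank you, great service"]
chars = sorted({ch for s in sessions for ch in s})
vocab = {ch: i + 1 for i, ch in enumerate(chars)}  # 1-based; 0 = padding

features = extract_char_features(sessions[0], vocab)
print(len(features))  # 50
```

The fixed-length index vector can then be fed to both prediction models in step 102.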
In step 102, the extracted user session features are respectively input into a preset first prediction model and a preset second prediction model.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model. For example, the output of the first prediction model has three classification results and the output of the second prediction model has seven classification results. The second prediction model can thus perform a more fine-grained emotion classification by using knowledge from the first prediction model.
In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model. For example, the first prediction model is a three-class model whose classification results include happy, neutral, and negative. The second prediction model is a seven-class model whose classification results include happy, neutral, anxious, angry, fearful, sad, and lost. The happy classification result of the first prediction model is associated with the happy classification result of the second prediction model; the neutral classification result of the first prediction model is associated with the neutral classification result of the second prediction model; and the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and lost classification results of the second prediction model.
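The coarse-to-fine label association above can be sketched as a simple lookup table. This is an illustrative encoding, not code from the patent:

```python
# Association between the three-class (coarse) labels of the first
# model and the seven-class (fine) labels of the second model, as
# described in the text.
COARSE_TO_FINE = {
    "happy":    {"happy"},
    "neutral":  {"neutral"},
    "negative": {"anxious", "angry", "fearful", "sad", "lost"},
}

def is_consistent(coarse_label, fine_label):
    """True if the fine-grained label is associated with the coarse label."""
    return fine_label in COARSE_TO_FINE.get(coarse_label, set())

print(is_consistent("negative", "angry"))  # True
print(is_consistent("happy", "sad"))       # False
```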
In some embodiments, the first prediction model and the second prediction model are character-based convolutional neural networks (char-CNNs).
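As an illustration of what such a char-CNN branch computes, here is a minimal numpy sketch: one-hot character embedding, a single 1-D convolution, ReLU, and global max-pooling. The filter count, width, and random weights are assumptions for illustration only; the patent does not give the architecture's dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

def char_cnn_features(indices, vocab_size, n_filters=8, width=3):
    """Toy char-CNN feature extractor.

    indices: list of integer character indices (0 = padding).
    Returns an (n_filters,) feature vector.
    """
    one_hot = np.eye(vocab_size)[indices]          # (seq_len, vocab_size)
    filters = rng.standard_normal((n_filters, width, vocab_size)) * 0.1
    seq_len = one_hot.shape[0]
    conv = np.empty((seq_len - width + 1, n_filters))
    for t in range(seq_len - width + 1):
        window = one_hot[t:t + width]              # (width, vocab_size)
        conv[t] = np.tensordot(filters, window, axes=([1, 2], [0, 1]))
    conv = np.maximum(conv, 0.0)                   # ReLU
    return conv.max(axis=0)                        # global max-pool

feats = char_cnn_features([1, 4, 2, 3, 1, 0, 0], vocab_size=5)
print(feats.shape)  # (8,)
```

In the patent's setting, a vector like this (of each model's hidden layer) is what gets fused in step 103 below.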
In step 103, the first feature data in the first prediction model is input into the second prediction model, so that the second prediction model fuses the first feature data and the second feature data of the second prediction model, and the emotion classification result of the user session is obtained by using the fused feature data.
In some embodiments, the first feature data from the first prediction model and the second feature data in the second prediction model may be concatenated in corresponding hidden layers of the second prediction model. For example, if the first feature data is 200-dimensional and the second feature data is also 200-dimensional, concatenating them yields 400-dimensional data, which continues to be processed in the second prediction model to obtain the emotion classification result of the user session.
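The fusion step in that example reduces to a vector concatenation; the dimensions below follow the text, and the random values stand in for the models' real hidden activations:

```python
import numpy as np

rng = np.random.default_rng(0)

first_features = rng.random(200)   # hidden features from the first model
second_features = rng.random(200)  # hidden features from the second model

# Fuse by concatenation: 200 + 200 -> 400 dimensions, then the second
# model's remaining layers process the fused vector.
fused = np.concatenate([first_features, second_features])
print(fused.shape)  # (400,)
```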
In the emotion analysis method provided by the above embodiment of the present disclosure, feature data of the first prediction model is input into the second prediction model through transfer learning, so that knowledge in the first prediction model is transferred into the second prediction model, allowing the second prediction model to obtain a better classification result.
FIG. 2 is a schematic diagram of an emotion analysis network model according to an embodiment of the present disclosure. As shown in fig. 2, the user session features are input into the preset first prediction model and second prediction model, respectively. The first prediction model is a three-class model trained on a large amount of labeled data. The second prediction model is a seven-class model with a smaller amount of labeled data. Each classification result of the first prediction model is associated with at least one classification result of the second prediction model. The first feature data in the first prediction model is input into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains the emotion classification result of the user session from the fused feature data.
The results of emotion analysis on example user sessions using the above embodiment are shown in Table 1.
Session ID | Emotion category | User session
1          | Neutral          | Is the pacifier of the parent have cross holes?
2          | Lost             | The watch has a defect and cannot be found when people want to find the watch
3          | Neutral          | I just bought the power supply
4          | Angry            | This slow!
5          | Neutral          | IPONE5S without 32G memory
6          | Happy            | Da ai
7          | Neutral          | How long the trousers of S size are
8          | Sad              | Due to deficiency of
9          | Sad              | The answer is not coming
10         | Neutral          | Timing shutdown woolen cloth
11         | Neutral          | With or without power increase
12         | Angry            | I order a good or bad shipment

Table 1
FIG. 3 is an exemplary flowchart of a sentiment analysis method according to another embodiment of the present disclosure. In some embodiments, the method steps of the present embodiment may be performed by an emotion analysis device.
In step 301, training data is respectively input into the first prediction model and the model to be trained, so that the first prediction model outputs a classification result, wherein the training data includes user session features.
In step 302, the feature data in the first prediction model is input into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model and the feature data of the model to be trained, and a classification result is output by using the fused feature data.
In step 303, the parameters of the model to be trained are adjusted according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain a second prediction model.
In some embodiments, if there is no correlation between the classification result output by the model to be trained and the classification result output by the first prediction model, it is determined that there is a deviation between the classification result output by the model to be trained and the classification result output by the first prediction model.
For example, if the first prediction model outputs the user emotion happy while the model to be trained outputs angry, there is a deviation between the classification results output by the model to be trained and the first prediction model.
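A minimal sketch of this deviation check, assuming the label association given earlier in the text (happy→happy, neutral→neutral, negative→{anxious, angry, fearful, sad, lost}); the function name and structure are illustrative, not from the patent:

```python
# The to-be-trained model's fine-grained label "deviates" when it is
# not associated with the coarse label output by the first model.
COARSE_TO_FINE = {
    "happy":    {"happy"},
    "neutral":  {"neutral"},
    "negative": {"anxious", "angry", "fearful", "sad", "lost"},
}

def has_deviation(teacher_coarse, student_fine):
    return student_fine not in COARSE_TO_FINE.get(teacher_coarse, set())

# First model says "happy" but the model to be trained predicts
# "angry": a deviation, so a training step would adjust parameters.
print(has_deviation("happy", "angry"))    # True
print(has_deviation("negative", "sad"))   # False
```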
FIG. 4 is a schematic diagram of an emotion analysis network model according to another embodiment of the present disclosure. Training data is input into the first prediction model and the model to be trained, respectively, so that the first prediction model outputs a classification result. The first prediction model is a three-class model trained on a large amount of labeled data; the second prediction model is a seven-class model with a smaller amount of labeled data. The feature data in the first prediction model is input into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data. The parameters of the model to be trained are then adjusted according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
Fig. 5 is an exemplary block diagram of an emotion analysis apparatus according to an embodiment of the present disclosure. As shown in fig. 5, the emotion analyzing apparatus includes a feature extracting module 51, a feature inputting module 52, and a transition learning module 53.
The feature extraction module 51 is configured to perform feature extraction on the user session.
The feature input module 52 is configured to input the extracted user session features into a preset first prediction model and a preset second prediction model, respectively.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model. In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model. For example, the classification results of the first prediction model include happy, neutral, and negative, and the classification results of the second prediction model include happy, neutral, anxious, angry, fearful, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and lost classification results of the second prediction model.
In some embodiments, the first predictive model and the second predictive model are character-based convolutional neural networks.
The transfer learning module 53 is configured to input the first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains the emotion classification result of the user session from the fused feature data.
Fig. 6 is an exemplary block diagram of an emotion analysis apparatus according to another embodiment of the present disclosure. FIG. 6 differs from FIG. 5 in that, in the embodiment shown in FIG. 6, the emotion analysis apparatus further includes a training module 54.
The training module 54 is configured to input training data into the first predictive model and the model to be trained, respectively, so that the first predictive model outputs the classification result, wherein the training data includes user session features. The training module 54 inputs the feature data in the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model and the feature data of the model to be trained, and outputs a classification result by using the fused feature data. The training module 54 adjusts parameters of the model to be trained according to a deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain a second prediction model.
In some embodiments, the training module 54 is further configured to determine that there is a deviation between the classification result output by the model to be trained and the classification result output by the first prediction model if there is no correlation between the classification result output by the model to be trained and the classification result output by the first prediction model.
Fig. 7 is an exemplary block diagram of an emotion analyzing apparatus according to still another embodiment of the present disclosure. As shown in fig. 7, the emotion analyzing apparatus includes a memory 71 and a processor 72.
The memory 71 is used for storing instructions, the processor 72 is coupled to the memory 71, and the processor 72 is configured to execute the method according to any one of the embodiments in fig. 1 or fig. 3 based on the instructions stored in the memory.
As shown in FIG. 7, the emotion analyzing apparatus further includes a communication interface 73 for information interaction with other devices. Meanwhile, the device also comprises a bus 74, and the processor 72, the communication interface 73 and the memory 71 are communicated with each other through the bus 74.
The memory 71 may comprise high-speed RAM and may also include non-volatile memory, such as at least one disk storage device. The memory 71 may also be a memory array. The memory 71 may also be partitioned into blocks, and the blocks may be combined into virtual volumes according to certain rules.
Further, the processor 72 may be a central processing unit CPU, or may be an application specific integrated circuit ASIC, or one or more integrated circuits configured to implement embodiments of the present disclosure.
The present disclosure also relates to a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, and the instructions, when executed by a processor, implement a method according to any one of the embodiments shown in fig. 1 or fig. 3.
Table 2 compares the scheme proposed in the above embodiments of the present disclosure with the prior art. The first test performs seven-class classification using a conventional char-CNN model; the second test performs seven-class classification using the transfer learning of the present disclosure.
Experiment | Precision | Recall | F1 score
Test 1     | 69.44%    | 63.22% | 66%
Test 2     | 76.52%    | 60.54% | 67.54%

Table 2
As can be seen from Table 2, the scheme provided by the present disclosure effectively improves the precision of user emotion classification, and the overall F1 score is also improved.
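The F1 scores in Table 2 can be sanity-checked: F1 is the harmonic mean of precision and recall, F1 = 2PR / (P + R), and recomputing it from the reported precision and recall reproduces the reported values up to rounding (which also supports reading the table's first metric as precision rather than accuracy):

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.6944, 0.6322), 3))  # 0.662 (reported as 66%)
print(round(f1(0.7652, 0.6054), 3))  # 0.676 (reported as 67.54%)
```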
In some embodiments, the functional unit modules described above can be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof for performing the functions described in this disclosure.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The description of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the disclosure to the form disclosed. Many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, and to enable others of ordinary skill in the art to understand the disclosure and its various embodiments with the various modifications suited to the particular use contemplated.

Claims (12)

1. An emotion analysis method comprising:
carrying out feature extraction on the user session;
inputting the extracted user session features into a preset first prediction model and a preset second prediction model respectively;
inputting the first characteristic data in the first prediction model into the second prediction model so that the second prediction model fuses the first characteristic data and the second characteristic data of the second prediction model, and obtaining an emotion classification result of the user session by using the fused characteristic data;
wherein the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model, and each classification result of the first prediction model is associated with at least one classification result of the second prediction model.
2. The method of claim 1, wherein,
the classification results of the first prediction model comprise happy, neutral, and negative;
the classification results of the second prediction model comprise happy, neutral, anxious, angry, fearful, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and lost classification results of the second prediction model.
3. The method of claim 1, wherein,
the first predictive model and the second predictive model are character-based convolutional neural networks.
4. The method of any one of claims 1-3,
respectively inputting training data into the first prediction model and the model to be trained so that the first prediction model outputs a classification result, wherein the training data comprises user session characteristics;
inputting the feature data in the first prediction model into the model to be trained so that the model to be trained can fuse the feature data from the first prediction model and the feature data of the model to be trained, and outputting a classification result by using the fused feature data;
and adjusting parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model to obtain the second prediction model.
5. The method of claim 4, wherein,
and if no association relationship exists between the classification result output by the model to be trained and the classification result output by the first prediction model, determining that a deviation exists between the classification result output by the model to be trained and the classification result output by the first prediction model.
6. An emotion analysis apparatus comprising:
a feature extraction module configured to perform feature extraction on the user session;
a feature input module configured to input the extracted user session features into a preset first prediction model and a preset second prediction model respectively, wherein the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model, and each classification result of the first prediction model is associated with at least one classification result of the second prediction model;
and a transfer learning module configured to input first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains an emotion classification result for the user session using the fused feature data.
7. The apparatus of claim 6, wherein,
the classification results of the first prediction model comprise happy, neutral, and negative;
the classification results of the second prediction model comprise happy, neutral, anxious, angry, fearful, sad, and depressed, wherein the negative classification result of the first prediction model is associated with the anxious, angry, fearful, sad, and depressed classification results of the second prediction model.
8. The apparatus of claim 6, wherein,
the first predictive model and the second predictive model are character-based convolutional neural networks.
9. The apparatus of any of claims 6-8, further comprising:
a training module configured to: input training data into the first prediction model and a model to be trained respectively, so that the first prediction model outputs a classification result, wherein the training data comprise user session features; input feature data from the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data; and adjust parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, to obtain the second prediction model.
10. The apparatus of claim 9, wherein,
the training module is further configured to determine that a deviation exists between the classification result output by the model to be trained and the classification result output by the first prediction model if no association relation exists between the two classification results.
11. An emotion analysis apparatus comprising:
a memory configured to store instructions;
a processor coupled to the memory, the processor configured to perform the method of any one of claims 1-5 based on instructions stored in the memory.
12. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement the method of any one of claims 1-5.
CN201811037201.3A 2018-09-06 2018-09-06 Emotion analysis method and device Active CN109033089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811037201.3A CN109033089B (en) 2018-09-06 2018-09-06 Emotion analysis method and device


Publications (2)

Publication Number Publication Date
CN109033089A CN109033089A (en) 2018-12-18
CN109033089B true CN109033089B (en) 2021-01-26

Family

ID=64623779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811037201.3A Active CN109033089B (en) 2018-09-06 2018-09-06 Emotion analysis method and device

Country Status (1)

Country Link
CN (1) CN109033089B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829166B (en) * 2019-02-15 2022-12-27 重庆师范大学 Subjective and objective opinion mining method based on a character-level convolutional neural network
CN110378726A (en) * 2019-07-02 2019-10-25 阿里巴巴集团控股有限公司 Target user recommendation method, system and electronic device
CN112465588A (en) * 2020-05-10 2021-03-09 石伟 Image interaction information processing method, system and platform based on e-commerce live broadcast

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107247702A (en) * 2017-05-05 2017-10-13 桂林电子科技大学 Text sentiment analysis and processing method and system
CN107590134A (en) * 2017-10-26 2018-01-16 福建亿榕信息技术有限公司 Text sentiment classification method, storage medium and computer
CN107609009A (en) * 2017-07-26 2018-01-19 北京大学深圳研究院 Text emotion analysis method, device, storage medium and computer equipment
CN108108355A (en) * 2017-12-25 2018-06-01 北京牡丹电子集团有限责任公司数字电视技术中心 Text emotion analysis method and system based on deep learning
CN108460415A (en) * 2018-02-28 2018-08-28 国信优易数据有限公司 Pseudo-label generation model training method and pseudo-label generation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11205103B2 (en) * 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis



Similar Documents

Publication Publication Date Title
CN106649603B (en) Designated information pushing method based on emotion classification of webpage text data
CN109492101B (en) Text classification method, system and medium based on label information and text characteristics
CN107066446B (en) Logic rule embedded cyclic neural network text emotion analysis method
CN109033089B (en) Emotion analysis method and device
CN109376251A Microblog Chinese sentiment dictionary construction method based on a word vector learning model
CN112270196B (en) Entity relationship identification method and device and electronic equipment
CN103577989B Information classification method and system based on product identification
CN111666761B (en) Fine-grained emotion analysis model training method and device
CN107944911A Recommendation method for a recommender system based on text analysis
US11630957B2 (en) Natural language processing method and apparatus
CN107273348B (en) Topic and emotion combined detection method and device for text
CN104850617B (en) Short text processing method and processing device
KR20120109943A (en) Emotion classification method for analysis of emotion immanent in sentence
CN109034203A Expression recommendation model training, expression recommendation method, device, equipment and medium
CN113361258A (en) Aspect-level emotion analysis method and system based on graph convolution network and attention selection
CN107122492A (en) Lyric generation method and device based on picture content
CN104102662B (en) A kind of user interest preference similarity determines method and device
AU2014253880A1 (en) Identification of points in a user web journey where the user is more likely to accept an offer for interactive assistance
CN110827797B (en) Voice response event classification processing method and device
Zhang et al. An agreement and sparseness-based learning instance selection and its application to subjective speech phenomena
CN112069315A (en) Method, device, server and storage medium for extracting text multidimensional information
CN106776557B (en) Emotional state memory identification method and device of emotional robot
CN111310014A (en) Scenic spot public opinion monitoring system, method, device and storage medium based on deep learning
CN106933798A Information analysis method and device
WO2020199590A1 (en) Mood detection analysis method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant