CN106776557B - Emotional state memory identification method and device of emotional robot

Emotional state memory identification method and device of emotional robot

Info

Publication number
CN106776557B
Authority
CN
China
Prior art keywords
emotional state
user
information
emotional
confidence score
Prior art date
Legal status
Active
Application number
CN201611143890.7A
Other languages
Chinese (zh)
Other versions
CN106776557A (en)
Inventor
简仁贤
叶俊杰
Current Assignee
Emotibot Technologies Ltd
Original Assignee
Emotibot Technologies Ltd
Priority date
Filing date
Publication date
Application filed by Emotibot Technologies Ltd filed Critical Emotibot Technologies Ltd
Priority to CN201611143890.7A
Publication of CN106776557A
Application granted
Publication of CN106776557B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G06F40/20 Natural language analysis
    • G06F40/279 Recognition of textual entities
    • G06F40/284 Lexical analysis, e.g. tokenisation or collocates

Abstract

The invention belongs to the technical field of intelligent recognition and provides an emotional state recognition method and device for an emotional robot. The method comprises the following steps: extracting sentence information from a sentence input by a user; acquiring user information of the user from a memory map; and inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user. By adding the user information from the memory map to the emotion recognition process, the emotional state memory identification method and device improve the accuracy of recognizing the user's emotional state.

Description

Emotional state memory identification method and device of emotional robot
Technical Field
The invention relates to the technical field of intelligent recognition, and in particular to an emotional state memory recognition method and device for an emotional robot.
Background
Existing emotion recognition technology relies mainly on keyword capture or relatively general-purpose machine learning models. The main problems of such emotion recognition models are: (1) judging by keywords alone easily produces misjudgments, because the whole sentence and its context are not correctly understood, so accuracy is low; (2) a general-purpose machine learning model improves accuracy somewhat, but its input is limited to a single source of information, and recognizing the user's emotional state only from the interaction between the robot and the user greatly constrains the model's accuracy.
Disclosure of Invention
To address these defects in the prior art, the invention provides an emotional state memory recognition method and device for an emotional robot, in which user information from a memory map is added to the emotion recognition process, improving the accuracy of recognizing the user's emotional state.
In a first aspect, the invention provides an emotional state recognition method for an emotional robot, including: extracting sentence information from a sentence input by a user; acquiring user information of the user from a memory map; and inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user.
The emotional state memory recognition method provided by the invention automatically recognizes the user's emotional state from what the user expresses in natural conversation, without manual setting by the user or repeated querying by the system. Because the user's other personal information assists the judgment, the same expression can yield different recognition results in different situations, which improves the accuracy of emotion recognition. Additionally, the recognized emotional state can help generate a reply suited to the user's state at that moment.
Preferably, the method further comprises extracting personal information of the user from the input sentence and adding the personal information to the memory map.
Preferably, inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user includes: inputting the sentence information and the user information into a deep learning model to obtain the emotional state of the user.
Preferably, inputting the sentence information and the user information into a deep learning model to obtain the emotional state of the user includes: inputting the sentence information into a rule model, extracting keywords, and obtaining a first emotional state and a first confidence score of the user from the keywords; inputting the sentence information and the user information into a deep learning model to obtain a second emotional state and a second confidence score of the user; and selecting one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score.
Preferably, selecting one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score includes: if the first confidence score is greater than a threshold, taking the first emotional state as the emotional state of the user; and if the first confidence score is less than or equal to the threshold, dynamically ranking the first emotional state and the second emotional state, and selecting one of them as the emotional state of the user according to the ranking result.
In a second aspect, the invention provides an emotional state recognition device for an emotional robot, including: a sentence information extraction module for extracting sentence information from a sentence input by a user; a user information extraction module for acquiring user information of the user from a memory map; and an emotional state recognition module for inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user.
The emotional state memory recognition device provided by the invention automatically recognizes the user's emotional state from what the user expresses in natural conversation, without manual setting by the user or repeated querying by the system. Because the user's other personal information assists the judgment, the same expression can yield different recognition results in different situations, which improves the accuracy of emotion recognition. Additionally, the recognized emotional state can help generate a reply suited to the user's state at that moment.
Preferably, the device further comprises a user information updating module configured to extract personal information of the user from the input sentence and add the personal information to the memory map.
Preferably, the emotional state recognition module is specifically configured to input the sentence information and the user information into a deep learning model to obtain the emotional state of the user.
Preferably, the emotional state recognition module is specifically configured to: input the sentence information into a rule model, extract keywords, and obtain a first emotional state and a first confidence score of the user from the keywords; input the sentence information and the user information into a deep learning model to obtain a second emotional state and a second confidence score of the user; and select one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score.
Preferably, the emotional state recognition module is specifically configured to: if the first confidence score is greater than a threshold, take the first emotional state as the emotional state of the user; and if the first confidence score is less than or equal to the threshold, dynamically rank the first emotional state and the second emotional state, and select one of them as the emotional state of the user according to the ranking result.
Drawings
FIG. 1 is a flowchart of an emotional state memory identification method of an emotional robot according to an embodiment of the present invention;
FIG. 2 is a block diagram of an emotional state memory recognition device of an emotional robot according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The following examples serve only to illustrate the technical solutions of the invention more clearly; they are examples and do not limit the protection scope of the invention.
It should be noted that, unless otherwise specified, technical or scientific terms used herein have the ordinary meaning understood by those skilled in the art to which the invention pertains.
As shown in FIG. 1, the emotional state memory recognition method of the emotional robot provided by this embodiment includes:
in step S1, sentence information is extracted from the sentence input by the user.
Here, the sentence information includes: the Chinese word segmentation of the sentence, part-of-speech tags for the segmented words, sentence-pattern information, sentence2vector information of the sentence, and the like. The sentence information can be extracted in various ways: the word segmentation and part-of-speech tagging are mainly obtained by calling open-source libraries (provided by third parties), while the sentence-pattern and sentence2vector information are mainly produced by other modules of the system or by self-written preprocessing modules. Since these extraction methods can be realized with existing techniques, they are not described further here.
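For illustration only, a minimal Python sketch of this step follows. It assumes the open-source jieba library as the third-party segmenter; classify_pattern and sentence2vector are simplified stand-ins for the system modules the patent leaves unspecified, not the actual implementation.

```python
# Sketch of step S1: extracting sentence information. jieba is one example of
# an open-source segmentation/POS library; classify_pattern and
# sentence2vector are simplified placeholder modules (assumptions).
import zlib
import jieba.posseg as pseg

def classify_pattern(sentence: str) -> str:
    # crude sentence-pattern heuristic: question / exclamation / declarative
    if sentence.endswith(("?", "？")):
        return "question"
    if sentence.endswith(("!", "！")):
        return "exclamation"
    return "declarative"

def sentence2vector(words, dim=64):
    # toy deterministic hashed bag-of-words, standing in for a real encoder
    vec = [0.0] * dim
    for w in words:
        vec[zlib.crc32(w.encode("utf-8")) % dim] += 1.0
    return vec

def extract_sentence_info(sentence: str) -> dict:
    pairs = list(pseg.cut(sentence))   # word segmentation + POS tagging
    words = [p.word for p in pairs]
    return {
        "words": words,
        "pos": [p.flag for p in pairs],
        "pattern": classify_pattern(sentence),
        "vector": sentence2vector(words),
    }
```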
In step S2, user information of the user is obtained from the memory map.
The user information includes: the user's name, gender, birthday, age, constellation (zodiac sign), psychological state, physiological state, and the like.
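For concreteness, the per-user record fetched from the memory map might look like the following sketch; the field names simply mirror the attributes listed above and are not prescribed by the patent.

```python
# Sketch of the user-information record retrieved in step S2;
# field names are illustrative only.
from dataclasses import dataclass

@dataclass
class UserInfo:
    name: str = ""
    gender: str = ""
    birthday: str = ""
    age: int = 0
    constellation: str = ""        # zodiac sign
    psychological_state: str = ""
    physiological_state: str = ""
```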
In step S3, the sentence information and the user information are input into the emotional state recognition model to obtain the emotional state of the user.
With the obtained emotional state of the user, the robot can screen candidate reply sentences and filter out or adjust unsuitable replies. For example, suppose a user says: "Song Joong-ki is my husband!" If the user's status were identified as "married" by keyword capture alone, the resulting reply would likely be a marriage-related sentence such as "Is he good to you?". However, if more comprehensive user information is considered, such as the user's gender ("female"), age ("20"), and interests ("Korean stars"), it can be judged with high confidence that the user is merely expressing admiration rather than stating a marital status, and that at age 20 the user is more likely in a "single/in love" state than a "married/divorced" state.
The emotional state memory recognition method provided by this embodiment automatically recognizes the user's emotional state from what the user expresses in natural conversation, without manual setting by the user or repeated querying by the system. Because the user's other personal information assists the judgment, the same expression can yield different recognition results in different situations, which improves the accuracy of emotion recognition. Additionally, the recognized emotional state can help generate a reply suited to the user's state at that moment. Comparable models realized with the prior art generally achieve only about 50%-60% accuracy, whereas the method provided by this embodiment achieves about 80%.
The more complete the user information in the memory map, the higher the accuracy of emotion recognition. The method provided by this embodiment therefore further includes: extracting personal information of the user from the input sentence and adding it to the memory map. Rule-based, machine learning, and deep learning techniques may be used in combination to extract this personal information, which includes the user's basic information, living habits, and the like. Over the course of continuous conversation with the user, the robot refines the memory map by extracting personal information from the dialogue, as in the sketch below.
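The rule-based part of this extraction can be sketched as follows; the two regular expressions are invented examples, not the patent's actual rule set, and a real system would combine them with learned extractors as described above.

```python
# Sketch of rule-based personal-information extraction from a user sentence.
# The patterns are illustrative assumptions.
import re

AGE_PATTERN = re.compile(r"我今年?(\d{1,3})岁")   # "I am ... years old"
NAME_PATTERN = re.compile(r"我叫(\S{1,4})")       # "My name is ..."

def extract_personal_info(sentence: str) -> dict:
    info = {}
    if m := AGE_PATTERN.search(sentence):
        info["age"] = int(m.group(1))
    if m := NAME_PATTERN.search(sentence):
        info["name"] = m.group(1)
    return info
```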
In addition, the emotional state recognized by the emotional state recognition model can itself be stored in the memory map, further enriching the user information and improving the accuracy of emotion recognition.
The memory map comprises a plurality of memory modules that store different types of user information: for example, one module stores the user's basic information (name, gender, age, birthday, constellation, and so on), another stores the user's living habits (eating habits, daily schedule, and so on), and another stores the user's emotional states. Together these modules build up a complete picture of the user and can cross-reference one another.
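One plausible arrangement of these memory modules is sketched below, under the assumption that each module is a simple keyed store; the module names and layout are illustrative, not part of the claimed invention.

```python
# Sketch of a memory map holding several memory modules; names and storage
# layout are assumptions consistent with the categories listed above.
class MemoryMap:
    def __init__(self):
        self.basic_info = {}        # name, gender, age, birthday, constellation
        self.habits = {}            # eating habits, daily schedule, ...
        self.emotion_history = []   # emotional states recognized over time

    def update_basic_info(self, extracted: dict):
        self.basic_info.update(extracted)

    def record_emotion(self, state: str):
        self.emotion_history.append(state)

    def get_user_info(self) -> dict:
        # merged view handed to the emotional state recognition model
        return {**self.basic_info, **self.habits,
                "recent_emotions": self.emotion_history[-5:]}
```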
The emotion recognition model is realized with one or more of rule-based, machine learning, and deep learning algorithms, and recognizes the user's emotional state through natural dialogue with the user. "Rule-based" here refers to recognition and extraction using matching rules prepared in advance, such as keyword tables or regular expressions.
When a single algorithm is used, taking deep learning as an example, step S3 is implemented as: inputting the sentence information and the user information into the deep learning model to obtain the emotional state of the user.
When multiple algorithms are combined, taking rules plus deep learning as an example, step S3 is implemented by the following steps:
step S31, inputting the statement information into the rule model, extracting keywords, and obtaining a first emotional state and a first confidence score of the user according to the keywords.
The rules in the rule model evolve continuously and preferably take the form of a keyword table or regular expressions, as in the sketch below.
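A minimal sketch of such a rule model follows; the keyword table and the confidence scores attached to each entry are illustrative assumptions.

```python
# Sketch of step S31: a keyword-table rule model producing the first
# emotional state and first confidence score. Entries and scores are
# illustrative assumptions.
EMOTION_KEYWORDS = {
    "开心": ("joy", 0.90),      # "happy"
    "难过": ("sadness", 0.90),  # "sad"
    "生气": ("anger", 0.85),    # "angry"
}

def rule_model(words):
    hits = [w for w in words if w in EMOTION_KEYWORDS]
    if not hits:
        return "neutral", 0.0, hits
    state, score = EMOTION_KEYWORDS[hits[0]]
    return state, score, hits   # first state, first confidence, keyword hits
```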
In step S32, the sentence information and the user information are input into the deep learning model to obtain a second emotional state and a second confidence score of the user. The keywords extracted by the rule model can also serve as additional input to the deep learning model, further improving the accuracy of recognizing the user's emotional state.
In step S33, one of the first emotional state and the second emotional state is selected as the emotional state of the user according to the first confidence score and the second confidence score.
Using multiple models for emotion recognition simultaneously, and screening their results strategically, further improves the accuracy of emotion recognition.
The deep learning model is realized on a deep neural network architecture that mixes several network types, including deep feedforward networks (DNN), recurrent networks (RNN), convolutional networks (CNN), LSTM, and Memory Networks, and is trained with backpropagation (BP), backpropagation through time (BPTT), and related algorithms adapted to the chosen architecture. Training uses pre-collected test samples, each comprising a user's chat sentence and user information labeled with an emotional state. The chat sentence and user information are input into the deep learning model to obtain a predicted emotional state; the model's parameters are adjusted according to the difference between the prediction and the labeled emotional state; and training continues over a large number of samples until a deep learning model meeting the requirements is obtained.
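The sketch below illustrates one such configuration in PyTorch: a small feedforward network over the concatenated sentence vector and user-information features, trained by backpropagation against labeled emotional states. The architecture, dimensions, and feature encoding are assumptions; the patent equally allows RNN/LSTM/CNN/Memory Network variants.

```python
# Hedged PyTorch sketch of the deep learning recognizer; the architecture and
# dimensions are assumptions, one of many configurations the patent allows.
import torch
import torch.nn as nn

class EmotionNet(nn.Module):
    def __init__(self, sent_dim=64, user_dim=8, n_emotions=7):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(sent_dim + user_dim, 128),
            nn.ReLU(),
            nn.Linear(128, n_emotions),
        )

    def forward(self, sent_vec, user_vec):
        return self.net(torch.cat([sent_vec, user_vec], dim=-1))

model = EmotionNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(sent_vec, user_vec, label):
    # one backpropagation update against a labeled test sample;
    # sent_vec: (batch, sent_dim), user_vec: (batch, user_dim), label: (batch,)
    optimizer.zero_grad()
    loss = loss_fn(model(sent_vec, user_vec), label)
    loss.backward()
    optimizer.step()
    return loss.item()

def predict(sent_vec, user_vec):
    # second emotional state and second confidence score via softmax
    with torch.no_grad():
        probs = model(sent_vec, user_vec).softmax(dim=-1)
    score, idx = probs.max(dim=-1)
    return idx, score
```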
Further, step S33 is specifically: if the first confidence score is greater than the threshold, take the first emotional state as the emotional state of the user; if the first confidence score is less than or equal to the threshold, dynamically rank the first and second emotional states and select one of them as the emotional state of the user according to the ranking result.
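Expressed in code, this selection strategy reduces to a few lines; the THRESHOLD value is an assumption, and dynamic_rank stands for the ranking model described next.

```python
# Sketch of step S33; THRESHOLD is an assumed value and rank_fn is the
# dynamic-ranking model described below.
THRESHOLD = 0.8

def select_emotion(first, second, rank_fn):
    # first/second are (emotional_state, confidence_score) pairs
    state1, score1 = first
    if score1 > THRESHOLD:
        return state1                # rule result wins outright
    return rank_fn(first, second)    # otherwise defer to dynamic ranking
```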
Dynamic ranking involves many parameters, including: text length, the extracted keywords, the user's preceding input text, the confidence scores of the first/second emotional states, and so on. These parameters are fed into the dynamic ranking model, which influences the ranking result by assigning them different weights. Parameter selection and weight adjustment can be tuned against overall model performance.
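A hedged sketch of such a ranking model follows; the linear weighting and the weights themselves are invented for illustration and would in practice be tuned as described above.

```python
# Sketch of dynamic ranking as a weighted score over two of the listed
# parameters; weights and parameter choice are illustrative assumptions.
def dynamic_rank(first, second, text_len=0, n_keywords=0,
                 w_kw=0.10, w_len=0.01):
    state1, score1 = first      # rule-model candidate
    state2, score2 = second     # deep-model candidate
    # keyword hits argue for the rule result; longer, context-rich input
    # argues for the deep model that sees the whole sentence and user info
    total1 = score1 + w_kw * n_keywords
    total2 = score2 + w_len * min(text_len, 50)
    return state1 if total1 >= total2 else state2
```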
Based on the same inventive concept as the emotional state recognition method, this embodiment provides an emotional state recognition device for an emotional robot, as shown in FIG. 2, including: a sentence information extraction module 101 for extracting sentence information from a sentence input by a user; a user information extraction module 102 for acquiring user information of the user from a memory map; and an emotional state recognition module 103 for inputting the sentence information and the user information into the emotional state recognition model to obtain the emotional state of the user.
The emotional state memory recognition device of the emotional robot provided by the embodiment automatically recognizes the emotional state of a user according to the expression of the user in a natural conversation with the user; manual setting by a user or repeated inquiry of the system is not required; the emotion state is judged in an assisting manner by combining other personal information of the user, so that different identification results can be obtained under different situations by the same expression, and the accuracy of emotion identification is improved; additionally, the identified emotional state may help generate a reply that suits the user's state at that time.
Further, the device also comprises a user information updating module for extracting personal information of the user from the input sentence and adding it to the memory map.
The emotional state recognition module 103 is specifically configured to input the sentence information and the user information into the deep learning model to obtain the emotional state of the user.
Further, the emotional state recognition module 103 is specifically configured to: input the sentence information into a rule model, extract keywords, and obtain a first emotional state and a first confidence score of the user from the keywords; input the sentence information and the user information into the deep learning model to obtain a second emotional state and a second confidence score of the user; and select one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score.
Furthermore, the emotional state recognition module 103 is specifically configured to: if the first confidence score is greater than the threshold, take the first emotional state as the emotional state of the user; and if the first confidence score is less than or equal to the threshold, dynamically rank the first and second emotional states and select one of them as the emotional state of the user according to the ranking result.
Finally, it should be noted that the above embodiments are intended only to illustrate, not limit, the technical solutions of the invention. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described therein may still be modified, and some or all of their technical features equivalently replaced, without departing from the spirit and scope of the invention; such modifications and substitutions are intended to fall within the scope of the claims and description.

Claims (6)

1. An emotional state recognition method for an emotional robot, comprising:
extracting sentence information from a sentence input by a user;
acquiring user information of the user from a memory map;
inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user;
wherein inputting the sentence information and the user information into the emotional state recognition model to obtain the emotional state of the user comprises: inputting the sentence information into a rule model, extracting keywords, and obtaining a first emotional state and a first confidence score of the user from the keywords;
inputting the sentence information and the user information into a deep learning model to obtain a second emotional state and a second confidence score of the user;
and selecting one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score;
wherein selecting one of the first emotional state and the second emotional state according to the first confidence score and the second confidence score comprises:
if the first confidence score is greater than a threshold, taking the first emotional state as the emotional state of the user;
and if the first confidence score is less than or equal to the threshold, dynamically ranking the first emotional state and the second emotional state, and selecting one of them as the emotional state of the user according to the ranking result.
2. The method of claim 1, further comprising extracting personal information of the user from the input sentence and adding the personal information to the memory map.
3. The method of claim 1 or 2, wherein inputting the sentence information and the user information into the emotional state recognition model to obtain the emotional state of the user comprises:
inputting the sentence information and the user information into a deep learning model to obtain the emotional state of the user.
4. An emotional state recognition device for an emotional robot, comprising:
a sentence information extraction module for extracting sentence information from a sentence input by a user;
a user information extraction module for acquiring user information of the user from a memory map;
and an emotional state recognition module for inputting the sentence information and the user information into an emotional state recognition model to obtain the emotional state of the user;
wherein the emotional state recognition module is specifically configured to:
input the sentence information into a rule model, extract keywords, and obtain a first emotional state and a first confidence score of the user from the keywords;
input the sentence information and the user information into a deep learning model to obtain a second emotional state and a second confidence score of the user;
and select one of the first emotional state and the second emotional state as the emotional state of the user according to the first confidence score and the second confidence score;
and is further configured to:
if the first confidence score is greater than a threshold, take the first emotional state as the emotional state of the user;
and if the first confidence score is less than or equal to the threshold, dynamically rank the first emotional state and the second emotional state, and select one of them as the emotional state of the user according to the ranking result.
5. The apparatus of claim 4, further comprising a user information updating module for extracting personal information of the user from the input sentence and adding the personal information to the memory map.
6. The apparatus of claim 4 or 5, wherein the emotional state recognition module is specifically configured to input the sentence information and the user information into a deep learning model to obtain the emotional state of the user.
CN201611143890.7A 2016-12-13 2016-12-13 Emotional state memory identification method and device of emotional robot Active CN106776557B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611143890.7A CN106776557B (en) 2016-12-13 2016-12-13 Emotional state memory identification method and device of emotional robot


Publications (2)

Publication Number Publication Date
CN106776557A CN106776557A (en) 2017-05-31
CN106776557B (en) 2020-09-08

Family

ID=58876208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611143890.7A Active CN106776557B (en) 2016-12-13 2016-12-13 Emotional state memory identification method and device of emotional robot

Country Status (1)

Country Link
CN (1) CN106776557B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110085211B (en) * 2018-01-26 2021-06-29 上海智臻智能网络科技股份有限公司 Voice recognition interaction method and device, computer equipment and storage medium
CN108960402A (en) * 2018-06-11 2018-12-07 上海乐言信息科技有限公司 A kind of mixed strategy formula emotion towards chat robots pacifies system
CN108984522B (en) * 2018-06-21 2022-12-23 北京亿家老小科技有限公司 Intelligent nursing system
CN108960403B (en) * 2018-07-04 2023-07-04 腾讯科技(深圳)有限公司 Emotion determination method, computer-readable storage medium, and computer device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045857A (en) * 2015-07-09 2015-11-11 中国科学院计算技术研究所 Social network rumor recognition method and system
CN105183848A (en) * 2015-09-07 2015-12-23 百度在线网络技术(北京)有限公司 Human-computer chatting method and device based on artificial intelligence
CN105260745A (en) * 2015-09-30 2016-01-20 西安沧海网络科技有限公司 Information push service system capable of carrying out emotion recognition and prediction based on big data
CN105930503A (en) * 2016-05-09 2016-09-07 清华大学 Combination feature vector and deep learning based sentiment classification method and device
CN106126502A (en) * 2016-07-07 2016-11-16 四川长虹电器股份有限公司 A kind of emotional semantic classification system and method based on support vector machine


Also Published As

Publication number Publication date
CN106776557A (en) 2017-05-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant