CN111611384A - Language emotion perception response method for robot


Info

Publication number
CN111611384A
CN111611384A
Authority
CN
China
Prior art keywords
emotion
language
information
module
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010455511.8A
Other languages
Chinese (zh)
Inventor
刘则
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Weika Technology Co ltd
Original Assignee
Tianjin Weika Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Weika Technology Co ltd filed Critical Tianjin Weika Technology Co ltd
Priority to CN202010455511.8A
Publication of CN111611384A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/35 Clustering; Classification
    • G06F 16/33 Querying
    • G06F 16/332 Query formulation
    • G06F 16/3329 Natural language query formulation or dialogue systems
    • G06F 16/3331 Query processing
    • G06F 16/334 Query execution
    • G06F 16/3344 Query execution using natural language analysis

Abstract

The invention provides a language emotion perception response method for a robot, which comprises the following steps: S1, the human-computer interaction module transmits the acquired information to the controller; S2, the controller transmits the collected information to the information analysis module; S3, the information analysis module analyzes and disassembles the information collected by the controller and transmits the disassembled data to the emotion judgment module; S4, the emotion judgment module compares the received information with the data of the storage module and sends the judgment result to the controller; and S5, the controller controls the emotion expression module to express the corresponding emotion. With this method the robot can express emotion not only through facial expressions but also through body movement and voice, so that emotion expression is more three-dimensional, emotion management is more flexible, and people can relax during the chat.

Description

Language emotion perception response method for robot
Technical Field
The invention belongs to the technical field of artificial intelligence, and particularly relates to a language emotion perception response method for a robot.
Background
Emotion is a general term for a series of subjective cognitive experiences; it is a psychological and physiological state produced by the combination of many feelings, thoughts and behaviors. The most common and well-known emotions include happiness, anger, sadness, surprise, fear and love. At present, the demand for computers to replace manual work in intelligent voice or text response is becoming more and more widespread, and text response is the premise and basis of voice response; artificial intelligence chat robots are researched and built for exactly this purpose. In the prior art, an intelligent chat robot uses computer software to simulate a program unit with a certain language capability, so as to realize simple language communication with people. Many existing chat robots can only respond to the input language according to some fixed logic; the robot does not really understand the real meaning of the input language or of its reply, nor the corresponding emotion that should be expressed during the communication.
Disclosure of Invention
In view of the above, the present invention is directed to a language emotion perception response method for a robot that simulates the thinking process of the human brain and can determine the emotion of the speaker from the input voice in order to make a corresponding emotional response.
In order to achieve the purpose, the technical scheme of the invention is realized as follows:
a language emotion perception response method for a robot, comprising the steps of:
s1, the human-computer interaction module transmits the acquired information to the controller;
s2, the controller transmits the collected information to the information analysis module;
s3, the information analysis module analyzes and disassembles the information collected by the controller and transmits the disassembled data to the emotion judgment module;
s4, the emotion judgment module compares and judges the received information with the data of the storage module and sends the judgment result to the controller;
and S5, the controller controls the emotion expression module to express the corresponding emotion.
Further, the information collected by the human-computer interaction module comprises voice information and text input information.
Further, the storage module is used for storing language words, logic behaviors, natural common sense, action expressions, a central word vocabulary library and a language library, wherein the central word vocabulary library is used for storing standard central words, and the language library is used for storing the central fields contained in the chat information of the client and the matching reply words fed back to the user.
Further, the information analysis module comprises a splitting unit, a vocabulary sentence analyzing unit, an action expression analyzing unit and a semantic sorting unit. The splitting unit is used for splitting the chat information provided by the client into a plurality of vocabularies; the vocabulary sentence analyzing unit is used for comparing the vocabularies obtained by the splitting unit with the standard central words stored in the central word vocabulary library and screening out the central vocabularies that carry the key meaning; and the semantic sorting unit is connected with the vocabulary sentence analyzing unit and is used for sorting the central vocabularies output by the vocabulary sentence analyzing unit and forming the screened central vocabularies into a central field.
Further, the controller is a single chip microcomputer.
Further, the emotion judgment module in S4 classifies the chat information of the client into one of several emotional conditions.
Further, the emotion expression module in S5 adjusts the face, the limbs and the language according to the instructions processed by the controller, so as to express the corresponding emotion as a response.
Compared with the prior art, the language emotion perception response method for the robot has the following advantages:
(1) In the language emotion perception response method for the robot, the emotion expression module gives the robot a diversified character, which is convenient for the user to choose from: more choices mean more ways to be delighted. The robot can express emotion not only through facial expressions but also through its body and voice, so that emotion expression is more three-dimensional and emotion management is more flexible, and people can relax during the chat.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic block diagram of a language emotion perception response method for a robot according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical connection or an electrical connection; as a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
The present invention will be described in detail below with reference to the accompanying drawings and in conjunction with embodiments.
As shown in fig. 1, the language emotion perception response method for a robot relies on a human-computer interaction module, a controller, a storage module, an information analysis module, an emotion judgment module and an emotion expression module. The specific method comprises the following steps:
s1, the human-computer interaction module transmits the acquired information to the controller;
the information collected by the man-machine interaction module comprises voice information and character input information;
s2, the controller transmits the collected information to the information analysis module;
s3, the information analysis module analyzes and disassembles the information collected by the controller and transmits the disassembled data to the emotion judgment module;
s4, the emotion judgment module compares and judges the received information with the data of the storage module and sends the judgment result to the controller;
and S5, the controller controls the emotion expression module to express the corresponding emotion.
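By way of illustration only, the following Python sketch shows one way the five steps could be wired together in software. The class and method names (Controller.handle, disassemble, judge, express) are assumptions made for this example and are not taken from the patent, which realizes the controller as a single chip microcomputer.

```python
# Minimal pipeline sketch (illustrative assumptions only, not the claimed implementation).
from typing import Protocol


class AnalysisModule(Protocol):
    def disassemble(self, text: str) -> list[str]: ...


class JudgmentModule(Protocol):
    def judge(self, pieces: list[str]) -> str: ...


class ExpressionModule(Protocol):
    def express(self, emotion: str) -> None: ...


class Controller:
    """Routes information between the modules, mirroring steps S1-S5."""

    def __init__(self, analysis: AnalysisModule,
                 judgment: JudgmentModule,
                 expression: ExpressionModule) -> None:
        self.analysis = analysis
        self.judgment = judgment
        self.expression = expression

    def handle(self, user_input: str) -> None:
        # S1/S2: information collected by the human-computer interaction
        # module reaches the controller and is forwarded to the analysis module.
        pieces = self.analysis.disassemble(user_input)
        # S3/S4: the disassembled data is judged against stored data.
        emotion = self.judgment.judge(pieces)
        # S5: the controller drives the emotion expression module.
        self.expression.express(emotion)
```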
The controller is a single chip microcomputer.
The storage module is used for storing language words, logic behaviors, natural general knowledge, action expressions, a central word vocabulary library and a language library, wherein the central word vocabulary library is used for storing standard central words, and the language library is used for storing central fields contained in the chat information of the client and matched reply words fed back to the user.
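A rough illustration of how the central word vocabulary library and the language library could be laid out in memory is given below; all sample entries are invented for this example and are not part of the patent.

```python
# Hypothetical in-memory layout of the storage module (sample entries invented).

central_word_vocabulary_library = {
    # standard central word -> emotional condition it tends to indicate
    "tired": "sad",
    "exam": "anxious",
    "holiday": "happy",
}

language_library = {
    # central field contained in the client's chat -> matching reply words
    "exam tired": "You sound worn out; remember to take a break.",
    "holiday": "That sounds fun! Where are you going?",
}
```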
The information analysis module comprises a splitting unit, a vocabulary sentence analyzing unit, an action expression analyzing unit and a semantic sorting unit. The splitting unit is used for splitting the chat information provided by the client into a plurality of vocabularies; the vocabulary sentence analyzing unit is used for comparing the vocabularies obtained by the splitting unit with the standard central words stored in the central word vocabulary library and screening out the central vocabularies that carry the key meaning; and the semantic sorting unit is connected with the vocabulary sentence analyzing unit and is used for sorting the central vocabularies output by the vocabulary sentence analyzing unit and forming the screened central vocabularies into a central field.
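The splitting, screening and sorting work of the information analysis module could be approximated by the short functions below; the whitespace tokenization and the example library contents are simplifying assumptions (a Chinese-language implementation would need a proper word segmenter).

```python
# Sketch of the information analysis module (simplified English tokenization).

def split_chat_information(chat: str) -> list[str]:
    """Splitting unit: break the client's chat information into vocabularies."""
    return chat.lower().split()


def screen_central_vocabulary(words: list[str], central_words: set[str]) -> list[str]:
    """Vocabulary sentence analyzing unit: keep only the words that match
    the standard central words stored in the central word vocabulary library."""
    return [w for w in words if w in central_words]


def form_central_field(central_vocabulary: list[str]) -> str:
    """Semantic sorting unit: sort the screened central vocabularies and
    join them into a central field."""
    return " ".join(sorted(central_vocabulary))


# Example with an invented central word library:
words = split_chat_information("I am so tired after the exam")
print(form_central_field(screen_central_vocabulary(words, {"tired", "exam", "holiday"})))
# -> "exam tired"
```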
The emotion judgment module classifies the chat information of the client into one of several emotional conditions.
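One simple way to perform such a classification is to look the screened central vocabularies up in an emotion lexicon and pick the condition with the most matches. The sketch below does exactly that with an invented lexicon and is only meant to illustrate the comparison against stored data.

```python
# Sketch of the emotion judgment module (lexicon and conditions are invented examples).

def judge_emotional_condition(central_vocabulary: list[str],
                              emotion_lexicon: dict[str, str]) -> str:
    """Compare the received vocabularies with stored data and return
    the emotional condition with the most matches ("neutral" if none)."""
    votes: dict[str, int] = {}
    for word in central_vocabulary:
        condition = emotion_lexicon.get(word)
        if condition is not None:
            votes[condition] = votes.get(condition, 0) + 1
    return max(votes, key=votes.get) if votes else "neutral"


lexicon = {"tired": "sad", "exam": "anxious", "holiday": "happy"}
print(judge_emotional_condition(["exam", "tired", "tired"], lexicon))  # -> "sad"
```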
The emotion expression module adjusts the face, the limbs and the language according to the instructions processed by the controller, so as to express the corresponding emotion as a response.
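The mapping from a judged emotional condition to concrete face, limb and voice changes could be kept in a simple table, as sketched below. The specific actions are invented placeholders, since the patent does not enumerate them, and a real robot would drive actuators and speech synthesis instead of printing.

```python
# Sketch of the emotion expression module (actions are invented placeholders).

EXPRESSION_TABLE = {
    "happy":   {"face": "smile",        "limbs": "open arms",  "voice": "bright tone"},
    "sad":     {"face": "soft eyes",    "limbs": "lean in",    "voice": "gentle tone"},
    "anxious": {"face": "raised brows", "limbs": "stay still", "voice": "calm tone"},
    "neutral": {"face": "relaxed",      "limbs": "idle",       "voice": "even tone"},
}


def express_emotion(emotion: str) -> None:
    """Apply the face, limb and voice changes selected by the controller."""
    for channel, action in EXPRESSION_TABLE.get(emotion, EXPRESSION_TABLE["neutral"]).items():
        print(f"{channel}: {action}")  # placeholder for actuator / TTS commands


express_emotion("sad")
```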
The emotion expression module gives the robot a diversified character, which is convenient for the user to choose from: more choices mean more ways to be delighted. The robot can express emotion not only through facial expressions but also through its body and voice, so that emotion expression is more three-dimensional and emotion management is more flexible, and people can relax during the chat.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. A language emotion perception response method for a robot, characterized by: the method comprises the following steps:
s1, the human-computer interaction module transmits the acquired information to the controller;
s2, the controller transmits the collected information to the information analysis module;
s3, the information analysis module analyzes and disassembles the information collected by the controller and transmits the disassembled data to the emotion judgment module;
s4, the emotion judgment module compares and judges the received information with the data of the storage module and sends the judgment result to the controller;
and S5, the controller controls the emotion expression module to express the corresponding emotion.
2. A language emotion perception response method for a robot according to claim 1, characterized in that: the information collected by the man-machine interaction module comprises voice information and character input information.
3. A language emotion perception response method for a robot according to claim 1, characterized in that: the storage module is used for storing language words, logic behaviors, natural general knowledge, action expressions, a central word vocabulary library and a language library, wherein the central word vocabulary library is used for storing standard central words, and the language library is used for storing central fields contained in the chat information of the client and matched reply words fed back to the user.
4. A language emotion perception response method for a robot according to claim 1, characterized in that: the information analysis module comprises a splitting unit, a vocabulary statement analysis unit, an action expression analysis unit and a semantic arrangement unit. The splitting unit is used for splitting the chat information provided by the client into a plurality of vocabularies; the vocabulary sentence analyzing unit is used for comparing the vocabulary obtained by splitting the splitting unit with the standard central words stored in the central word vocabulary library, and sorting and screening out the central vocabulary with the key meaning value; and the semantic sorting unit is connected with the vocabulary analyzing unit and is used for sorting each central vocabulary output by the vocabulary analyzing unit and forming a central field by each screened central vocabulary.
5. A language emotion perception response method for a robot according to claim 1, characterized in that: the controller is a single chip microcomputer.
6. A language emotion perception response method for a robot according to claim 1, characterized in that: the emotion judging module in S4 divides the chat message of the client into several emotional conditions.
7. A language emotion perception response method for a robot according to claim 1, characterized in that: the emotion expression module in S5 changes the face, limbs and language according to the instructions processed by the controller, so as to express corresponding emotion as a response.

Priority Applications (1)

Application Number: CN202010455511.8A
Priority Date / Filing Date: 2020-05-26
Title: Language emotion perception response method for robot


Publications (1)

Publication Number: CN111611384A
Publication Date: 2020-09-01

Family

ID=72200590

Family Applications (1)

Application Number: CN202010455511.8A (pending)
Priority Date / Filing Date: 2020-05-26
Title: Language emotion perception response method for robot

Country Status (1)

Country: CN
Publication: CN111611384A


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103218654A * 2012-01-20 2013-07-24 沈阳新松机器人自动化股份有限公司 Robot emotion generating and expressing system
CN109416701A * 2016-04-26 2019-03-01 泰康机器人公司 Robot with a variety of interactive personalities
CN107783971A * 2016-08-24 2018-03-09 南京乐朋电子科技有限公司 Artificial intelligence chat robot
CN106293102A * 2016-10-13 2017-01-04 旗瀚科技有限公司 Robot emotional interaction method based on changes in the user's mood
CN108115695A * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 Emotional color expression system and robot
CN109227534A * 2018-08-09 2019-01-18 上海常仁信息科技有限公司 Robot-based motion management and regulation system and method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200901