CN105082150B - Robot human-machine interaction method based on user emotion and intention recognition - Google Patents
Robot human-machine interaction method based on user emotion and intention recognition
- Publication number
- CN105082150B (application CN201510526445.8A)
- Authority
- CN
- China
- Prior art keywords
- robot
- user
- user emotion
- man-machine interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Manipulator (AREA)
Abstract
The present invention relates to a robot human-machine interaction method based on user emotion and intention recognition. The method combines human physiological signals such as respiration, heart rate, and skin conductance with facial expression recognition to identify the user's emotion, and uses information from multiple physical sensors such as pressure, photoelectric, and temperature sensors to identify the user's intention. An intelligent control decision is made according to the results of emotion and intention recognition, and the corresponding actuators of the robot are controlled to perform limb movements and voice interaction. With the robot human-machine interaction method based on user emotion and intention recognition provided by the present invention, a robot can better understand the user's intentions and psychological changes, satisfy the user's need for emotional companionship, and integrate more naturally into the lives of users such as the elderly and children.
Description
Technical field
The present invention relates to the field of robotics, and more particularly to a robot human-machine interaction method based on user emotion and intention recognition.
Background technology
China's population is aging rapidly, and the mental health of "empty-nest" elderly people deserves particular attention. At the time of the sixth national census in 2010, the proportion of empty-nest elderly had reached 31.77%, and the proportion of empty-nest elderly with psychological problems had reached 60%. At the same time, since most families currently have an only child, the lack of companionship from peers during childhood can also harm mental health. Therefore, how to provide emotional companionship for elderly people and children who are ordinarily alone, and thereby prevent mental illness, has become a problem in urgent need of a solution.
To address this problem, it has been proposed in recent years to use robots in place of people to provide emotional companionship for the elderly and children, and some robots capable of interacting with people are already on the market. However, for a robot to truly meet the need of the elderly and children for emotional companionship and to integrate into their lives, it must first fully understand the user's intentions, perceive the user's psychological changes, and interact in different ways according to the characteristics and needs of different users. Many studies have shown that physiological signals, facial expressions, speech intonation, gestures and movements, and other modalities all reflect mood and psychological changes to some extent.
In summary, how to realize personalized human-machine interaction control adapted to the different emotional states of the elderly and children is a technical problem urgently awaiting a solution by those skilled in the art.
The content of the invention
It is an object of the present invention to provide a robot human-machine interaction method based on user emotion and intention recognition, so as to solve the above problems.
In order to achieve the above object, the technical solution of the present invention is realized as follows:
The present invention provides a robot human-machine interaction method based on user emotion and intention recognition, comprising the following steps:
S100, the robot performs intelligent recognition of the user's emotion and the user's intention;
S200, the robot makes an intelligent decision according to the intelligent recognition result;
S300, the robot outputs interactive actions through human-machine interaction according to the intelligent decision result.
Further, in step S100, the robot's intelligent recognition of the user's emotion specifically includes the following steps:
S101, user emotion recognition uses human physiological signals such as respiration, pulse, heart rate, and skin conductance; by extracting multiple features of these signals under different moods and applying a pattern recognition algorithm, the user's emotion is classified as one of four emotional states: relaxed, happy, sad, or angry;
S102, facial expression image processing and recognition are combined to further improve the accuracy of the above emotion recognition.
The human physiological signals are collected by a wearable acquisition device, which transmits the information to the robot through a communication connection.
The facial expressions are captured by a camera mounted on the robot.
Further, the communication connection includes a Bluetooth connection or a wireless network connection.
Further, in step S100, the robot's intelligent recognition of the user's intention specifically includes the following step:
S103, user intention recognition uses information from multiple sensors arranged on the robot, such as pressure, photoelectric, and temperature sensors, to assess the surrounding environment and the user's actions; at the same time, a microphone on the robot captures the voice signal, which undergoes speech signal processing and feature extraction before speech recognition is performed; the speech recognition result is then fused with the sensor information to recognize the user's intention.
Further, in step S200, the robot's intelligent decision according to the intelligent recognition result specifically includes the following step:
S201, the intelligent decision module makes an intelligent control decision according to the results of user emotion and intention recognition, and generates the control instructions that direct the corresponding actuators of the robot.
Further, in step S300, the robot's output of interactive actions according to the intelligent decision result specifically includes the following steps:
S301, the robot receives the control instructions;
S302, the robot controls each actuator to perform the interactive actions according to the control instructions.
Further, in step S302, the robot's control of each actuator to perform interactive actions according to the control instructions specifically includes the following steps:
S303, the robot uses stepper motors to control its limbs and complete limb movements, realizing physical interaction between the robot and the person;
S304, the robot plays voice and music files through a loudspeaker, realizing voice interaction between the robot and the person.
Further, each stepper motor correspondingly drives the robot's manipulators, mechanical arms, mechanical legs, or mechanical head to imitate animal behaviors such as shaking hands, hugging, walking, and shaking the head.
Compared with the prior art, the present invention has the following advantages:
The robot human-machine interaction method based on user emotion and intention recognition provided by the present invention comprises three parts, specifically the following steps: S100, the robot performs intelligent recognition of the user's emotion and the user's intention; S200, the robot makes an intelligent decision according to the intelligent recognition result; S300, the robot outputs interactive actions through human-machine interaction according to the intelligent decision result.
Its main technical advantages are twofold: 1. Multi-sensor information is used to recognize a person's emotion and intention, providing the precondition for good human-robot interaction. 2. The robot can react differently and in a targeted way according to the user's emotion and intention, realizing personalized human-machine interaction and better providing emotional companionship for the elderly and children.
Description of the drawings
In order to illustrate the specific embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is the overall control flow chart of the robot human-machine interaction method based on user emotion and intention recognition provided by an embodiment of the present invention;
Fig. 2 is the user emotion recognition flow chart of the method;
Fig. 3 is the user intention recognition flow chart of the method.
Specific embodiment
The technical solutions of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that terms indicating orientation or positional relationships, such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they are therefore not to be construed as limiting the present invention. In addition, the terms "first", "second", and "third" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance.
It should also be noted that, unless otherwise expressly specified and limited, the terms "installed", "connected", and "coupled" are to be understood broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediary; or internal between two elements. For those of ordinary skill in the art, the specific meanings of these terms in the present invention can be understood according to the specific circumstances.
The present invention is described in further detail below through specific embodiments and with reference to the accompanying drawings.
A robot human-machine interaction method based on user emotion and intention recognition provided by an embodiment of the present invention includes the following steps:
S100, the robot performs intelligent recognition of the user's emotion and the user's intention;
S200, the robot makes an intelligent decision according to the intelligent recognition result;
S300, the robot outputs interactive actions through human-machine interaction according to the intelligent decision result.
That is, referring to Fig. 1, the present invention is a robot human-machine interaction method based on user emotion and intention recognition, mainly comprising four parts: user emotion recognition, user intention recognition, intelligent decision-making, and human-machine interaction.
The concrete operation of the robot human-machine interaction method based on user emotion and intention recognition and its technical effects are described in detail below:
Further, in step S100, the robot's intelligent recognition of the user's emotion specifically includes the following steps:
S101, user emotion recognition uses human physiological signals such as respiration, pulse, heart rate, and skin conductance; by extracting multiple features of these signals under different moods and applying a pattern recognition algorithm, the user's emotion is classified as one of four emotional states: relaxed, happy, sad, or angry;
S102, facial expression image processing and recognition are combined to further improve the accuracy of the above emotion recognition.
The human physiological signals are collected by a wearable acquisition device, which transmits the information to the robot through a communication connection.
The facial expressions are captured by a camera mounted on the robot.
Further, the communication connection includes a Bluetooth connection or a wireless network connection.
It should be noted that the wearable acquisition device is an electronic acquisition device worn by the user, which effectively collects the user's physiological information (i.e., detects the user's physiological characteristics) and then transmits it to the robot through the communication connection, where the robot analyzes and recognizes it.
The wearable acquisition device may of course also communicate with the robot through a wired connection (such as an interface connection); the communication connection described above is simply the more preferable option, and among communication connections, a Bluetooth or wireless network connection is preferred. A minimal sketch of the pattern recognition stage of S101 follows.
Further, in step S100, the robot's intelligent recognition of the user's intention specifically includes the following step:
S103, user intention recognition uses information from multiple sensors arranged on the robot, such as pressure, photoelectric, and temperature sensors, to assess the surrounding environment and the user's actions; at the same time, a microphone on the robot captures the voice signal, which undergoes speech signal processing and feature extraction before speech recognition is performed; the speech recognition result is then fused with the sensor information to recognize the user's intention.
Further, in step S200, the robot's intelligent decision according to the intelligent recognition result specifically includes the following step:
S201, the intelligent decision module makes an intelligent control decision according to the results of user emotion and intention recognition, and generates the control instructions that direct the corresponding actuators of the robot.
Further, in step S300, the robot's output of interactive actions according to the intelligent decision result specifically includes the following steps:
S301, the robot receives the control instructions;
S302, the robot controls each actuator to perform the interactive actions according to the control instructions.
Further, in step S302, the robot's control of each actuator to perform interactive actions according to the control instructions specifically includes the following steps:
S303, the robot uses stepper motors to control its limbs and complete limb movements, realizing physical interaction between the robot and the person;
S304, the robot plays voice and music files through a loudspeaker, realizing voice interaction between the robot and the person.
Further, each stepper motor correspondingly drives the robot's manipulators, mechanical arms, mechanical legs, or mechanical head to imitate the behaviors of animals (for example, humans or pet animals) such as shaking hands, hugging, walking, and shaking the head.
It should be noted that the mechanical structure of a robot generally comprises mechanical arms, manipulators, mechanical legs, and a mechanical head, each of which is driven by stepper motors. For example, to imitate a human handshake, the stepper motor on the manipulator is driven so that the manipulator and arm swing through a certain angle and move back and forth, thereby imitating the handshake. A control sketch of this gesture is given below.
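The sketch below assumes a step/direction stepper driver wired to two GPIO pins of a Raspberry-Pi-class controller, driven through the RPi.GPIO library; the pin numbers, step counts, and pulse timing are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of S303 for the handshake gesture (illustrative only).
# Assumes a step/direction stepper driver wired to two GPIO pins on a
# Raspberry-Pi-class controller; pin numbers and step counts are made up.
import time
import RPi.GPIO as GPIO

STEP_PIN, DIR_PIN = 17, 27   # hypothetical wiring
STEPS_PER_SWING = 200        # steps for one arm swing (assumed)

def setup():
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(STEP_PIN, GPIO.OUT)
    GPIO.setup(DIR_PIN, GPIO.OUT)

def swing(direction_up, steps=STEPS_PER_SWING, delay=0.002):
    """Pulse the stepper to swing the arm through a fixed angle."""
    GPIO.output(DIR_PIN, GPIO.HIGH if direction_up else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)

def shake_hands(cycles=3):
    """Reciprocating up/down swings imitating a human handshake."""
    for _ in range(cycles):
        swing(direction_up=True)
        swing(direction_up=False)
```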
The robot human-machine interaction method based on user emotion and intention recognition provided by the embodiment of the present invention is explained below with reference to the accompanying drawings:
As shown in Fig. 2, emotion recognition mainly uses human physiological signals such as respiration, pulse, heart rate, and skin conductance; by extracting multiple features of these signals under different moods and applying a pattern recognition algorithm, the user's emotion is classified as relaxed, happy, sad, or angry. At the same time, facial expression image processing and recognition are combined to further improve the accuracy of the emotion recognition. The physiological signals are collected by the wearable acquisition device and transmitted to the robot through a communication connection such as Bluetooth; the facial expressions are captured by the camera on the robot, as sketched below.
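The following minimal sketch of the facial-expression branch uses OpenCV for the camera capture and a Haar-cascade face detector; `classify_expression` is a hypothetical placeholder for whatever expression model the robot actually runs, which the patent does not specify.

```python
# Minimal sketch of the facial-expression branch of Fig. 2 (illustrative only).
# OpenCV supplies the camera capture and Haar-cascade face detector;
# classify_expression() is a placeholder for a real expression model.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img):
    """Placeholder: a real system would run an expression classifier here."""
    raise NotImplementedError

def expression_from_camera(camera_index=0):
    cap = cv2.VideoCapture(camera_index)   # the robot's on-board camera
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                        # no face visible in this frame
    x, y, w, h = faces[0]
    return classify_expression(gray[y:y + h, x:x + w])
```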
As shown in Fig. 3, user intention recognition mainly uses information from multiple sensors arranged on the robot, such as pressure, photoelectric, and temperature sensors, to assess the surrounding environment and the user's actions. At the same time, the microphone on the robot captures the voice signal; after speech signal processing and feature extraction, speech recognition is performed, and its result is fused with the sensor information to recognize the user's intention, for example: voice chat, information query, playing music, or limb interaction. A minimal sketch of this fusion step follows.
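In the sketch below, the keyword lists, the normalised pressure threshold, and the intent labels are illustrative assumptions standing in for a real speech recognizer and calibrated sensor ranges.

```python
# Minimal sketch of S103's fusion of speech and sensor information
# (illustrative only): keyword lists and thresholds are assumptions.

def recognize_intention(transcript, pressure, light, temperature):
    """Fuse the speech-recognition transcript with sensor readings to pick
    one of the intent classes named in the description. This toy rule set
    consults only the pressure reading; a real system would also use the
    photoelectric (light) and temperature readings to assess the environment."""
    text = transcript.lower()
    # Touch on the robot's pressure sensors suggests limb interaction,
    # regardless of what (if anything) was said.
    if pressure > 0.5:                        # assumed normalised threshold
        return "limb_interaction"
    if any(k in text for k in ("play", "music", "song")):
        return "play_music"
    if any(k in text for k in ("what", "when", "weather", "news")):
        return "information_query"
    if text:
        return "voice_chat"                   # any other utterance: just chat
    return "idle"                             # nothing heard, nothing sensed
```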
In the robot control process, the mapping from information inputs to control outputs is not a simple one-to-one correspondence but a complex, interconnected network. Because of the many sensor inputs, the robot can detect a variety of user moods and intentions, and under different emotional states the same user intention requires a different control response. For example, the content of a chat or of a physical interaction should differ between a happy mood and a sad mood. The intelligent decision of the present invention determines the intelligent control instructions according to the results of user emotion and intention recognition, so that the robot realizes personalized human-machine interaction; one simple realization is a decision table keyed by the (emotion, intention) pair, as sketched below.
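The particular response entries in this table are illustrative assumptions; the patent specifies only that the same intention maps to different responses under different emotions.

```python
# Minimal sketch of the intelligent decision stage (illustrative only):
# the same intention maps to different control instructions under
# different emotions. The entries here are example choices.

DECISION_TABLE = {
    ("happy",   "voice_chat"):       ("chat_upbeat",   "nod"),
    ("sad",     "voice_chat"):       ("chat_comfort",  "hug"),
    ("happy",   "play_music"):       ("play_cheerful", None),
    ("sad",     "play_music"):       ("play_soothing", None),
    ("angry",   "limb_interaction"): ("speak_calming", None),
    ("relaxed", "limb_interaction"): (None,            "shake_hands"),
}

def decide(emotion, intention):
    """Return (voice_instruction, motion_instruction) for the actuators;
    fall back to a neutral chat response for unlisted combinations."""
    return DECISION_TABLE.get((emotion, intention), ("chat_neutral", None))
```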
Human-machine interaction mainly uses stepper motors to control the robot's limb movements, realizing physical interaction between the robot and the person, such as shaking hands, shaking the head, and hugging; audio files are played through a loudspeaker to realize voice interaction between the robot and the person, such as chatting, playing music, and information query. The playback half can be as simple as the sketch below.
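The sketch below uses pygame's mixer as one possible stand-in for the robot's audio stack; both this backend choice and the file path are hypothetical assumptions of the sketch.

```python
# Minimal sketch of S304: playing a voice/music file through the
# loudspeaker (illustrative only; pygame is one possible audio backend
# and the file path is hypothetical).
import time
import pygame

def play_audio(path):
    pygame.mixer.init()                   # open the default audio device
    pygame.mixer.music.load(path)
    pygame.mixer.music.play()
    while pygame.mixer.music.get_busy():  # block until playback finishes
        time.sleep(0.1)

play_audio("responses/chat_comfort.wav")  # hypothetical response clip
```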
The outstanding feature of the present invention is that it fuses multi-sensor information to recognize the user's emotion and intention, so that the robot can engage in targeted emotional exchange with the person, such as limb interaction and voice dialogue, according to the user's different moods and intentions. The robot therefore has greater affinity and is easily and widely liked by the elderly and children. It is particularly suitable for the emotional care of the elderly and children and can effectively relieve their loneliness. It can be widely used in many settings, such as households with elderly people or children, nursing homes, and kindergartens, and has broad market prospects.
In summary, the robot human-machine interaction method based on user emotion and intention recognition provided by the present invention recognizes a person's emotion and intention from multi-sensor information, providing the precondition for good human-robot interaction. At the same time, it enables the robot to react differently and in a targeted way according to the user's emotion and intention, realizing personalized human-machine interaction and better providing emotional companionship for the elderly and children.
Finally, it should be noted that the above embodiments are only intended to illustrate, not to limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, without these modifications or replacements causing the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.
Claims (6)
1. A robot human-machine interaction method based on user emotion and intention recognition, characterized by comprising the following steps:
S100, the robot performs intelligent recognition of the user's emotion and the user's intention;
S200, the robot makes an intelligent decision according to the intelligent recognition result;
S300, the robot outputs interactive actions through human-machine interaction according to the intelligent decision result;
in step S100, the robot's intelligent recognition of the user's emotion specifically includes the following steps:
S101, user emotion recognition uses human physiological signals including respiration, pulse, heart rate, and skin conductance; by extracting multiple features of these signals under different moods and applying a pattern recognition algorithm, the user's emotion is classified as one of four emotional states: relaxed, happy, sad, or angry;
S102, facial expression image processing and recognition are combined to further improve the accuracy of the above emotion recognition;
wherein the facial expressions are captured by a camera on the robot;
in step S100, the robot's intelligent recognition of the user's intention specifically includes the following step:
S103, user intention recognition uses information from pressure, photoelectric, and temperature sensors arranged on the robot to assess the surrounding environment and the user's actions; at the same time, a microphone on the robot captures the voice signal, which undergoes speech signal processing and feature extraction before speech recognition is performed; the speech recognition result is then fused with the sensor information to recognize the user's intention;
in step S200, the robot's intelligent decision according to the intelligent recognition result specifically includes the following step:
S201, the intelligent decision module makes an intelligent control decision according to the results of user emotion and intention recognition, and generates the control instructions that direct the corresponding actuators of the robot.
2. The robot human-machine interaction method based on user emotion and intention recognition according to claim 1, characterized in that the human physiological signals are collected by a wearable acquisition device, which transmits the information to the robot through a communication connection.
3. The robot human-machine interaction method based on user emotion and intention recognition according to claim 2, characterized in that the communication connection comprises a wireless network connection.
4. The robot human-machine interaction method based on user emotion and intention recognition according to claim 1, characterized in that in step S300, the robot's output of interactive actions through human-machine interaction according to the intelligent decision result specifically includes the following steps:
S301, the robot receives the control instructions;
S302, the robot controls each actuator to perform the interactive actions according to the control instructions.
5. The robot human-machine interaction method based on user emotion and intention recognition according to claim 4, characterized in that in step S302, the robot's control of each actuator to perform interactive actions according to the control instructions specifically includes the following steps:
S303, the robot uses stepper motors to control its limbs and complete limb movements, realizing physical interaction between the robot and the person;
S304, the robot plays voice and music files through a loudspeaker, realizing voice interaction between the robot and the person.
6. The robot human-machine interaction method based on user emotion and intention recognition according to claim 5, characterized in that each stepper motor correspondingly drives the robot's manipulators, mechanical arms, mechanical legs, or mechanical head to imitate animal behaviors such as shaking hands, hugging, walking, and shaking the head.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201510526445.8A CN105082150B (en) | 2015-08-25 | 2015-08-25 | Robot human-machine interaction method based on user emotion and intention recognition
Publications (2)
Publication Number | Publication Date
---|---
CN105082150A (en) | 2015-11-25
CN105082150B (en) | 2017-04-05
Family
ID=54563921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201510526445.8A (CN105082150B, Active) | Robot human-machine interaction method based on user emotion and intention recognition | 2015-08-25 | 2015-08-25
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105082150B (en) |
Families Citing this family (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105345822B (en) * | 2015-12-17 | 2017-05-10 | 成都英博格科技有限公司 | Intelligent robot control method and device |
CN105364933B (en) * | 2015-12-17 | 2017-10-27 | 成都英博格科技有限公司 | Intelligent robot |
CN105739688A (en) * | 2016-01-21 | 2016-07-06 | 北京光年无限科技有限公司 | Man-machine interaction method and device based on emotion system, and man-machine interaction system |
CN105760362B (en) * | 2016-02-04 | 2018-07-27 | 北京光年无限科技有限公司 | A kind of question and answer evaluation method and device towards intelligent robot |
JP2017151517A (en) * | 2016-02-22 | 2017-08-31 | 富士ゼロックス株式会社 | Robot control system |
CN105807933B (en) * | 2016-03-18 | 2019-02-12 | 北京光年无限科技有限公司 | A kind of man-machine interaction method and device for intelligent robot |
CN105825268B (en) * | 2016-03-18 | 2019-02-12 | 北京光年无限科技有限公司 | The data processing method and system of object manipulator action learning |
CN105843381B (en) * | 2016-03-18 | 2020-07-28 | 北京光年无限科技有限公司 | Data processing method for realizing multi-modal interaction and multi-modal interaction system |
CN105868827B (en) * | 2016-03-25 | 2019-01-22 | 北京光年无限科技有限公司 | A kind of multi-modal exchange method of intelligent robot and intelligent robot |
CN105843118B (en) * | 2016-03-25 | 2018-07-27 | 北京光年无限科技有限公司 | A kind of robot interactive method and robot system |
CN105893771A (en) * | 2016-04-15 | 2016-08-24 | 北京搜狗科技发展有限公司 | Information service method and device and device used for information services |
CN106411834A (en) * | 2016-04-18 | 2017-02-15 | 乐视控股(北京)有限公司 | Session method based on companion equipment, equipment and system |
CN105912530A (en) * | 2016-04-26 | 2016-08-31 | 北京光年无限科技有限公司 | Intelligent robot-oriented information processing method and system |
CN105690408A (en) * | 2016-04-27 | 2016-06-22 | 深圳前海勇艺达机器人有限公司 | Emotion recognition robot based on data dictionary |
CN107307852A (en) * | 2016-04-27 | 2017-11-03 | 王方明 | Intelligent robot system |
WO2018000268A1 (en) * | 2016-06-29 | 2018-01-04 | 深圳狗尾草智能科技有限公司 | Method and system for generating robot interaction content, and robot |
CN106489114A (en) * | 2016-06-29 | 2017-03-08 | 深圳狗尾草智能科技有限公司 | A kind of generation method of robot interactive content, system and robot |
WO2018000259A1 (en) * | 2016-06-29 | 2018-01-04 | 深圳狗尾草智能科技有限公司 | Method and system for generating robot interaction content, and robot |
CN106078743B (en) * | 2016-07-05 | 2019-03-01 | 北京光年无限科技有限公司 | Intelligent robot, operating system and application shop applied to intelligent robot |
WO2018006380A1 (en) * | 2016-07-07 | 2018-01-11 | 深圳狗尾草智能科技有限公司 | Human-machine interaction system, device, and method for robot |
CN107590120A (en) * | 2016-07-07 | 2018-01-16 | 深圳狗尾草智能科技有限公司 | Artificial intelligence process method and device |
CN106239506B (en) * | 2016-08-11 | 2018-08-21 | 北京光年无限科技有限公司 | The multi-modal input data processing method and robot operating system of intelligent robot |
CN107784354B (en) | 2016-08-17 | 2022-02-25 | 华为技术有限公司 | Robot control method and accompanying robot |
CN106361356A (en) * | 2016-08-24 | 2017-02-01 | 北京光年无限科技有限公司 | Emotion monitoring and early warning method and system |
CN106182032B (en) * | 2016-08-24 | 2018-11-13 | 陈中流 | One kind is accompanied and attended to robot |
CN106325127B (en) * | 2016-08-30 | 2019-03-08 | 广东美的制冷设备有限公司 | It is a kind of to make the household electrical appliances expression method and device of mood, air-conditioning |
CN106372604A (en) * | 2016-08-31 | 2017-02-01 | 北京光年无限科技有限公司 | Intelligent robot negative emotion detection method and system |
CN106541408B (en) * | 2016-10-11 | 2018-10-12 | 北京光年无限科技有限公司 | Child behavior bootstrap technique based on intelligent robot and system |
CN107943272A (en) * | 2016-10-12 | 2018-04-20 | 深圳大森智能科技有限公司 | A kind of intelligent interactive system |
CN106293102A (en) * | 2016-10-13 | 2017-01-04 | 旗瀚科技有限公司 | A kind of robot affective interaction method based on user mood change emotion |
CN106773923B (en) * | 2016-11-30 | 2020-04-21 | 北京光年无限科技有限公司 | Multi-mode emotion data interaction method and device for robot |
CN106778575A (en) * | 2016-12-06 | 2017-05-31 | 山东瀚岳智能科技股份有限公司 | A kind of recognition methods of Students ' Learning state based on wearable device and system |
CN106843458B (en) * | 2016-12-12 | 2021-05-25 | 北京光年无限科技有限公司 | Man-machine interaction method and device for intelligent robot |
CN108614987A (en) * | 2016-12-13 | 2018-10-02 | 深圳光启合众科技有限公司 | The method, apparatus and robot of data processing |
CN107053191B (en) | 2016-12-31 | 2020-05-08 | 华为技术有限公司 | Robot, server and man-machine interaction method |
WO2018157325A1 (en) * | 2017-03-01 | 2018-09-07 | 深圳市前海中康汇融信息技术有限公司 | Child accompanying robot and operation method therefor |
CN107066956B (en) * | 2017-03-24 | 2020-06-19 | 北京科技大学 | Multisource emotion recognition robot based on body area network |
CN107081774B (en) * | 2017-05-27 | 2019-11-05 | 上海木木机器人技术有限公司 | Robot shakes hands control method and system |
CN107301168A (en) * | 2017-06-01 | 2017-10-27 | 深圳市朗空亿科科技有限公司 | Intelligent robot and its mood exchange method, system |
CN107292244A (en) * | 2017-06-01 | 2017-10-24 | 深圳欧德蒙科技有限公司 | A kind of stress recognition methods, smart machine and computer-readable recording medium |
CN107186728B (en) * | 2017-06-15 | 2020-02-14 | 重庆柚瓣家科技有限公司 | Intelligent endowment service robot control system |
CN107243905A (en) * | 2017-06-28 | 2017-10-13 | 重庆柚瓣科技有限公司 | Mood Adaptable System based on endowment robot |
CN107976919B (en) * | 2017-07-28 | 2019-11-15 | 北京物灵智能科技有限公司 | A kind of Study of Intelligent Robot Control method, system and electronic equipment |
CN109426653A (en) * | 2017-08-27 | 2019-03-05 | 南京乐朋电子科技有限公司 | Psychological consultation robot |
CN107728785A (en) * | 2017-10-16 | 2018-02-23 | 南京阿凡达机器人科技有限公司 | Robot interactive method and its system |
CN107657852B (en) * | 2017-11-14 | 2023-09-22 | 翟奕雲 | Infant teaching robot, teaching system and storage medium based on face recognition |
CN108229641A (en) * | 2017-12-20 | 2018-06-29 | 广州创显科教股份有限公司 | A kind of artificial intelligence analysis's system based on multi-Agent |
CN108320735A (en) * | 2018-01-23 | 2018-07-24 | 北京易智能科技有限公司 | A kind of emotion identification method and system of multi-data fusion |
CN110085262A (en) * | 2018-01-26 | 2019-08-02 | 上海智臻智能网络科技股份有限公司 | Voice mood exchange method, computer equipment and computer readable storage medium |
CN110309254A (en) * | 2018-03-01 | 2019-10-08 | 富泰华工业(深圳)有限公司 | Intelligent robot and man-machine interaction method |
CN109048920A (en) * | 2018-09-30 | 2018-12-21 | 中国船舶重工集团公司第七0七研究所 | A kind of user interactive system based on wearable power-assisting robot |
CN109549624A (en) * | 2018-11-04 | 2019-04-02 | 南京云思创智信息科技有限公司 | A kind of real-time video sentiment analysis method and system based on deep learning |
CN109669535A (en) * | 2018-11-22 | 2019-04-23 | 歌尔股份有限公司 | Audio controlling method and system |
CN109683709A (en) * | 2018-12-17 | 2019-04-26 | 苏州思必驰信息科技有限公司 | Man-machine interaction method and system based on Emotion identification |
CN110263723A (en) * | 2019-06-21 | 2019-09-20 | 王森 | The gesture recognition method of the interior space, system, medium, equipment |
CN110473534A (en) * | 2019-07-12 | 2019-11-19 | 南京邮电大学 | A kind of nursing old people conversational system based on deep neural network |
CN112489797A (en) * | 2019-09-11 | 2021-03-12 | 北京国双科技有限公司 | Accompanying method, device and terminal equipment |
CN110751951B (en) * | 2019-10-25 | 2022-11-11 | 智亮君 | Handshake interaction method and system based on intelligent mirror and storage medium |
CN111507149B (en) * | 2020-01-03 | 2023-10-27 | 京东方艺云(杭州)科技有限公司 | Interaction method, device and equipment based on expression recognition |
CN111297379A (en) * | 2020-02-10 | 2020-06-19 | 中国科学院深圳先进技术研究院 | Brain-computer combination system and method based on sensory transmission |
WO2021159230A1 (en) * | 2020-02-10 | 2021-08-19 | 中国科学院深圳先进技术研究院 | Brain-computer interface system and method based on sensory transmission |
CN111368053B (en) * | 2020-02-29 | 2020-12-11 | 重庆百事得大牛机器人有限公司 | Mood pacifying system based on legal consultation robot |
CN111625799A (en) * | 2020-06-04 | 2020-09-04 | 中国银行股份有限公司 | Processing method and device based on face unlocking |
CN112297023B (en) * | 2020-10-22 | 2022-04-05 | 新华网股份有限公司 | Intelligent accompanying robot system |
CN112379780B (en) * | 2020-12-01 | 2021-10-26 | 宁波大学 | Multi-mode emotion interaction method, intelligent device, system, electronic device and medium |
CN112990067A (en) * | 2021-03-31 | 2021-06-18 | 上海理工大学 | Robot intelligent emotion recognition and cure method for solitary people |
CN113334397B (en) * | 2021-04-30 | 2022-08-30 | 北京智能工场科技有限公司 | Emotion recognition entity robot device |
CN113139525B (en) * | 2021-05-21 | 2022-03-01 | 国家康复辅具研究中心 | Multi-source information fusion-based emotion recognition method and man-machine interaction system |
CN113391701B (en) * | 2021-06-15 | 2021-12-07 | 国家康复辅具研究中心 | Rehabilitation training method and system fusing virtual reality game and intention recognition |
CN113246156A (en) * | 2021-07-13 | 2021-08-13 | 武汉理工大学 | Child accompanying robot based on intelligent emotion recognition and control method |
CN114732663A (en) * | 2022-03-18 | 2022-07-12 | 郑州大学 | Binary mental health image mapping system for stroke patients and caregivers thereof |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4014044B2 (en) * | 2003-01-28 | 2007-11-28 | 株式会社国際電気通信基礎技術研究所 | Communication robot and communication system using the same |
CN101036838A (en) * | 2007-04-19 | 2007-09-19 | 复旦大学 | Intelligent robot friend for study and entertainment |
JP2009061547A (en) * | 2007-09-06 | 2009-03-26 | Olympus Corp | Robot control system, robot, program, and information storage medium |
CN101661569B (en) * | 2009-09-18 | 2013-03-27 | 北京科技大学 | Intelligent emotional robot multi-modal behavioral associative expression system |
CN104287747A (en) * | 2014-10-24 | 2015-01-21 | 南京邮电大学 | Exercise rehabilitation robot interactive control method based on emotion perception |
CN104483847A (en) * | 2014-10-24 | 2015-04-01 | 南京邮电大学 | Robot auxiliary recovery human-computer interaction control method based on emotion recognition and hybrid theory |
- 2015-08-25: application CN201510526445.8A filed in China; granted as patent CN105082150B (en), status Active
Also Published As
Publication number | Publication date |
---|---|
CN105082150A (en) | 2015-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105082150B (en) | Robot human-machine interaction method based on user emotion and intention recognition | |
CN110070944B (en) | Social function assessment training system based on virtual environment and virtual roles | |
CN105078449B (en) | Senile dementia monitor system based on health service robot | |
US8909370B2 (en) | Interactive systems employing robotic companions | |
Stefanov et al. | The smart house for older persons and persons with physical disabilities: structure, technology arrangements, and perspectives | |
CN104440925B (en) | A kind of pet type accompany and attend to robot and system | |
CN108461126A (en) | In conjunction with virtual reality(VR)The novel intelligent psychological assessment of technology and interfering system | |
CN102460347A (en) | Nasal flow device controller | |
Castillo et al. | A framework for recognizing and regulating emotions in the elderly | |
CN110136499A (en) | Robot assisted interaction systems and its method | |
KR102476675B1 (en) | Method and server for smart home control based on interactive brain-computer interface | |
CN104799984A (en) | Assistance system for disabled people based on brain control mobile eye and control method for assistance system | |
US20220113799A1 (en) | Multiple switching electromyography (emg) assistive communications device | |
CN107307865A (en) | A kind of autism children supplementary AC device | |
CN111297379A (en) | Brain-computer combination system and method based on sensory transmission | |
KR102048551B1 (en) | System and Method for Virtual reality rehabilitation training using Smart device | |
CN111736483A (en) | Intelligent feedback housekeeper system and feedback method for same | |
WO2021159230A1 (en) | Brain-computer interface system and method based on sensory transmission | |
WO2021026078A1 (en) | Remote virtual and augmented reality monitoring and control systems | |
CN116440383A (en) | Portable psychological accompanying robot system and emotion supporting method | |
CN102880080A (en) | Somatosensory interaction method for bionic fish | |
CN109621441A (en) | A kind of intelligent electronic pet | |
CN105005691A (en) | Social emotion accompanying system | |
Semeraro et al. | Physiological wireless sensor network for the detection of human moods to enhance human-robot interaction | |
CN201299883Y (en) | Intelligent psychological hug guiding instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |