CN108858219A - A robot with good interaction performance - Google Patents

A robot with good interaction performance

Info

Publication number
CN108858219A
CN108858219A
Authority
CN
China
Prior art keywords
robot
user
state
affective
subsystem
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201810726815.6A
Other languages
Chinese (zh)
Inventor
韦德远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuzhou Well Trading Co Ltd
Original Assignee
Wuzhou Well Trading Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuzhou Well Trading Co Ltd
Priority to CN201810726815.6A
Publication of CN108858219A
Legal status: Withdrawn

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 - Manipulators not otherwise provided for
    • B25J 11/0005 - Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides a robot with good interaction performance, comprising a perception subsystem, a voice interaction subsystem, an affective interaction subsystem, and a control-execution subsystem. The perception subsystem acquires external environment information and comprises a microphone, a high-definition camera, and a lidar: the microphone acquires the user's voice information, the high-definition camera acquires the user's face information, and the lidar acquires obstacle information. The voice interaction subsystem conducts voice interaction with the user according to the voice information; the affective interaction subsystem conducts affective interaction with the user according to the face information; and the control-execution subsystem controls the robot's movement according to the obstacle information so as to avoid obstacles. The beneficial effects of the present invention are: a robot with good interaction performance is provided, realizing voice interaction and affective interaction with the user.

Description

A robot with good interaction performance
Technical field
The present invention relates to the field of robotics, and in particular to a robot with good interaction performance.
Background technique
Artificial intelligence refers to intelligence realized by systems built from computing equipment. With the gradual maturation of inference methods, knowledge representation, knowledge bases, natural language processing, and cognitive technology, artificial intelligence research in the 21st century has gradually come to recognize the objective reality of uncertainty in human knowledge and intelligence, and has sought rules and methods for managing that uncertainty. At the same time, artificial intelligence has entered an intelligent era that combines automation engineering, natural language understanding, speech recognition, pattern recognition, and robotics. With the continuous development of sensor network technologies such as cyber-physical systems and the Internet of Things, artificial intelligence will, by 2020, preliminarily form a comprehensive "perception" era integrating vision, smell, touch, reasoning, and affective computing.
With the all-round development of artificial intelligence, robotics has also entered a period of rapid growth. In intelligent human-machine interaction and cooperation, a robot not only needs a humanoid form; more importantly, it needs humanoid emotion, humanoid thinking, and humanoid behavior, so as to meet community service demands such as assisting the elderly and the disabled, providing emotional support, and rehabilitation.
Summary of the invention
In view of the above problems, the present invention aims to provide a robot with good interaction performance.
The purpose of the present invention is achieved by the following technical scheme:
A robot with good interaction performance is provided, comprising a perception subsystem, a voice interaction subsystem, an affective interaction subsystem, and a control-execution subsystem. The perception subsystem is used to acquire external environment information and comprises a microphone, a high-definition camera, and a lidar; the microphone is used to acquire the user's voice information, the high-definition camera is used to acquire the user's face information, and the lidar is used to acquire obstacle information. The voice interaction subsystem is used to conduct voice interaction with the user according to the voice information; the affective interaction subsystem is used to conduct affective interaction with the user according to the face information; and the control-execution subsystem is used to control the robot's movement according to the obstacle information so as to avoid obstacles.
The beneficial effects of the present invention are: a robot with good interaction performance is provided, realizing voice interaction and affective interaction with the user; obstacles are detected by lidar, improving the robot's mobility.
Detailed description of the invention
The present invention will be further described with reference to the accompanying drawings. The embodiments in the drawings do not constitute any limitation of the invention; those of ordinary skill in the art can obtain other drawings from the following drawings without creative effort.
Fig. 1 is a structural schematic diagram of the invention;
Reference numerals:
perception subsystem 1, voice interaction subsystem 2, affective interaction subsystem 3, control-execution subsystem 4.
Specific embodiment
The invention will be further described with reference to the following examples.
Referring to Fig. 1, the robot with good interaction performance of this embodiment comprises a perception subsystem 1, a voice interaction subsystem 2, an affective interaction subsystem 3, and a control-execution subsystem 4. The perception subsystem 1 is used to acquire external environment information and comprises a microphone, a high-definition camera, and a lidar; the microphone is used to acquire the user's voice information, the high-definition camera is used to acquire the user's face information, and the lidar is used to acquire obstacle information. The voice interaction subsystem 2 is used to conduct voice interaction with the user according to the voice information; the affective interaction subsystem 3 is used to conduct affective interaction with the user according to the face information; and the control-execution subsystem 4 is used to control the robot's movement according to the obstacle information so as to avoid obstacles.
This embodiment provides a robot with good interaction performance, realizing voice interaction and affective interaction with the user; obstacles are detected by the lidar, improving the robot's mobility.
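For illustration only, the four-subsystem division described above might be organized as in the following Python sketch. All class, method, and field names here are hypothetical: the patent specifies the architecture, not an implementation.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the four-subsystem architecture described above.

@dataclass
class Percepts:
    voice: bytes          # raw audio from the microphone
    face_image: bytes     # frame from the high-definition camera
    obstacles: list = field(default_factory=list)  # ranges from the lidar

class PerceptionSubsystem:
    def sense(self) -> Percepts:
        """Acquire external environment information from the three sensors."""
        raise NotImplementedError  # hardware-specific

class VoiceInteractionSubsystem:
    def interact(self, voice: bytes) -> None:
        """Conduct voice interaction based on the user's voice information."""

class AffectiveInteractionSubsystem:
    def interact(self, face_image: bytes) -> None:
        """Conduct affective interaction based on the user's face information."""

class ControlExecutionSubsystem:
    def move(self, obstacles: list) -> None:
        """Move the robot while avoiding the detected obstacles."""

class Robot:
    def __init__(self) -> None:
        self.perception = PerceptionSubsystem()        # subsystem 1
        self.voice = VoiceInteractionSubsystem()       # subsystem 2
        self.affect = AffectiveInteractionSubsystem()  # subsystem 3
        self.control = ControlExecutionSubsystem()     # subsystem 4

    def step(self) -> None:
        p = self.perception.sense()
        self.voice.interact(p.voice)
        self.affect.interact(p.face_image)
        self.control.move(p.obstacles)
```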
Preferably, the voice interaction subsystem 2 comprises a speech recognition module, a speech synthesis module, and a loudspeaker. The speech recognition module is used to extract the user's voice information and convert it into recognizable binary machine language; the speech synthesis module is used to convert text information into voice information; and the loudspeaker is used to play the converted voice information.
This preferred embodiment realizes accurate voice interaction between the robot and the user.
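As a minimal sketch of the recognize-respond-synthesize pipeline just described: the ASR and TTS engines below are unspecified stand-ins, since the patent names the modules but not any particular engine.

```python
# Stand-in speech recognition: extract the user's voice information as text
# (the machine-readable form referred to above).
def recognize(audio: bytes) -> str:
    raise NotImplementedError  # plug in an ASR engine here

# Stand-in speech synthesis: convert text information into voice information.
def synthesize(text: str) -> bytes:
    raise NotImplementedError  # plug in a TTS engine here

def voice_interaction(audio: bytes, reply_for) -> bytes:
    """One interaction turn: recognize the user, form a reply, synthesize it.

    `reply_for` is a hypothetical dialogue policy (text -> text); the patent
    does not specify how replies are chosen."""
    text = recognize(audio)
    reply = reply_for(text)
    return synthesize(reply)  # audio to send to the loudspeaker
```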
Preferably, the affective interaction subsystem 3 comprises a first modeling module, a second modeling module, a machine vision module, and an affective interaction module. The first modeling module is used to establish an emotional space model; the second modeling module determines the emotional energy according to the emotional space model; the machine vision module is used to obtain the user's emotion according to the face information; and the affective interaction module is used to make the robot produce a corresponding emotion change according to the user's emotion.
The first modeling module establishes the emotional space model as follows:
A two-dimensional emotional space model is established. The two dimensions of the space are pleasure and arousal: pleasure indicates how pleasant the emotion is, and arousal indicates the activation degree of the emotion;
The set of the robot's affective states is expressed as $S = \{s_1, s_2, \ldots, s_n\}$, where $s_i$ denotes the robot's $i$-th affective state, $i = 1, 2, \ldots, n$, and $n$ is the number of the robot's affective states. Each robot affective state is described as a point $(a_i, b_i)$ in the two-dimensional emotional space, where $a_i$ is the pleasure of the robot's $i$-th affective state and $b_i$ is its arousal;
The set of the user's affective states is expressed as $R = \{r_1, r_2, \ldots, r_m\}$, where $r_j$ denotes the user's $j$-th affective state, $j = 1, 2, \ldots, m$, and $m$ is the number of the user's affective states. Each user affective state is described as a point $(a_j, b_j)$ in the two-dimensional emotional space, where $a_j$ is the pleasure of the user's $j$-th affective state and $b_j$ is its arousal;
By establishing a two-dimensional emotional space model, this preferred embodiment achieves an accurate expression of affective states while reducing the amount of computation and improving computational efficiency, laying a foundation for the subsequent interaction.
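For illustration, the two state sets $S$ and $R$ can be represented directly as labelled points in the (pleasure, arousal) plane. The concrete states and coordinates below are invented examples; the patent defines the space but does not fix particular states.

```python
from typing import NamedTuple

class AffectiveState(NamedTuple):
    name: str
    a: float  # pleasure of the state
    b: float  # arousal (activation degree) of the state

# Invented example coordinates in the two-dimensional emotional space.
S = [  # robot affective states s_1 .. s_n
    AffectiveState("calm",   0.2, -0.3),
    AffectiveState("happy",  0.8,  0.5),
    AffectiveState("sad",   -0.6, -0.4),
]
R = [  # user affective states r_1 .. r_m
    AffectiveState("angry", -0.7,  0.8),
    AffectiveState("happy",  0.8,  0.5),
]
```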
Preferably, the second modeling module determines the emotional energy according to the emotional space model as follows:
The driving sources of various psychological activities are defined as mental energy, denoted $E$:
$$E = E_1 + E_2$$
where $E_1 = \delta_1 E$ denotes the free mental energy generated spontaneously under suitable conditions, and $E_2 = \delta_2 E$ denotes the controlled mental energy generated under external stimulation; $\delta_1$ denotes the psychological arousal degree, $\delta_2$ denotes the psychological suppression degree, $\delta_1, \delta_2 \in [0, 1]$, and $\delta_1 + \delta_2 = 1$;
The mental energy of an emotion is determined according to the emotional space model:
$$E = \log_2(y+1) \times D(a+b)$$
where $D$ denotes the emotional intensity, $y$ denotes the emotion coefficient, and $a$, $b$ respectively denote the pleasure and arousal of the affective state;
The emotional energy is determined by the following formula:
$$E_q = E_1 + \mu E_2 = (1 - \delta_2 + \mu\delta_2) \times \log_2(y+1) \times D(a+b)$$
where $E_q$ denotes the emotional energy and $\mu$ denotes the psychological emotion excitation parameter, $\mu \in [0, 1]$;
This preferred embodiment defines mental energy and emotional energy, which helps improve the robot's interactive performance and lays a foundation for the subsequent interaction.
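Under the assumed reading that $D(a+b)$ denotes the product $D \times (a+b)$ (the published notation is ambiguous), the emotional-energy formula can be evaluated as in this sketch; the sample parameter values are illustrative only.

```python
import math

def emotional_energy(delta2: float, mu: float, y: float, D: float,
                     a: float, b: float) -> float:
    """E_q = (1 - delta2 + mu*delta2) * log2(y + 1) * D * (a + b).

    Assumes D(a+b) means the product D * (a + b). delta2 is the
    psychological suppression degree, mu the emotion excitation
    parameter, y the emotion coefficient, D the emotional intensity,
    and (a, b) the state's pleasure and arousal."""
    assert 0.0 <= delta2 <= 1.0 and 0.0 <= mu <= 1.0
    return (1.0 - delta2 + mu * delta2) * math.log2(y + 1.0) * D * (a + b)

# Illustrative numbers only:
print(emotional_energy(delta2=0.4, mu=0.5, y=1.0, D=2.0, a=0.8, b=0.5))  # 2.08
```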
Preferably, the affective interaction module makes the robot produce a corresponding emotion change according to the user's emotion, as follows:
When the robot's current affective state is the same as the user's affective state, the robot's affective state does not change, but the robot's emotional energy doubles;
When the robot's current affective state differs from the user's affective state, the robot's next affective state will change. The next affective state is related not only to the robot's current affective state but also closely to the user's affective state. Let the robot's current affective state be $s_i(a_i, b_i)$, $i \in \{1, 2, \ldots, n\}$, the user's affective state be $r_j(a_j, b_j)$, $j \in \{1, 2, \ldots, m\}$, and any possible affective state of the robot at the next moment be $s_k(a_k, b_k)$, $k \in \{1, 2, \ldots, n\}$, with $i \neq j \neq k$;
The feature vector of the transition from the current affective state to the user's affective state is $X_1 = (a_j - a_i, b_j - b_i)$; the feature vector of the transition from the current affective state to any possible affective state is $X_2 = (a_k - a_i, b_k - b_i)$; and the feature vector of the transition from the user's affective state to any possible affective state is $X_3 = (a_k - a_j, b_k - b_j)$. The emotion transfer function $Z$ is determined from these feature vectors;
The transfer function $Z$ is minimized, and the affective state $s_z(a_z, b_z)$ at which $Z$ attains its minimum, where $z$ is the minimizing value of $k$, is taken as the robot's state at the next moment.
In this preferred embodiment the robot can simulate the generation and change of human emotion, conforming to the laws of human emotional change and meeting the needs of human emotion in a home environment. When the robot's current affective state is the same as the user's affective state, the robot's emotional energy increases; when the robot's current affective state differs from the user's affective state, the robot's next affective state changes. The emotion transfer function relates the robot's current affective state to the user's affective state, from which the robot's next affective state is determined, improving the robot's interaction capability.
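Because the formula for the transfer function $Z$ appears only as an image in the original publication, the sketch below substitutes a simple stand-in, $Z = \lVert X_3 \rVert$ (choose the candidate state nearest the user's state), purely to show the selection mechanics; it is not the patent's actual transfer function.

```python
import math
from typing import NamedTuple, Optional, Sequence

class State(NamedTuple):
    a: float  # pleasure
    b: float  # arousal

def transfer_Z(x1, x2, x3) -> float:
    """Stand-in emotion transfer function: ||X3||, the distance from the
    user's state to the candidate state. Replace with the real Z."""
    return math.hypot(*x3)

def next_state(current: State, user: State,
               candidates: Sequence[State]) -> Optional[State]:
    """Pick the candidate next state s_k that minimizes Z. (The equal-state
    branch, where the state is kept and the emotional energy doubles, is
    handled before calling this.)"""
    best, best_z = None, float("inf")
    for cand in candidates:
        if cand == current:
            continue  # the next state must differ from the current one
        x1 = (user.a - current.a, user.b - current.b)
        x2 = (cand.a - current.a, cand.b - current.b)
        x3 = (cand.a - user.a, cand.b - user.b)
        z = transfer_Z(x1, x2, x3)
        if z < best_z:
            best, best_z = cand, z
    return best
```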
The robot with good interaction performance of the present invention was tested in human-computer interaction with five users (user 1 through user 5). Interaction efficiency and user satisfaction were measured and compared against an existing robot; the resulting improvements, i.e. the beneficial effects produced, are shown in the table below:
          Interaction efficiency improvement   User satisfaction improvement
User 1    29%                                  27%
User 2    27%                                  26%
User 3    26%                                  26%
User 4    25%                                  24%
User 5    24%                                  22%
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those skilled in the art should understand that the technical solutions of the present invention can be modified or equivalently replaced without departing from the essence and scope of those technical solutions.

Claims (6)

1. A robot with good interaction performance, characterized by comprising a perception subsystem, a voice interaction subsystem, an affective interaction subsystem, and a control-execution subsystem, wherein the perception subsystem is used to acquire external environment information and comprises a microphone, a high-definition camera, and a lidar; the microphone is used to acquire the user's voice information; the high-definition camera is used to acquire the user's face information; the lidar is used to acquire obstacle information; the voice interaction subsystem is used to conduct voice interaction with the user according to the voice information; the affective interaction subsystem is used to conduct affective interaction with the user according to the face information; and the control-execution subsystem is used to control the robot's movement according to the obstacle information so as to avoid obstacles.
2. The robot with good interaction performance according to claim 1, characterized in that the voice interaction subsystem comprises a speech recognition module, a speech synthesis module, and a loudspeaker; the speech recognition module is used to extract the user's voice information and convert it into recognizable binary machine language; the speech synthesis module is used to convert text information into voice information; and the loudspeaker is used to play the converted voice information.
3. The robot with good interaction performance according to claim 2, characterized in that the affective interaction subsystem comprises a first modeling module, a second modeling module, a machine vision module, and an affective interaction module; the first modeling module is used to establish an emotional space model; the second modeling module determines the emotional energy according to the emotional space model; the machine vision module is used to obtain the user's emotion according to the face information; and the affective interaction module is used to make the robot produce a corresponding emotion change according to the user's emotion.
4. The robot with good interaction performance according to claim 3, characterized in that the first modeling module is used to establish the emotional space model as follows:
A two-dimensional emotional space model is established. The two dimensions of the space are pleasure and arousal: pleasure indicates how pleasant the emotion is, and arousal indicates the activation degree of the emotion;
The set of the robot's affective states is expressed as $S = \{s_1, s_2, \ldots, s_n\}$, where $s_i$ denotes the robot's $i$-th affective state, $i = 1, 2, \ldots, n$, and $n$ is the number of the robot's affective states; each robot affective state is described as a point $(a_i, b_i)$ in the two-dimensional emotional space, where $a_i$ is the pleasure of the robot's $i$-th affective state and $b_i$ is its arousal;
The set of the user's affective states is expressed as $R = \{r_1, r_2, \ldots, r_m\}$, where $r_j$ denotes the user's $j$-th affective state, $j = 1, 2, \ldots, m$, and $m$ is the number of the user's affective states; each user affective state is described as a point $(a_j, b_j)$ in the two-dimensional emotional space, where $a_j$ is the pleasure of the user's $j$-th affective state and $b_j$ is its arousal.
5. The robot with good interaction performance according to claim 4, characterized in that the second modeling module determines the emotional energy according to the emotional space model as follows:
The driving sources of various psychological activities are defined as mental energy, denoted $E$:
$$E = E_1 + E_2$$
where $E_1 = \delta_1 E$ denotes the free mental energy generated spontaneously under suitable conditions, and $E_2 = \delta_2 E$ denotes the controlled mental energy generated under environmental stimulation; $\delta_1$ denotes the psychological arousal degree, $\delta_2$ denotes the psychological suppression degree, $\delta_1, \delta_2 \in [0, 1]$, and $\delta_1 + \delta_2 = 1$;
The mental energy of an emotion is determined according to the emotional space model:
$$E = \log_2(y+1) \times D(a+b)$$
where $D$ denotes the emotional intensity, $y$ denotes the emotion coefficient, and $a$, $b$ respectively denote the pleasure and arousal of the affective state;
The emotional energy is determined by the following formula:
$$E_q = E_1 + \mu E_2 = (1 - \delta_2 + \mu\delta_2) \times \log_2(y+1) \times D(a+b)$$
where $E_q$ denotes the emotional energy and $\mu$ denotes the psychological emotion excitation parameter, $\mu \in [0, 1]$.
6. The robot with good interaction performance according to claim 5, characterized in that the affective interaction module is used to make the robot produce a corresponding emotion change according to the user's emotion, as follows:
When the robot's current affective state is the same as the user's affective state, the robot's affective state does not change, but the robot's emotional energy doubles;
When the robot's current affective state differs from the user's affective state, the robot's next affective state will change. The next affective state is related not only to the robot's current affective state but also closely to the user's affective state. Let the robot's current affective state be $s_i(a_i, b_i)$, $i \in \{1, 2, \ldots, n\}$, the user's affective state be $r_j(a_j, b_j)$, $j \in \{1, 2, \ldots, m\}$, and any possible affective state of the robot at the next moment be $s_k(a_k, b_k)$, $k \in \{1, 2, \ldots, n\}$, with $i \neq j \neq k$;
The feature vector of the transition from the current affective state to the user's affective state is $X_1 = (a_j - a_i, b_j - b_i)$; the feature vector of the transition from the current affective state to any possible affective state is $X_2 = (a_k - a_i, b_k - b_i)$; and the feature vector of the transition from the user's affective state to any possible affective state is $X_3 = (a_k - a_j, b_k - b_j)$; the emotion transfer function $Z$ is determined from these feature vectors;
The transfer function $Z$ is minimized, and the affective state $s_z(a_z, b_z)$ at which $Z$ attains its minimum, where $z$ is the minimizing value of $k$, is taken as the robot's state at the next moment.
CN201810726815.6A 2018-07-04 2018-07-04 A robot with good interaction performance Withdrawn CN108858219A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810726815.6A CN108858219A (en) 2018-07-04 2018-07-04 A robot with good interaction performance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810726815.6A CN108858219A (en) 2018-07-04 2018-07-04 A robot with good interaction performance

Publications (1)

Publication Number Publication Date
CN108858219A 2018-11-23

Family

ID=64299167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810726815.6A Withdrawn CN108858219A (en) 2018-07-04 2018-07-04 A kind of good robot of interaction effect

Country Status (1)

Country Link
CN (1) CN108858219A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101474481A (en) * 2009-01-12 2009-07-08 北京科技大学 Emotional robot system
KR20120098033A (en) * 2011-02-28 2012-09-05 동명대학교산학협력단 System of emotion appraisal and expression based on robot cognition
CN103413113A (en) * 2013-01-15 2013-11-27 上海大学 Intelligent emotional interaction method for service robot
CN104842358A (en) * 2015-05-22 2015-08-19 上海思岚科技有限公司 Autonomous mobile multifunctional robot
CN107065863A (en) * 2017-03-13 2017-08-18 山东大学 A kind of guide to visitors based on face recognition technology explains robot and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
朱明旱 (Zhu Minghan): "Research on Facial Expression Recognition Based on Manifold Learning", Wanfang Data Knowledge Service Platform *
王毅 (Wang Yi): "Research on Human-Robot Interaction and Cooperation Based on Humanoid Robots", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Similar Documents

Publication Publication Date Title
Chen et al. Wearable affective robot
CN107030691B (en) Data processing method and device for nursing robot
US11670324B2 (en) Method for predicting emotion status and robot
CN106997243B (en) Speech scene monitoring method and device based on intelligent robot
Suzuki et al. Intelligent agent system for human-robot interaction through artificial emotion
CN106462254A (en) Robot interaction content generation method, system and robot
CN106875940A A kind of Machine self-learning based on neural net builds knowledge mapping training method
CN108009573A (en) A kind of robot emotion model generating method, mood model and exchange method
CN106462255A (en) A method, system and robot for generating interactive content of robot
CN108510049A (en) The service autonomous cognitive approach of robot based on emotion-space time information and robot
CN106502382A (en) Active exchange method and system for intelligent robot
Wei et al. Designing robot behavior in human robot interaction based on emotion expression
CN106489114A (en) A kind of generation method of robot interactive content, system and robot
CN106462804A (en) Method and system for generating robot interaction content, and robot
CN108919804A (en) A kind of intelligent vehicle Unmanned Systems
CN106537293A (en) Method and system for generating robot interactive content, and robot
CN108858219A (en) A kind of good robot of interaction effect
CN116205294A (en) Knowledge base self-updating method and device for robot social contact and robot
Goto et al. A method for driving humanoid robot based on human gesture
CN108762500A (en) A kind of intelligent robot
Jokinen et al. Embodied communicative activity in cooperative conversational interactions-studies in visual interaction management
Mori et al. Analysis of body behaviours in human-human and human-robot interactions
CN108563138A (en) A kind of intelligent domestic system
Du et al. The management system with emotional virtual human based on smart home
Tanaka et al. Nonverbal Communication Based on Instructed Learning for Socially Embedded Robot Partners

Legal Events

Code  Description
PB01  Publication
SE01  Entry into force of request for substantive examination
WW01  Invention patent application withdrawn after publication (application publication date: 20181123)