CN106662931A - Robot man-machine interactive system, device and method


Info

Publication number
CN106662931A
CN106662931A (application CN201680001719.4A)
Authority
CN
China
Prior art keywords
robot
information
man
variable factor
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201680001719.4A
Other languages
Chinese (zh)
Inventor
杨新宇
王昊奋
邱楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Original Assignee
Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd filed Critical Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Publication of CN106662931A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot human-machine interaction system comprising an input module for receiving instruction information input by a user, a variable-factor identification module for identifying variable factors occurring in the robot, and an output module for outputting the result of the human-machine interaction. The instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information. The variable-factor information is based on the interaction information between the user and the robot. With the invention, the robot's state can be fed back to the user in time, and the robot actively informs the user by means of animation, making human-machine interaction more interesting and intelligent and enhancing its entertainment value.

Description

Robot human-machine interaction system, apparatus, and method
Technical field
The present invention relates to the field of intelligent robot technology, and more particularly to a robot human-machine interaction system, device, and method.
Background technology
With the progress of society, robots are not only widely used in industry, medicine, agriculture, and the military; they have also begun slowly to become part of human social life. Robots in everyday social use appear at event venues and in homes, and at event venues in particular, robot interaction tends to draw a crowd's attention and interest.
At present, robots interact with people mainly in a passive manner, and this passive interaction style makes it difficult for people to sustain an active interest in engaging with them. A social, affective intelligent robot needs to highlight human-like attributes, and active communication is a necessary part of initiating an emotional connection, just as it is between people. Active communication must also capture the points of emotional resonance in human nature: people long to be cared for, to be recognized, and to feel a sense of presence in a community; being emotional and forgetful, favoring convenience, liking small advantages, and wanting immediate results are traits common to most people.
Summary of the invention
The invention discloses a robot human-machine interaction system, including: a robot end for receiving instruction information sent by a user, wherein the robot end also contains variable-factor information and outputs interaction results based on the robot's variable factors; and a user end for sending indication information to the robot and interacting with the robot.
Preferably, the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
Preferably, the variable-factor information is based on the interaction information between the user and the robot.
Preferably, the variable factors affect the output result of the human-machine interaction.
The invention discloses a robot human-machine interaction method, including: sending indication information to a robot; receiving the user's instruction information; and outputting interaction results based on the robot's variable factors.
Preferably, the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
Preferably, the variable-factor information is based on the interaction information between the user and the robot.
The invention discloses a robot human-machine interaction device, including: an input module for receiving instruction information input by the user; a variable-factor identification module for identifying variable factors occurring in the robot; and an output module for outputting the result of the human-machine interaction.
Preferably, the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
Preferably, the variable-factor information is based on the interaction information between the user and the robot.
The invention also discloses a robot human-machine interaction method: receiving the user's instruction information; identifying the robot's variable-factor information; and outputting interaction results according to the variable-factor information.
Preferably, the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
Preferably, the variable-factor information is based on the interaction information between the user and the robot.
The human-machine interaction system, device, and method disclosed by the invention help feed the robot's state back to the user in time; the robot actively informs the user by means of animation, which makes human-machine interaction more interesting and intelligent and enhances its entertainment value.
Description of the drawings
To describe the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a block diagram of a robot human-machine interaction system according to an embodiment of the invention.
Fig. 2 is a module block diagram of a robot human-machine interaction device according to an embodiment of the invention.
Fig. 3 is a flow chart of a robot human-machine interaction method according to an embodiment of the invention.
Fig. 4 is a flow chart of another robot human-machine interaction method according to an embodiment of the invention.
Detailed description of the embodiments
The technical solution of the present invention is described in further detail below with reference to the drawings and specific embodiments. Obviously, the described embodiments are only some of the embodiments of the invention, not all of them. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the present invention without creative effort fall within the scope of protection of the invention.
Fig. 1 shows a robot human-machine interaction system according to an embodiment of the invention. As shown in Fig. 1, the system includes a robot end 110 and a user end 120. The robot end 110 is connected to the user end 120, and the two can send instruction information, interaction information, and the like to each other. The robot end 110 may be a physical robot, a virtual robot, or another intelligent device; the user end 120 may be an electronic device controlled by the user, such as a smartphone or an iPad, but is not limited to such devices.
Fig. 2 is a module block diagram of a robot human-machine interaction device according to an embodiment of the invention; it will be described with reference to Fig. 1. As shown in Fig. 2, the robot end 110 includes an input module 210, a variable-factor identification module 220, and an output module 230. The input module 210 receives instruction information input by the user, where the user input is multi-modal; "multi-modal input" here includes, but is not limited to, video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, and light-sensing information. The variable-factor identification module 220 is coupled to the input module 210 and identifies variable factors that occur in the robot, such as the robot suffering a sprain or a change in its schedule, but is not limited to these. With multi-modal input (sound, pictures, scenes, and so on), the variable factors are based on the interaction information between the user and the robot. In one embodiment, because the robot has exercised or danced too many times, the probability of problems such as twisting a foot increases; the human-machine interaction then changes accordingly: where the robot originally promised to dance, it now informs the user that it cannot dance and conveys an apologetic action or expression to the output module. The output module 230 is coupled to the variable-factor identification module and outputs the result of the human-machine interaction: for example, it carries out the instruction sent by the user, such as dancing or walking, or, after considering the variable factors, outputs an apologetic action or expression instead.
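The input/variable-factor/output pipeline described for Fig. 2 can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the patent: the class and method names (`InteractionDevice`, `identify_variable_factors`, `interact`) and the specific threshold are hypothetical, chosen only to mirror the embodiment's dancing example.

```python
# Illustrative sketch of the pipeline described for Fig. 2 (input module,
# variable-factor identification module, output module). All names and the
# threshold value are hypothetical; the patent specifies no implementation.

class InteractionDevice:
    def __init__(self):
        # Variable factors accumulated from past interaction, e.g. how
        # many times the robot has recently danced.
        self.dance_count = 0
        self.injured = False

    def identify_variable_factors(self):
        # Dancing too many times raises the chance of a sprain
        # (the embodiment's example of a variable factor).
        if self.dance_count >= 3:
            self.injured = True

    def interact(self, instruction):
        # Input module: receive the user's (multi-modal) instruction.
        self.identify_variable_factors()
        # Output module: either carry out the instruction or apologize.
        if instruction == "dance":
            if self.injured:
                return "sorry, I cannot dance right now"
            self.dance_count += 1
            return "dancing"
        return f"performing: {instruction}"


robot = InteractionDevice()
for _ in range(4):
    print(robot.interact("dance"))
# The first three requests produce "dancing"; the fourth triggers the
# variable factor and produces the apology instead.
```

The point of the sketch is only the control flow: the output depends not just on the user's instruction but on state the robot accumulated from earlier interaction.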
Fig. 3 is a flow chart of a robot human-machine interaction method according to an embodiment of the invention; it will be described with reference to Fig. 1. As shown in Fig. 3, in step S301 the robot receives the user's instruction information, including but not limited to video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, and light-sensing information. In step S302, interaction results are output based on the variable factors. In one embodiment, the variable factors are based on the interaction information between the user and the robot: for example, because the robot has exercised or danced too many times, the probability of problems such as twisting a foot increases; the human-machine interaction then changes accordingly, from originally promising to dance to outputting to the user a message that it cannot dance.
Fig. 4 is a flow chart of another robot human-machine interaction method according to an embodiment of the invention; it will be described with reference to Fig. 2 and mainly includes the following steps. S401: receive the instruction information sent by the user; S402: identify the variable factors; S403: output interaction results based on the variable factors. The instruction information includes, but is not limited to, video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, and light-sensing information.
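Steps S401 to S403 can be summarized as a single function. Again this is only an illustrative sketch: the function name, the representation of the interaction history, and the "overworked" threshold are all hypothetical, standing in for whatever variable-factor identification a real implementation would use.

```python
# Illustrative sketch of steps S401-S403: receive the instruction (S401),
# identify variable factors from past interaction (S402), and output the
# interaction result (S403). Names and thresholds are hypothetical.

def handle_instruction(instruction, interaction_history):
    # S402: derive variable factors from the user-robot interaction history.
    factors = {"overworked": interaction_history.count("dance") >= 3}
    # S403: output the interaction result based on the variable factors.
    if instruction == "dance" and factors["overworked"]:
        return "cannot dance: risk of injury"
    return f"ok: {instruction}"


print(handle_instruction("dance", []))              # a fresh robot complies
print(handle_instruction("dance", ["dance"] * 3))   # an overworked robot declines
```

A real device would replace the history list with the multi-modal interaction log the embodiments describe; the structure of the three steps stays the same.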
The human-machine interaction method disclosed by the invention helps feed the robot's state back to the user in time, and the robot actively informs the user by means of animation; human-machine interaction becomes more interesting and intelligent, and its entertainment value is increased.
The above discloses only preferred embodiments of the present invention, which of course cannot limit the scope of the invention's rights; equivalent variations made according to the claims of the present invention therefore still fall within the scope covered by the invention.

Claims (13)

1. A robot human-machine interaction system, comprising:
a robot end for receiving instruction information sent by a user, the robot end also containing variable-factor information and outputting interaction results based on the robot's variable factors; and
a user end for sending indication information to the robot and interacting with the robot.
2. The robot human-machine interaction system according to claim 1, characterized in that the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
3. The robot human-machine interaction system according to claim 2, characterized in that the variable-factor information is based on the interaction information between the user and the robot.
4. The robot human-machine interaction system according to claim 1, characterized in that the variable factors affect the output result of the human-machine interaction.
5. A robot human-machine interaction method, comprising:
sending indication information to a robot;
receiving the user's instruction information; and
outputting interaction results based on the robot's variable factors.
6. The robot human-machine interaction method according to claim 5, characterized in that the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
7. The robot human-machine interaction method according to claim 5, characterized in that the variable-factor information is based on the interaction information between the user and the robot.
8. A robot human-machine interaction device, comprising:
an input module for receiving instruction information input by a user;
a variable-factor identification module for identifying variable factors occurring in the robot; and
an output module for outputting the result of the human-machine interaction.
9. The robot human-machine interaction device according to claim 8, characterized in that the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
10. The robot human-machine interaction device according to claim 8, characterized in that the variable-factor information is based on the interaction information between the user and the robot.
11. A robot human-machine interaction method, characterized by comprising:
receiving the user's instruction information;
identifying the robot's variable-factor information; and
outputting interaction results according to the variable-factor information.
12. The robot human-machine interaction method according to claim 11, characterized in that the instruction information includes video, face, facial-expression, scene, voiceprint, fingerprint, iris/pupil, or light-sensing information.
13. The robot human-machine interaction method according to claim 11, characterized in that the variable-factor information is based on the interaction information between the user and the robot.
CN201680001719.4A 2016-07-07 2016-07-07 Robot man-machine interactive system, device and method Pending CN106662931A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/089231 WO2018006380A1 (en) 2016-07-07 2016-07-07 Human-machine interaction system, device, and method for robot

Publications (1)

Publication Number Publication Date
CN106662931A true CN106662931A (en) 2017-05-10

Family

ID=58838047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680001719.4A Pending CN106662931A (en) 2016-07-07 2016-07-07 Robot man-machine interactive system, device and method

Country Status (2)

Country Link
CN (1) CN106662931A (en)
WO (1) WO2018006380A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100084734A (en) * 2009-01-19 2010-07-28 한국과학기술원 The emotion expression robot which can interact with human
US20110004577A1 (en) * 2009-07-02 2011-01-06 Samsung Electronics Co., Ltd. Emotion model, apparatus, and method for adaptively modifying personality features of emotion model
CN102103707A (en) * 2009-12-16 2011-06-22 群联电子股份有限公司 Emotion engine, emotion engine system and control method of electronic device
CN103218654A (en) * 2012-01-20 2013-07-24 沈阳新松机器人自动化股份有限公司 Robot emotion generating and expressing system
CN105082150A (en) * 2015-08-25 2015-11-25 国家康复辅具研究中心 Robot man-machine interaction method based on user mood and intension recognition
CN105345818A (en) * 2015-11-04 2016-02-24 深圳好未来智能科技有限公司 3D video interaction robot with emotion module and expression module

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9569976B2 (en) * 2012-10-02 2017-02-14 Gavriel Yaacov Krauss Methods circuits, devices and systems for personality interpretation and expression
CN105490918A (en) * 2015-11-20 2016-04-13 深圳狗尾草智能科技有限公司 System and method for enabling robot to interact with master initiatively
CN105511608B (en) * 2015-11-30 2018-12-25 北京光年无限科技有限公司 Exchange method and device, intelligent robot based on intelligent robot
CN105598972B (en) * 2016-02-04 2017-08-08 北京光年无限科技有限公司 A kind of robot system and exchange method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107340859A (en) * 2017-06-14 2017-11-10 北京光年无限科技有限公司 The multi-modal exchange method and system of multi-modal virtual robot
CN107340859B (en) * 2017-06-14 2021-04-06 北京光年无限科技有限公司 Multi-modal interaction method and system of multi-modal virtual robot
CN110328667A (en) * 2019-04-30 2019-10-15 北京云迹科技有限公司 Control method and device for robot

Also Published As

Publication number Publication date
WO2018006380A1 (en) 2018-01-11

Similar Documents

Publication Publication Date Title
CN111833418B (en) Animation interaction method, device, equipment and storage medium
US11600033B2 (en) System and method for creating avatars or animated sequences using human body features extracted from a still image
CN109902659B (en) Method and apparatus for processing human body image
WO2020063009A1 (en) Image processing method and apparatus, storage medium, and electronic device
CN107728780A (en) A kind of man-machine interaction method and device based on virtual robot
JP2019145108A (en) Electronic device for generating image including 3d avatar with facial movements reflected thereon, using 3d avatar for face
CN109919888A (en) A kind of method of image co-registration, the method for model training and relevant apparatus
CN104777911B (en) A kind of intelligent interactive method based on holographic technique
CN110443167B (en) Intelligent recognition method and intelligent interaction method for traditional culture gestures and related devices
CN107632706A (en) The application data processing method and system of multi-modal visual human
CN113508369A (en) Communication support system, communication support method, communication support program, and image control program
CN110418095B (en) Virtual scene processing method and device, electronic equipment and storage medium
WO2021227916A1 (en) Facial image generation method and apparatus, electronic device, and readable storage medium
CN109327737A (en) TV programme suggesting method, terminal, system and storage medium
CN106471444A (en) A kind of exchange method of virtual 3D robot, system and robot
CN109584992A (en) Exchange method, device, server, storage medium and sand play therapy system
CN106502382A (en) Active exchange method and system for intelligent robot
CN108052250A (en) Virtual idol deductive data processing method and system based on multi-modal interaction
CN113362263A (en) Method, apparatus, medium, and program product for changing the image of a virtual idol
CN113556603B (en) Method and device for adjusting video playing effect and electronic equipment
CN110677610A (en) Video stream control method, video stream control device and electronic equipment
CN109343695A (en) Exchange method and system based on visual human's behavioral standard
CN108388889A (en) Method and apparatus for analyzing facial image
CN107066081A (en) The interaction control method and device and virtual reality device of a kind of virtual reality system
CN108037825A (en) The method and system that a kind of virtual idol technical ability is opened and deduced

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 301, Building 39, 239 Renmin Road, Gusu District, Suzhou City, Jiangsu Province, 215000

Applicant after: Suzhou Dogweed Intelligent Technology Co., Ltd.

Address before: 518000 Dongfang Science and Technology Building 1307-09, 16 Keyuan Road, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen green bristlegrass intelligence Science and Technology Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20170510
