CN107009362A - Robot control method and device - Google Patents

Robot control method and device

Info

Publication number
CN107009362A
CN107009362A (application CN201710385759.XA / CN201710385759A; also published as CN 107009362 A)
Authority
CN
China
Prior art keywords
emoticon
answer
behavior
default
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710385759.XA
Other languages
Chinese (zh)
Inventor
谢行
康平陆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Asimov Technology Co Ltd
Original Assignee
Shenzhen Asimov Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Asimov Technology Co Ltd filed Critical Shenzhen Asimov Technology Co Ltd
Priority to CN201710385759.XA priority Critical patent/CN107009362A/en
Publication of CN107009362A publication Critical patent/CN107009362A/en
Pending legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

Abstract

The present invention relates to a robot control method and device. The robot receives input information from a user, obtains, according to the input information, an answer that matches the input information and contains an emoticon, obtains the emoticon in the answer, obtains the preset behavior corresponding to the emoticon from a preset behavior library, and performs the preset behavior. The robot can thus directly obtain and perform the preset behavior corresponding to the emoticon. Complicated sequences and commands are not needed to control the robot's limb behavior, which greatly reduces development cost.

Description

Robot control method and device
Technical field
The present invention relates to the field of robot technology, and in particular to a robot control method and device.
Background art
Robots (especially humanoid robots) interact with users in many ways, including voice, screen display, and physical behavior, and their operating scenarios are complex and varied, so robot control methods are likewise complex and varied. Traditional methods rely mainly on programming languages such as Python/C++; to control a robot's limb behavior, developers must memorize complicated sequences and commands. The approach is therefore extremely complex and its learning cost is high; generally only robot software engineers or algorithm researchers with long professional training can master it. As a result, the development cost of robot control methods is very high, and a robot control method that can reduce development cost is urgently needed.
Summary of the invention
In view of the above technical problem, it is necessary to provide a robot control method and device that can reduce development cost.
A robot control method, the method comprising:
receiving input information;
obtaining, according to the input information, an answer that matches the input information and contains an emoticon;
obtaining the emoticon in the answer, and obtaining, according to the emoticon, a preset behavior corresponding to the emoticon in a preset behavior library;
performing the preset behavior.
In one of the embodiments, obtaining, according to the input information, the answer that matches the input information and contains an emoticon includes:
obtaining, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
In one of the embodiments, obtaining, according to the input information, the answer that matches the input information and contains an emoticon includes:
obtaining, according to the input information, a second answer matching the input information;
performing feature extraction on the input information, and performing sentiment analysis on the extracted features;
obtaining, according to the result of the sentiment analysis, an emoticon matching the result, and inserting the emoticon into the second answer.
In one of the embodiments, obtaining, according to the result of the sentiment analysis, the emoticon matching the result and inserting the emoticon into the second answer includes:
obtaining, according to the result of the sentiment analysis, a preset emoticon group matching the result;
obtaining, from the matching preset emoticon group, the emoticon that reaches a preset matching degree, and inserting the emoticon into the second answer.
In one of the embodiments, obtaining, according to the emoticon, the preset behavior corresponding to the emoticon in the preset behavior library includes:
according to a preset mapping table containing emoticon identifiers and behavior identifiers, looking up the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and obtaining, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
A robot control device, the device comprising:
an input-information receiving module for receiving input information;
an answer acquisition module for obtaining, according to the input information, an answer that matches the input information and contains an emoticon;
a preset-behavior acquisition module for obtaining the emoticon in the answer and obtaining, according to the emoticon, the preset behavior corresponding to the emoticon in a preset behavior library;
a preset-behavior execution module for performing the preset behavior.
In one of the embodiments, the answer acquisition module is further used to obtain, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
In one of the embodiments, the answer acquisition module includes:
a second-answer acquisition module for obtaining, according to the input information, a second answer matching the input information;
a feature extraction and sentiment analysis module for performing feature extraction on the input information and performing sentiment analysis on the extracted features;
a matching module for obtaining, according to the result of the sentiment analysis, an emoticon matching the result, and inserting the emoticon into the second answer.
In one of the embodiments, the matching module is further used to obtain, according to the result of the sentiment analysis, a preset emoticon group matching the result, to obtain from the matching preset emoticon group the emoticon that reaches a preset matching degree, and to insert the emoticon into the second answer.
In one of the embodiments, the preset-behavior acquisition module is further used to look up, according to a preset mapping table containing emoticon identifiers and behavior identifiers, the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and to obtain, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
With the above robot control method and device, the robot receives input information and obtains, according to the input information, an answer that matches the input information and contains an emoticon. It obtains the emoticon in the answer, obtains the preset behavior corresponding to the emoticon in the preset behavior library, and performs the preset behavior. Because the answer the robot obtains contains an emoticon matching the input information, and the preset behavior library stores the preset behavior corresponding to each emoticon, the robot can directly obtain and perform the preset behavior corresponding to the emoticon. Complicated sequences and commands are not needed to control the robot's limb behavior, which greatly reduces development cost.
Brief description of the drawings
Fig. 1 is a flowchart of a robot control method in one embodiment;
Fig. 2 is a flowchart of obtaining an answer that matches the input information and contains an emoticon in Fig. 1;
Fig. 3 is a flowchart of obtaining the emoticon matching the result of the sentiment analysis and inserting the emoticon into the second answer in Fig. 2;
Fig. 4 is a schematic structural diagram of a robot control device in one embodiment;
Fig. 5 is a schematic structural diagram of the answer acquisition device in Fig. 4.
Detailed description of the embodiments
To make the purposes, features, and advantages of the present invention easier to understand, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Many specific details are set forth in the following description to provide a full understanding of the present invention. However, the invention can be embodied in many ways other than those described herein, and those skilled in the art can make similar improvements without departing from the spirit of the present invention; the present invention is therefore not limited to the specific embodiments disclosed below.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field of the present invention. The terms used in this description are intended only to describe specific embodiments and are not intended to limit the present invention. The technical features of the above embodiments can be combined arbitrarily; for brevity, not all possible combinations of those technical features are described, but as long as a combination is not contradictory, it is considered to be within the scope of this specification.
In one embodiment, as shown in Fig. 1, a robot control method is provided, including:
Step 110: receive input information.
Here, a robot is a robot with some limb-movement capability and language-expression capability. Input information is any one or more kinds of information the user sends to the robot, such as voice, text, images, and video; in other embodiments, the input information can also be other information. The robot receives the input information the user sends to it.
Step 120: obtain, according to the input information, an answer that matches the input information and contains an emoticon.
An emoticon library containing various emoticons is stored in the robot in advance, and question-and-answer data for some basic questions is entered into the robot in advance. The robot obtains an answer according to the user's input information, and the answer contains an emoticon matching the input information. An emoticon is an emoji: a visual emotion symbol that expresses different emotions with vivid small icons. For example, when the user says "hello", the robot obtains the corresponding answer, which contains an emoticon matching "hello", such as an emoticon of extending a hand to shake.
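As a concrete illustration of this retrieval step, the lookup can be sketched as a simple question-and-answer table whose stored answers already carry an emoticon. This is a minimal sketch, not the patent's implementation; the table contents, emoji code points, and function name are illustrative assumptions.

```python
from typing import Optional

# Pre-stored question-and-answer data; each answer was written with an
# emoticon already embedded (illustrative entries, not from the patent).
QA_TABLE = {
    "hello": "Hello! \U0001F91D",           # handshake emoticon
    "goodbye": "See you soon! \U0001F44B",  # waving-hand emoticon
}

def get_answer(user_input: str) -> Optional[str]:
    """Return the pre-stored answer matching the input, or None."""
    return QA_TABLE.get(user_input.strip().lower())
```

A real system would match the input more loosely (e.g. with intent classification) rather than by exact string lookup.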
Step 130: obtain the emoticon in the answer, and obtain, according to the emoticon, the preset behavior corresponding to the emoticon in the preset behavior library.
The robot's preset behavior library stores the robot behavior corresponding to each emoticon, as well as the correspondence between each emoticon and its robot behavior. Robot behaviors include limb actions and voice; in other embodiments, robot behaviors can also include other behaviors. The robot obtains the emoticon in the answer and, according to the correspondence between emoticons and robot behaviors, obtains the preset behavior corresponding to that emoticon in the preset behavior library, for example performing a corresponding limb action and/or producing corresponding speech.
Step 140: perform the preset behavior.
After the preset behavior corresponding to the emoticon is obtained, the robot is controlled to perform it. For example, for the "extend a hand to shake" emoticon, the preset behavior "extend a hand and shake" is obtained, and the robot is controlled to perform the limb action of extending a hand and shaking while producing the speech "hello".
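Steps 130 and 140 can be sketched as a dictionary lookup from emoticon to preset behavior, followed by "executing" each behavior found in the answer. On a real robot the returned action and speech would drive actuators and text-to-speech; all names and entries below are illustrative assumptions.

```python
# Preset behavior library keyed by emoticon (illustrative entries).
BEHAVIOR_LIBRARY = {
    "\U0001F91D": {"action": "extend_hand_and_shake", "speech": "Hello!"},
    "\U0001F44B": {"action": "wave_hand", "speech": "Goodbye!"},
}

def perform_behaviors(answer: str):
    """Find every known emoticon in the answer and 'execute' its behavior."""
    performed = []
    for ch in answer:
        behavior = BEHAVIOR_LIBRARY.get(ch)
        if behavior:
            # A real robot would move its limbs and speak here.
            performed.append((behavior["action"], behavior["speech"]))
    return performed
```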
In this embodiment, because the answer matching the input information contains an emoticon, and the preset behavior library stores the preset behavior corresponding to that emoticon, the robot can directly obtain and perform the preset behavior according to the emoticon. Complicated sequences and commands are not needed to control the robot's limb behavior, which greatly reduces development cost.
In one embodiment, obtaining, according to the input information, the answer that matches the input information and contains an emoticon includes: obtaining, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
In this embodiment, for some input information, an emoticon has been written into the robot's corresponding answer in advance. When the robot receives such input information from the user, the answer it directly obtains already contains an emoticon matching the input information; an answer containing an emoticon matching the input information is called the first answer.
In one embodiment, as shown in Fig. 2, obtaining, according to the input information, the answer that matches the input information and contains an emoticon specifically includes:
Step 121: obtain, according to the input information, a second answer matching the input information.
For some input information, the robot's preset answer does not contain an emoticon matching the input information; that is, the answer the robot obtains for the user's input lacks a matching emoticon. An answer that does not contain an emoticon matching the input information is called the second answer.
Step 122: perform feature extraction on the input information, and perform sentiment analysis on the extracted features.
When the answer the robot obtains does not contain an emoticon matching the input information, feature extraction needs to be performed on the user's input. Specifically, for any one or more kinds of input the user sends to the robot, such as voice, text, images, and video, natural language processing (NLP) and computer vision techniques are used to extract objective features directly from the input, for example key words in speech and their tone and intonation, key words in text, the facial expression in a captured image of the user's face, and the user's limb actions.
Sentiment analysis is then performed on the extracted features. The main purpose of sentiment analysis is to recognize the user's view of a thing or person, which may be an evaluative attitude such as like, dislike, fondness, or desire, or a specific piece of evaluative content, yielding a sentiment-analysis result.
For example, the user sends the robot the voice "I have found a boyfriend" together with a facial expression, and the robot's preset answer does not contain a matching emoticon. Feature extraction is then performed on the user's input: the voice is converted to text, the key words "found" and "boyfriend" are extracted from the text, and features such as "corners of the mouth turned up" and "cheeks raised" are extracted from the facial expression. Sentiment analysis is performed on the extracted features using natural language processing: analyzing the key words "found" and "boyfriend" yields the emotion "happy", and analyzing features such as "corners of the mouth turned up" and "cheeks raised" shows that the face is smiling, which likewise yields the emotion "happy".
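The example above can be mimicked with a toy keyword-based sketch. A real system would use learned NLP and computer-vision models; the cue sets and the simple presence rule below are purely illustrative assumptions.

```python
# Illustrative cue sets; a real system would learn these with NLP /
# computer-vision models rather than hard-code them.
HAPPY_CUES = {"found", "boyfriend", "mouth_corners_up", "cheeks_raised"}

def extract_features(text: str, facial_cues):
    """Combine key words from the text with facial-expression features."""
    return set(text.lower().split()) | set(facial_cues)

def analyse_sentiment(features) -> str:
    """Label the emotion 'happy' when any happy cue is present."""
    return "happy" if features & HAPPY_CUES else "neutral"
```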
Step 123: obtain, according to the result of the sentiment analysis, an emoticon matching the result, and insert the emoticon into the second answer.
The result of the sentiment analysis is matched against the emotions in the emoticon library, the emoticon matching the result is obtained, and the emoticon is inserted into the second answer, so that the second answer then contains an emoticon. For example, for the emotion "happy" obtained by sentiment analysis, "happy" is matched in the emoticon library, the emoticon representing "happy" is obtained, and that emoticon is inserted into the second answer.
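This insertion step can be sketched as a lookup in an emotion-keyed emoticon library followed by appending the emoticon to the second answer. The library contents and emoji code points are illustrative assumptions, not the patent's data.

```python
# Emoticon library keyed by emotion (illustrative entries).
EMOTICON_LIBRARY = {
    "happy": "\U0001F604",  # grinning face
    "sad": "\U0001F622",    # crying face
}

def insert_emoticon(second_answer: str, emotion: str) -> str:
    """Append the emoticon matching the emotion, if one exists."""
    emoticon = EMOTICON_LIBRARY.get(emotion)
    return f"{second_answer} {emoticon}" if emoticon else second_answer
```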
In this embodiment, for some input information, the robot's preset answer does not contain an emoticon. By performing feature extraction and sentiment analysis on the user's input and matching the sentiment-analysis result against the emoticon library, an emoticon matching the result is obtained. Adding an emoticon to an answer the robot computed without one lays the foundation for subsequently invoking and performing the preset behavior corresponding to the emoticon.
In one embodiment, as shown in Fig. 3, obtaining, according to the result of the sentiment analysis, the emoticon matching the result and inserting it into the second answer specifically includes:
Step 123a: obtain, according to the result of the sentiment analysis, a preset emoticon group matching the result.
The emoticons in the emoticon library are classified by emotion in advance into different preset emoticon groups, for example groups for happy, sad, dejected, angry, and embarrassed, with each preset emoticon group corresponding to one emotion. The result of the sentiment analysis is matched against the emotions corresponding to the preset emoticon groups to obtain the preset emoticon group matching the result.
Step 123b: obtain, from the matching preset emoticon group, the emoticon that reaches a preset matching degree, and insert the emoticon into the second answer.
After the preset emoticon group matching the sentiment-analysis result is obtained, the best-matching emoticon is selected from that group. Specifically, each emoticon has a corresponding text description; the description is compared with the sentiment-analysis result for similarity to generate a matching degree, and the preset matching degree can be set to the highest matching degree, so obtaining the emoticon that reaches the preset matching degree means obtaining the emoticon with the highest matching degree.
In this embodiment, the emoticons in the emoticon library are divided into different groups by emotion, and the sentiment-analysis result is first matched against the groups to select the matching group. A coarse category is found first, and the best-matching emoticon is then found within that category, progressively narrowing the search range, which makes it easy to find the best-matching emoticon efficiently, quickly, and accurately.
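The two-stage narrowing described above — pick the emotion group first, then the emoticon with the highest matching degree inside it — might be sketched with a text-similarity score standing in for the matching degree. The groups, descriptions, and the use of `difflib` are illustrative assumptions, not the patent's method.

```python
from difflib import SequenceMatcher

# Preset emoticon groups, one per emotion; each emoticon carries a text
# description used to compute a matching degree (illustrative entries).
EMOTICON_GROUPS = {
    "happy": {"\U0001F604": "smiling happy face", "\U0001F389": "party popper"},
    "sad": {"\U0001F622": "crying sad face"},
}

def best_emoticon(sentiment: str) -> str:
    group = EMOTICON_GROUPS[sentiment]  # stage 1: coarse emotion group
    # stage 2: the highest text-similarity ("matching degree") wins
    return max(group, key=lambda e: SequenceMatcher(None, sentiment, group[e]).ratio())
```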
In one embodiment, obtaining, according to the emoticon, the preset behavior corresponding to the emoticon in the preset behavior library includes: according to a preset mapping table containing emoticon identifiers and behavior identifiers, looking up the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and obtaining, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
A mapping table containing emoticon identifiers and behavior identifiers is preset in the robot. An emoticon identifier corresponds to an emoticon, a behavior identifier corresponds to a preset behavior, and emoticon identifiers and behavior identifiers correspond one to one. After the emoticon in the answer is obtained, the mapping table is queried with the emoticon's identifier to find the corresponding behavior identifier, and the corresponding preset behavior is then obtained from that behavior identifier.
In this embodiment, emoticon identifiers and behavior identifiers in the mapping table correspond one to one, that is, emoticons correspond one to one with preset behaviors, so the corresponding preset behavior, which includes the robot's limb actions and voice, can be found accurately through the emoticon. The correspondence between user input and robot behavior is reduced to the correspondence between emoticons and their preset behaviors. Robot behavior is standardized with emoticons, so complicated sequences and commands are not needed to control the robot's behavior according to the input information.
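The one-to-one mapping described above could look like two small tables: emoticon identifier to behavior identifier, then behavior identifier to the preset behavior itself. All identifiers and entries are illustrative assumptions.

```python
# One-to-one mapping table from emoticon identifiers to behavior identifiers
# (illustrative), followed by the preset behavior library keyed by behavior id.
EMO_ID_TO_BEHAVIOR_ID = {
    "emo_handshake": "bhv_001",
    "emo_wave": "bhv_002",
}
BEHAVIOR_LIBRARY = {
    "bhv_001": {"action": "extend_hand_and_shake", "speech": "Hello!"},
    "bhv_002": {"action": "wave_hand", "speech": "Goodbye!"},
}

def behavior_for(emoticon_id: str) -> dict:
    behavior_id = EMO_ID_TO_BEHAVIOR_ID[emoticon_id]  # mapping-table lookup
    return BEHAVIOR_LIBRARY[behavior_id]              # preset-behavior lookup
```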
In one embodiment, as shown in Fig. 4, a robot control device is also provided, the device including: an input-information receiving module 410, an answer acquisition module 420, a preset-behavior acquisition module 430, and a preset-behavior execution module 440.
The input-information receiving module 410 is used to receive input information.
The answer acquisition module 420 is used to obtain, according to the input information, an answer that matches the input information and contains an emoticon.
The preset-behavior acquisition module 430 is used to obtain the emoticon in the answer and to obtain, according to the emoticon, the preset behavior corresponding to the emoticon in the preset behavior library.
The preset-behavior execution module 440 is used to perform the preset behavior.
In one embodiment, the answer acquisition module 420 is further used to obtain, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
In one embodiment, as shown in Fig. 5, the answer acquisition module 420 includes: a second-answer acquisition module 421, a feature extraction and sentiment analysis module 422, and a matching module 423.
The second-answer acquisition module 421 is used to obtain, according to the input information, a second answer matching the input information; the second answer does not contain an emoticon matching the input information.
The feature extraction and sentiment analysis module 422 is used to perform feature extraction on the input information and to perform sentiment analysis on the extracted features.
The matching module 423 is used to obtain, according to the result of the sentiment analysis, an emoticon matching the result, and to insert the emoticon into the second answer.
In one embodiment, the matching module 423 is further used to obtain, according to the result of the sentiment analysis, a preset emoticon group matching the result, to obtain from the matching preset emoticon group the emoticon that reaches a preset matching degree, and to insert the emoticon into the second answer.
In one embodiment, the preset-behavior acquisition module 430 is further used to look up, according to a preset mapping table containing emoticon identifiers and behavior identifiers, the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and to obtain, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
The embodiments described above express only several embodiments of the present invention, and their descriptions are relatively specific and detailed, but they should not be construed as limiting the scope of the patent. It should be pointed out that those of ordinary skill in the art can make various modifications and improvements without departing from the inventive concept, and these all belong to the scope of protection of the present invention. The scope of protection of the present patent should therefore be determined by the appended claims.

Claims (10)

1. A robot control method, the method comprising:
receiving input information;
obtaining, according to the input information, an answer that matches the input information and contains an emoticon;
obtaining the emoticon in the answer, and obtaining, according to the emoticon, a preset behavior corresponding to the emoticon in a preset behavior library;
performing the preset behavior.
2. The method according to claim 1, wherein obtaining, according to the input information, the answer that matches the input information and contains an emoticon includes:
obtaining, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
3. The method according to claim 1, wherein obtaining, according to the input information, the answer that matches the input information and contains an emoticon includes:
obtaining, according to the input information, a second answer matching the input information;
performing feature extraction on the input information, and performing sentiment analysis on the extracted features;
obtaining, according to the result of the sentiment analysis, an emoticon matching the result, and inserting the emoticon into the second answer.
4. The method according to claim 3, wherein obtaining, according to the result of the sentiment analysis, the emoticon matching the result and inserting the emoticon into the second answer includes:
obtaining, according to the result of the sentiment analysis, a preset emoticon group matching the result;
obtaining, from the matching preset emoticon group, the emoticon that reaches a preset matching degree, and inserting the emoticon into the second answer.
5. The method according to claim 1, wherein obtaining, according to the emoticon, the preset behavior corresponding to the emoticon in the preset behavior library includes:
according to a preset mapping table containing emoticon identifiers and behavior identifiers, looking up the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and obtaining, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
6. A robot control device, wherein the device comprises:
an input-information receiving module for receiving input information;
an answer acquisition module for obtaining, according to the input information, an answer that matches the input information and contains an emoticon;
a preset-behavior acquisition module for obtaining the emoticon in the answer and obtaining, according to the emoticon, a preset behavior corresponding to the emoticon in a preset behavior library;
a preset-behavior execution module for performing the preset behavior.
7. The device according to claim 6, wherein the answer acquisition module is further used to obtain, according to the input information, a first answer matching the input information, the first answer containing an emoticon that matches the input information.
8. The device according to claim 6, wherein the answer acquisition module includes:
a second-answer acquisition module for obtaining, according to the input information, a second answer matching the input information;
a feature extraction and sentiment analysis module for performing feature extraction on the input information and performing sentiment analysis on the extracted features;
a matching module for obtaining, according to the result of the sentiment analysis, an emoticon matching the result, and inserting the emoticon into the second answer.
9. The device according to claim 8, wherein the matching module is further used to obtain, according to the result of the sentiment analysis, a preset emoticon group matching the result, to obtain from the matching preset emoticon group the emoticon that reaches a preset matching degree, and to insert the emoticon into the second answer.
10. The device according to claim 6, wherein the preset-behavior acquisition module is further used to look up, according to a preset mapping table containing emoticon identifiers and behavior identifiers, the behavior identifier that has a mapping relationship with the identifier of the emoticon in the answer, and to obtain, according to the found behavior identifier, the preset behavior corresponding to the emoticon in the preset behavior library.
CN201710385759.XA 2017-05-26 2017-05-26 Robot control method and device Pending CN107009362A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710385759.XA CN107009362A (en) 2017-05-26 2017-05-26 Robot control method and device


Publications (1)

Publication Number Publication Date
CN107009362A true CN107009362A (en) 2017-08-04

Family

ID=59451549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710385759.XA Pending CN107009362A (en) 2017-05-26 2017-05-26 Robot control method and device

Country Status (1)

Country Link
CN (1) CN107009362A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109271018A * 2018-08-21 2019-01-25 Beijing Guangnian Wuxian Technology Co., Ltd. Interaction method and system based on virtual human behavioral standards
CN109324688A * 2018-08-21 2019-02-12 Beijing Guangnian Wuxian Technology Co., Ltd. Interaction method and system based on virtual human behavioral standards
CN109343695A * 2018-08-21 2019-02-15 Beijing Guangnian Wuxian Technology Co., Ltd. Interaction method and system based on virtual human behavioral standards

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008107673A * 2006-10-27 2008-05-08 Business Design Kenkyusho KK Conversation robot
CN101474481A * 2009-01-12 2009-07-08 University of Science and Technology Beijing Emotional robot system
CN103218654A * 2012-01-20 2013-07-24 Shenyang Siasun Robot & Automation Co., Ltd. Robot emotion generating and expressing system
CN103413113A * 2013-01-15 2013-11-27 Shanghai University Intelligent emotional interaction method for service robot
CN103488293A * 2013-09-12 2014-01-01 Beihang University Man-machine motion interaction system and method based on expression recognition
CN105498228A * 2016-01-14 2016-04-20 Hu Wenjie Intelligent robot learning toy
CN106625678A * 2016-12-30 2017-05-10 Capital Normal University Robot expression control method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Edited by the Chinese Mechanical Engineering Society: "History of Chinese Machinery: Technology Volume" (《中国机械史 技术卷》), 30 November 2014 *

Similar Documents

Publication Publication Date Title
CN108000526B (en) Dialogue interaction method and system for intelligent robot
US10733381B2 (en) Natural language processing apparatus, natural language processing method, and recording medium for deducing semantic content of natural language elements based on sign language motion
CN108962255B (en) Emotion recognition method, emotion recognition device, server and storage medium for voice conversation
CN105843381B (en) Data processing method for realizing multi-modal interaction and multi-modal interaction system
CN107516533A Session information processing method and device, and electronic device
CN109791549A (en) Machine customer interaction towards dialogue
CN109036405A Voice interaction method, device, equipment and storage medium
JP3346799B2 (en) Sign language interpreter
CN108009490A Judgment method for a chat robot system based on emotion recognition, and the system
CN105046238A (en) Facial expression robot multi-channel information emotion expression mapping method
CN107193948B Human-computer dialogue data analysis method and device
CN113259780B (en) Holographic multidimensional audio and video playing progress bar generating, displaying and playing control method
CN109389005A (en) Intelligent robot and man-machine interaction method
Li Multi-scenario gesture recognition using Kinect
CN107009362A (en) Robot control method and device
CN110309254A (en) Intelligent robot and man-machine interaction method
CN111368053A (en) Mood pacifying system based on legal consultation robot
CN108108391A (en) For the processing method and device of the information of data visualization
CN107730082A Intelligent task allocation method and device, and instant messaging tool
Hahn et al. Learning to localize and align fine-grained actions to sparse instructions
CN108762480A Input method and electronic device
Febriansyah et al. SER: speech emotion recognition application based on extreme learning machine
Yabunaka et al. Facial expression sequence recognition for a japanese sign language training system
CN107992825A Method and system for augmented-reality-based face recognition
CN115171673A (en) Role portrait based communication auxiliary method and device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170804