CN104493827A - Intelligent cognitive robot and cognitive system thereof - Google Patents

Intelligent cognitive robot and cognitive system thereof

Info

Publication number
CN104493827A
CN104493827A (application CN201410660457.5A)
Authority
CN
China
Prior art keywords
module
emotion
understanding
order
issuing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410660457.5A
Other languages
Chinese (zh)
Inventor
杨利
陈思鑫
王思一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
7th Middle School Of Quanzhou City Of Fujian Province
Original Assignee
7th Middle School Of Quanzhou City Of Fujian Province
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 7th Middle School Of Quanzhou City Of Fujian Province filed Critical 7th Middle School Of Quanzhou City Of Fujian Province
Priority to CN201410660457.5A priority Critical patent/CN104493827A/en
Publication of CN104493827A publication Critical patent/CN104493827A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses an intelligent cognitive robot and a cognitive system thereof. The cognitive system comprises understanding modules and emotion modules: the understanding modules include a speech understanding module, a visual understanding module and a tactile understanding module, and the emotion modules include a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module. An emotion weight is determined for each external command; according to that weight, the emotion modules retrieve the corresponding information from a sound database, an expression database and/or a behavior database and control the corresponding actuators to make an emotional response. The invention further discloses an intelligent cognitive robot adopting the cognitive system. The robot responds to external commands received through sound, images and touch, can carry on question-and-answer interaction with the commander, and can answer the emotion weight of a command with emotionally expressive actions and sounds of its head, four limbs and/or trunk.

Description

Intelligent cognitive robot and cognitive system thereof
Technical field
The present invention relates to an intelligent cognitive robot, and in particular to a cognitive system that can interact with the external environment.
Background art
With the development of science and technology, robots are moving out of the laboratory and into people's everyday lives. Robots are varied and powerful and have been applied in many industries, such as manufacturing, the service sector and other special trades; in human-robot interaction, however, an interaction model based on emotion cognition is still lacking.
Summary of the invention
In view of this, an object of the present invention is to provide a cognitive system for an intelligent cognitive robot that can respond with emotion to commands issued from the outside.
A further object of the invention is to provide an intelligent cognitive robot that can respond with emotion to commands issued from the outside.
In order to achieve the above objects, the solution of the present invention is as follows.
A cognitive system for an intelligent cognitive robot comprises an understanding module and an emotion module. The understanding module comprises:
a speech understanding module, which understands the intention behind a spoken command through word segmentation, keyword extraction, sentence segmentation and tone analysis;
a visual understanding module, which processes the image accompanying a command by coordinate positioning, grayscale transformation, binarization, contour description, data comparison and probability analysis, so as to track the commander and understand the intention of the command;
a tactile understanding module, which tracks the commander and understands the intention of the command from the position, force and direction of an external touch applied to the robot.
The emotion module comprises a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module.
The sound emotion cognition module compares the output of the speech understanding module with the information stored in a sound database to determine an emotion weight;
the expression emotion cognition module compares the output of the visual understanding module with the information stored in an expression database to determine an emotion weight;
the tactile emotion cognition module compares the output of the tactile understanding module with the information stored in a behavior database to determine an emotion weight.
According to the emotion weight, the corresponding information is retrieved from the sound database, the expression database and/or the behavior database, and the corresponding actuators are controlled to make an emotional response.
An intelligent cognitive robot comprises a head, a trunk and four limbs articulated in humanoid fashion, and further comprises the cognitive system, which comprises an understanding module and an emotion module.
The understanding module comprises:
a speech understanding module, which understands the intention behind a spoken command through word segmentation, keyword extraction, sentence segmentation and tone analysis;
a visual understanding module, which processes the image accompanying a command by coordinate positioning, grayscale transformation, binarization, contour description, data comparison and probability analysis, so as to track the commander and understand the intention of the command;
a tactile understanding module, which tracks the commander and understands the intention of the command from the position, force and direction of an external touch applied to the robot.
The emotion module comprises a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module.
The sound emotion cognition module compares the output of the speech understanding module with the information stored in a sound database to determine an emotion weight;
the expression emotion cognition module compares the output of the visual understanding module with the information stored in an expression database to determine an emotion weight;
the tactile emotion cognition module compares the output of the tactile understanding module with the information stored in a behavior database to determine an emotion weight.
According to the emotion weight, the corresponding information is retrieved from the sound database, the expression database and/or the behavior database, and the corresponding voice playback unit and the head and neck, four limbs and/or trunk of the robot are controlled to make an emotional response.
With the above scheme, the cognitive system of the intelligent cognitive robot of the present invention has the following beneficial effects: through sound, images and touch, the cognitive system can react appropriately to external commands and interact with people; it can not only carry on question-and-answer interaction with external commands, but can also control the actuators, according to the emotion weight of a command, to answer with emotionally expressive actions and sounds.
With the above scheme, the intelligent cognitive robot of the present invention has the following beneficial effects: through sound, images and touch, the robot can react appropriately to external commands and interact with people; it can not only carry on question-and-answer interaction with external commands, but its head and neck, four limbs and/or trunk can also answer the emotion weight of a command with emotionally expressive actions and sounds. The humanoid, articulated head, trunk and four limbs give the robot freedom of movement, widen its range of activity, and make it easier to develop and extend further functions.
Brief description of the drawings
Fig. 1 is a schematic structural view of the present invention.
In the figure: head 1, eyes 11, nose 12, ear 13, mouth 14, eyebrow 15, trunk 2, four limbs 3.
Detailed description of the invention
In order to explain the technical solution of the present invention further, the invention is described in detail below through specific embodiments.
As shown in Fig. 1, the intelligent cognitive robot of the present invention comprises a head 1, a trunk 2 and four limbs 3 articulated in humanoid fashion; these constitute the actuators. The head 1 is connected to the trunk 2 through a neck, and the neck is mounted on the trunk 2 by a rotary joint, such as a bearing, that can turn freely through any angle. The head is mounted on the neck by a similar damped rotary joint so that it can swing up and down through a set angle; the head 1 can therefore swing up and down and turn left and right. The head carries an anthropomorphic face with eyes 11, a nose 12, ears 13, a mouth 14 and eyebrows 15. The face, head 1, trunk 2 and four limbs 3 are each driven by corresponding servos, realizing emotionally expressive, humanlike actions such as nodding, shaking the head, moving the eyebrows, rolling the eyes, closing the mouth, pricking up or drooping the ears, walking, stretching and bending the arms, and swinging the arms.
The present invention further comprises a cognitive system, which controls the above actuators so that they react appropriately to external commands. The cognitive system comprises an understanding module and an emotion module. An external command may be issued by a person, or it may be the sound, image or action produced by an audio-video player that has been set up for the purpose.
The understanding module comprises a speech understanding module, a visual understanding module, a tactile understanding module and a learning module.
The speech understanding module receives spoken commands from the commander, such as voice, sentences, music or recorded playback, through a recording terminal such as a microphone. By segmenting the command into words, extracting keywords, segmenting sentences and judging the tone, it understands the intention of the command.
The Chinese word segmentation algorithm is a dictionary-and-rule based method: the character string of the command is matched, item by item, against the entries stored in the sound database, and a match succeeds when the string is found in the database. Word extraction takes the first m characters of the Chinese sentence, from left to right, as the matching field, where m is the length of the longest entry in the sound database.
The sound database is then searched for a match. If the match succeeds, the matching field is cut out as one word. If it fails, the last character of the matching field is removed, the remaining string becomes the new matching field, and matching is attempted again; the process repeats until every word has been cut out, as sketched below.
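A minimal sketch of this forward maximum-matching procedure is given here; the dictionary contents, sample sentence and function name are illustrative assumptions, not taken from the patent.

```python
def max_match_segment(sentence, dictionary):
    """Forward maximum matching: repeatedly cut out the longest dictionary
    entry that prefixes the remaining text, as described above."""
    max_len = max(len(w) for w in dictionary)   # m: longest entry in the sound database
    words = []
    while sentence:
        field = sentence[:max_len]              # first m characters as the matching field
        # Drop the last character until the field matches a dictionary entry
        while len(field) > 1 and field not in dictionary:
            field = field[:-1]
        words.append(field)                     # single characters fall through as words
        sentence = sentence[len(field):]
    return words

# Illustrative dictionary and command (placeholders)
dictionary = {"你好", "我们", "做", "朋友"}
print(max_match_segment("你好我们做朋友", dictionary))   # ['你好', '我们', '做', '朋友']
```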
Command keywords are obtained by first assigning weights to common words in the sound database and then comparing the weights to obtain the most probable semantic tendency. The keywords are assigned to preset context banks in the sound database, such as banks for having a meal, resting, going out, coming home, entertainment, greetings, travel, inquiries and encyclopedic knowledge; each keyword in a context bank has response words and phrases associated with it as answers. When the robot identifies the context bank corresponding to a keyword, it looks up the phrase that matches the keyword and uses it as the answer, thereby interacting with the commander.
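The keyword-weighting and context-bank lookup might be sketched as follows; the weights, bank names and replies are invented placeholders, not the patent's actual databases.

```python
# Hypothetical keyword table: word -> (context bank, weight)
KEYWORD_WEIGHTS = {"你好": ("greeting", 0.8), "吃饭": ("having a meal", 0.9), "旅游": ("tourism", 0.7)}
# Hypothetical context banks: bank -> {keyword: reply}
CONTEXT_BANKS = {
    "greeting": {"你好": "你好，很高兴见到你"},
    "having a meal": {"吃饭": "该吃饭了，想吃什么？"},
}

def answer(words):
    """Pick the segmented word with the largest weight and look up its reply."""
    scored = [(KEYWORD_WEIGHTS[w][1], w) for w in words if w in KEYWORD_WEIGHTS]
    if not scored:
        return None
    _, keyword = max(scored)
    bank = CONTEXT_BANKS.get(KEYWORD_WEIGHTS[keyword][0], {})
    return bank.get(keyword)

print(answer(["你好", "我们", "做", "朋友"]))   # 你好，很高兴见到你
```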
English word segmentation, keyword extraction, sentence segmentation, tone judgment and semantic understanding are handled in a way similar to Chinese and are not repeated here.
In addition, the Microsoft Chinese speech SDK can be used to capture the commander's spoken commands; a VB program then sends the corresponding serial data instruction to a single-chip microcomputer, which analyzes it and executes the corresponding command.
The visual understanding module captures external images through an image acquisition unit mounted on the head, for example the commander's facial expression, an article the commander shows, a pattern drawn freehand by the commander, or images played by an external audio-video terminal. The module processes the images by coordinate positioning, grayscale transformation, binarization, contour description, data comparison and probability analysis, so as to track the commander and understand his or her intention. The image acquisition unit may be an ordinary camera or a zoom camera, which can adapt to recognition at different distances.
The image acquisition unit is driven by the ezVidC60.ocx control, and the captured images are acquired and stored by a VB image acquisition program.
Image processing proceeds as follows. First, the acquired image is coordinate-positioned to locate the digits in the image; manual adjustment on a display screen mounted on the robot, or on an external one, helps to improve recognition accuracy. Second, the RGB values of the captured color image are converted according to the formula Gray = r x 0.39 + g x 0.5 + b x 0.11 to obtain a grayscale image, where Gray is the gray value and r, g and b are the red, green and blue components. Finally, a threshold is chosen according to the ambient light at acquisition time and the image is binarized; a threshold of about 145 is preferred and suits most environments, and the adjustment of this threshold directly affects recognition. The binarized image is then contour-described. For example, during recognition and calculation, the visual understanding module automatically segments the captured handwritten-digit image into blocks, compares them with the digits and logical operators stored in the sound database, and identifies the digits and operators and computes the result by probability analysis.
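The grayscale conversion and threshold binarization described above amount to the following sketch; NumPy and the sample image are used here purely for illustration.

```python
import numpy as np

def to_gray(rgb_image):
    """Grayscale conversion with the weights given in the description:
    Gray = r * 0.39 + g * 0.5 + b * 0.11."""
    r, g, b = rgb_image[..., 0], rgb_image[..., 1], rgb_image[..., 2]
    return r * 0.39 + g * 0.5 + b * 0.11

def binarize(gray_image, threshold=145):
    """Binarize with a threshold tuned to the ambient light; ~145 suits most scenes."""
    return (gray_image >= threshold).astype(np.uint8)   # 1 = bright, 0 = dark

# Illustrative 2x2 RGB image (values 0-255)
img = np.array([[[200, 180, 160], [30, 40, 50]],
                [[255, 255, 255], [100, 120, 140]]], dtype=float)
print(binarize(to_gray(img)))   # [[1 0]
                                #  [1 0]]
```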
Facial expression images are processed as follows: exploiting the different reflectivity of facial regions to light, the captured facial image is binarized and analyzed, and the robot head is instructed to follow the rotation of the face.
The tactile understanding module senses the commander's touch through sensors, or changes in electrical signals, on the robot's head and face, and understands the commander's intention from the position, force and direction of the touch.
The emotion module comprises a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module.
The sound emotion cognition module compares the output of the speech understanding module with the language information stored in the sound database to determine an emotion weight. The emotion weights of the words in the sound database are set to three classes: friendly, neutral and unfriendly. Phrases such as "Hello" and "Let's be friends" carry a friendly emotion weight; phrases such as "I am leaving" and "Please make way" carry a neutral weight; phrases such as "You have a terrible temper" and "You don't know what you're saying" carry an unfriendly weight.
The expression emotion cognition module compares the output of the visual understanding module with the information stored in the expression database to determine an emotion weight. Facial expressions in the expression database are likewise divided into three classes: friendly, neutral and unfriendly. For example, smiling, laughing and nodding carry a friendly weight; frowning, a down-turned mouth and staring carry an unfriendly weight; and facing the robot without a smile carries a neutral weight.
The tactile emotion cognition module compares the output of the tactile understanding module with the information stored in the behavior database to determine an emotion weight. Behaviors in the behavior database are also divided into three classes: friendly, neutral and unfriendly. For example, patting the shoulder, touching the nose, clapping, greeting, shaking hands and waving carry a friendly weight; clenching a fist and pushing or shoving carry an unfriendly weight; standing with hands behind the back, standing at attention, standing at ease and simply passing by carry a neutral weight.
According to the emotion weight, the corresponding information is retrieved from the sound database, the expression database and the behavior database, and an instruction containing the angle, the timing and other information is sent to the corresponding single-chip microcomputer. The microcomputer drives the corresponding servos according to the instruction, and the servos in turn drive the corresponding actuators: the voice playback unit and the head and neck, four limbs and trunk of the robot each act accordingly. In this way, the robot selectively produces actions that express moods such as happiness, dejection, liveliness or bewilderment, i.e. it produces the corresponding subjective action. Thus, when the robot receives information with a friendly emotion weight, it voices a greeting, raises its eyebrows, rolls its eyes, opens and closes its mouth, turns its head or waves its arms; when it receives information with an unfriendly weight, it turns its head away, droops its ears, or says something like "How annoying, I'm not talking to you."
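A compact sketch of how the three emotion-weight classes might be mapped to responses is given below; the phrase tables, servo channels, angles, timings and callback names are illustrative assumptions, not the patent's databases or control interface.

```python
# Illustrative emotion-weight tables (placeholders)
FRIENDLY = {"你好", "我们做朋友"}
UNFRIENDLY = {"你脾气很坏", "不知所云"}

def emotion_weight(phrase):
    """Classify a recognized phrase into one of the three emotion-weight classes."""
    if phrase in FRIENDLY:
        return "friendly"
    if phrase in UNFRIENDLY:
        return "unfriendly"
    return "neutral"

# Hypothetical mapping: emotion weight -> list of (servo channel, angle, duration_ms, utterance)
RESPONSES = {
    "friendly":   [(0, 30, 500, "你好！"), (3, 45, 300, None)],        # nod head, raise eyebrows
    "unfriendly": [(0, 90, 700, "讨厌，不理你"), (5, 10, 300, None)],   # turn head away, droop ears
    "neutral":    [(0, 0, 300, None)],
}

def respond(phrase, send_to_mcu, play_sound):
    """Dispatch the response: each instruction carries an angle and a time."""
    for channel, angle, duration_ms, utterance in RESPONSES[emotion_weight(phrase)]:
        send_to_mcu(channel, angle, duration_ms)
        if utterance:
            play_sound(utterance)
```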
A servo is a position-servo driver suited to control systems in which an angle must change continuously and then be held. It works as follows: the control signal enters the signal-modulation chip through the receiver channel and yields a DC bias voltage. An internal reference circuit generates a reference signal with a period of 20 ms and a width of 1.5 ms; the DC bias voltage is compared with the voltage of the potentiometer, and the voltage difference is output. The sign of this voltage difference, fed to the motor driver IC, determines the direction of motor rotation. At a given motor speed, the motor drives the potentiometer through a cascade of reduction gears until the voltage difference becomes 0, at which point the motor stops.
For the single-chip microcomputer system to control the servo output angle, two tasks must be completed: first, generating the basic PWM periodic signal, here a signal with a 20 ms period; second, adjusting the pulse width, i.e. having the microcomputer emulate the PWM output and adjust the duty cycle. The present invention may use metal-gear servos with a torque of 13 kg·cm, a speed of 60 degrees per second, a control precision of 0.1 degree and a control range of 0-180 degrees. Microcomputer timers T0 and T1 are used to control 16 servo channels, each channel driving one of the actuator motions described above.
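For the 20 ms period described above, the pulse width and duty cycle for a target angle in the 0-180 degree range can be computed as in this sketch. The 0.5-2.5 ms pulse-width span is a common convention for such servos and is an assumption here; the patent itself only states the 20 ms period and the 1.5 ms reference width.

```python
PERIOD_MS = 20.0      # basic PWM period generated by the MCU timer (from the description)
MIN_PULSE_MS = 0.5    # assumed pulse width at 0 degrees (common convention, not stated in the patent)
MAX_PULSE_MS = 2.5    # assumed pulse width at 180 degrees

def pulse_width_ms(angle_deg):
    """Pulse width for a target angle in the 0-180 degree control range."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle_deg / 180.0

def duty_cycle(angle_deg):
    """Duty cycle the MCU must output for that angle."""
    return pulse_width_ms(angle_deg) / PERIOD_MS

print(pulse_width_ms(90), duty_cycle(90))   # 1.5 ms at the neutral position, duty 0.075
```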
When the robot receives a voice, word or image that cannot be recognized against the corresponding sound database, expression database or behavior database, the learning module stores it in that database so that it can be analyzed as described above the next time it occurs. The learning module also lets the robot learn from external commands, for example from teaching software, teaching networks or a teacher's lessons; through this self-learning the robot enriches and improves the sound, expression and behavior databases. Because the robot has a certain learning ability, people can play with it in their spare time and relax.
In addition, the robot of the present invention can be set to modes such as situational dialogue, leisure and calculation. In situational-dialogue mode, the robot interacts with the commander: according to the emotion weight determined by the emotion module, it retrieves the corresponding information from the sound database, the expression database and/or the behavior database and controls the corresponding voice playback unit and the head and neck, four limbs and/or trunk of the robot.
In leisure mode, the robot walks autonomously, plays music, and moves its head and neck, four limbs and/or trunk on its own, like a pet, creating a warm household atmosphere.
In calculation mode, the robot compares an external arithmetic expression with the digits, logical operators and other items stored in the operation database within the sound database, identifies the digits and operators, and computes the result by probability analysis.
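Once the digits and operators of an expression have been recognized, each as the most probable match against the stored templates, the calculation itself reduces to evaluating the recognized token sequence, as in this sketch; the token list and helper names are illustrative, and simple left-to-right evaluation (without operator precedence) is assumed.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "×": operator.mul, "÷": operator.truediv}

def evaluate(tokens):
    """Left-to-right evaluation of recognized digits and operators
    (sufficient for simple classroom expressions; no precedence handling)."""
    result = int(tokens[0])
    i = 1
    while i < len(tokens):
        op, operand = tokens[i], int(tokens[i + 1])
        result = OPS[op](result, operand)
        i += 2
    return result

# Tokens as they might come out of the recognizer (illustrative)
print(evaluate(["12", "+", "3"]))   # 15
```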
The robot of the present invention not only carries on question-and-answer interaction with external commands, but also answers the emotion weight of a command with emotionally expressive actions and language. For example, it can act happy or worried, turn its head and droop its ears, say "How annoying, I'm not talking to you", cheer "Great, great!" for the commander or for itself, or observe changes in the external environment and give warnings such as "Careful!" and "Watch out!".
The robot of the present invention is simple in structure and powerful in function, and its learning module can simulate the way an infant learns to read pictures. It adds enjoyment to daily life, can enter ordinary households and serve the general public, and has broad application prospects.
The above embodiment and drawing do not limit the product form and style of the present invention; any appropriate change or modification made by a person of ordinary skill in the art shall be regarded as not departing from the patent scope of the present invention.

Claims (2)

1. A cognitive system for an intelligent cognitive robot, characterized by comprising an understanding module and an emotion module, the understanding module comprising:
a speech understanding module, which understands the intention behind a spoken command through word segmentation, keyword extraction, sentence segmentation and tone analysis;
a visual understanding module, which processes the image accompanying a command by coordinate positioning, grayscale transformation, binarization, contour description, data comparison and probability analysis, so as to track the commander and understand the intention of the command;
a tactile understanding module, which tracks the commander and understands the intention of the command from the position, force and direction of an external touch applied to the robot;
wherein the emotion module comprises a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module;
the sound emotion cognition module compares the output of the speech understanding module with the information stored in a sound database to determine an emotion weight;
the expression emotion cognition module compares the output of the visual understanding module with the information stored in an expression database to determine an emotion weight;
the tactile emotion cognition module compares the output of the tactile understanding module with the information stored in a behavior database to determine an emotion weight; and
according to the emotion weight, the corresponding information is retrieved from the sound database, the expression database and/or the behavior database, and the corresponding actuators are controlled to make an emotional response.
2. An intelligent cognitive robot, characterized in that the robot comprises a head, a trunk and four limbs articulated in humanoid fashion, and further comprises a cognitive system, the cognitive system comprising an understanding module and an emotion module;
the understanding module comprising:
a speech understanding module, which understands the intention behind a spoken command through word segmentation, keyword extraction, sentence segmentation and tone analysis;
a visual understanding module, which processes the image accompanying a command by coordinate positioning, grayscale transformation, binarization, contour description, data comparison and probability analysis, so as to track the commander and understand the intention of the command;
a tactile understanding module, which tracks the commander and understands the intention of the command from the position, force and direction of an external touch applied to the robot;
wherein the emotion module comprises a sound emotion cognition module, an expression emotion cognition module and a tactile emotion cognition module;
the sound emotion cognition module compares the output of the speech understanding module with the information stored in a sound database to determine an emotion weight;
the expression emotion cognition module compares the output of the visual understanding module with the information stored in an expression database to determine an emotion weight;
the tactile emotion cognition module compares the output of the tactile understanding module with the information stored in a behavior database to determine an emotion weight; and
according to the emotion weight, the corresponding information is retrieved from the sound database, the expression database and/or the behavior database, and the corresponding voice playback unit and the head and neck, four limbs and/or trunk of the robot are controlled to make an emotional response.
CN201410660457.5A 2014-11-17 2014-11-17 Intelligent cognitive robot and cognitive system thereof Pending CN104493827A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410660457.5A CN104493827A (en) 2014-11-17 2014-11-17 Intelligent cognitive robot and cognitive system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410660457.5A CN104493827A (en) 2014-11-17 2014-11-17 Intelligent cognitive robot and cognitive system thereof

Publications (1)

Publication Number Publication Date
CN104493827A true CN104493827A (en) 2015-04-08

Family

ID=52935327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410660457.5A Pending CN104493827A (en) 2014-11-17 2014-11-17 Intelligent cognitive robot and cognitive system thereof

Country Status (1)

Country Link
CN (1) CN104493827A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060122837A1 (en) * 2004-12-08 2006-06-08 Electronics And Telecommunications Research Institute Voice interface system and speech recognition method
CN101206522A (en) * 2006-12-21 2008-06-25 财团法人工业技术研究院 Movable device with surface display information and interaction function
CN101604204A (en) * 2009-07-09 2009-12-16 北京科技大学 Distributed cognitive technology for intelligent emotional robot
CN103119644A (en) * 2010-07-23 2013-05-22 奥尔德巴伦机器人公司 Humanoid robot equipped with a natural dialogue interface, method for controlling the robot and corresponding program
CN103413113A (en) * 2013-01-15 2013-11-27 上海大学 Intelligent emotional interaction method for service robot
CN103246879A (en) * 2013-05-13 2013-08-14 苏州福丰科技有限公司 Expression-recognition-based intelligent robot system
CN103456314A (en) * 2013-09-03 2013-12-18 广州创维平面显示科技有限公司 Emotion recognition method and device
CN104102346A (en) * 2014-07-01 2014-10-15 华中科技大学 Household information acquisition and user emotion recognition equipment and working method thereof

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105058389A (en) * 2015-07-15 2015-11-18 深圳乐行天下科技有限公司 Robot system, robot control method, and robot
CN105137824A (en) * 2015-07-29 2015-12-09 百度在线网络技术(北京)有限公司 Artificial intelligence-based intelligent robot expression feedback method and device
CN105345822A (en) * 2015-12-17 2016-02-24 成都英博格科技有限公司 Intelligent robot control method and device
CN105345822B (en) * 2015-12-17 2017-05-10 成都英博格科技有限公司 Intelligent robot control method and device
CN105718921A (en) * 2016-02-29 2016-06-29 深圳前海勇艺达机器人有限公司 Method capable of realizing robot intelligent emotion recording
CN107199571A (en) * 2016-03-16 2017-09-26 富士施乐株式会社 Robot control system
CN107199571B (en) * 2016-03-16 2021-11-05 富士胶片商业创新有限公司 Robot control system
CN105867633A (en) * 2016-04-26 2016-08-17 北京光年无限科技有限公司 Intelligent robot oriented information processing method and system
CN105867633B (en) * 2016-04-26 2019-09-27 北京光年无限科技有限公司 Information processing method and system towards intelligent robot
CN105690407A (en) * 2016-04-27 2016-06-22 深圳前海勇艺达机器人有限公司 Intelligent robot with expression display function
CN105690408A (en) * 2016-04-27 2016-06-22 深圳前海勇艺达机器人有限公司 Emotion recognition robot based on data dictionary
CN107850904A (en) * 2016-06-16 2018-03-27 深圳市创客工场科技有限公司 Steering wheel
WO2018006470A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Artificial intelligence processing method and device
WO2018006471A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for updating robot emotion data
CN106531148A (en) * 2016-10-24 2017-03-22 咪咕数字传媒有限公司 Cartoon dubbing method and apparatus based on voice synthesis
CN110382174A (en) * 2017-01-10 2019-10-25 直觉机器人有限公司 It is a kind of for executing mood posture with the device with customer interaction
CN106863300A (en) * 2017-02-20 2017-06-20 北京光年无限科技有限公司 A kind of data processing method and device for intelligent robot
CN106803377A (en) * 2017-02-27 2017-06-06 合肥慧动智能科技有限公司 A kind of English study manages robot
CN108724205B (en) * 2017-04-19 2022-07-26 松下知识产权经营株式会社 Interaction device, interaction method, interaction program, and robot
CN108724205A (en) * 2017-04-19 2018-11-02 松下知识产权经营株式会社 Interactive device, interactive approach, interactive process and robot
CN109108962A (en) * 2017-06-23 2019-01-01 卡西欧计算机株式会社 Robot, the control method of robot and storage medium
CN107450367A (en) * 2017-08-11 2017-12-08 上海思依暄机器人科技股份有限公司 A kind of voice transparent transmission method, apparatus and robot
CN107718014A (en) * 2017-11-09 2018-02-23 深圳市小村机器人智能科技有限公司 Highly emulated robot head construction and its method of controlling operation
CN108297109A (en) * 2018-02-24 2018-07-20 上海理工大学 A kind of intelligent robot system
CN108161953A (en) * 2018-02-24 2018-06-15 上海理工大学 A kind of intelligent robot head system
CN108681412A (en) * 2018-04-12 2018-10-19 清华大学 A kind of emotion recognition device and method based on arrayed tactile sensor
CN108681412B (en) * 2018-04-12 2020-06-02 清华大学 Emotion recognition device and method based on array type touch sensor
CN110000772A (en) * 2019-04-30 2019-07-12 广东工业大学 A kind of mouth mechanism of Wire driven robot
CN110000772B (en) * 2019-04-30 2024-05-24 广东工业大学 Flexible rope driven mouth mechanism
CN111086012B (en) * 2019-12-30 2021-09-17 深圳市优必选科技股份有限公司 Head structure and robot
CN111086012A (en) * 2019-12-30 2020-05-01 深圳市优必选科技股份有限公司 Head structure and robot
CN111399727A (en) * 2020-02-25 2020-07-10 帕利国际科技(深圳)有限公司 Man-machine interaction equipment and interaction method
CN111428006A (en) * 2020-04-27 2020-07-17 齐鲁工业大学 Auxiliary teaching system and method based on NAO robot
CN114029980A (en) * 2021-12-17 2022-02-11 赣州市第一中学 Situation display robot for education
CN114029980B (en) * 2021-12-17 2024-03-15 赣州市第一中学 Situation display robot for education

Similar Documents

Publication Publication Date Title
CN104493827A (en) Intelligent cognitive robot and cognitive system thereof
CN107030691B (en) Data processing method and device for nursing robot
Peng et al. Robotic dance in social robotics—a taxonomy
Cangelosi et al. Developmental robotics: From babies to robots
CN108000526A (en) Dialogue exchange method and system for intelligent robot
CN111081371A (en) Virtual reality-based early autism screening and evaluating system and method
CN204706208U (en) Intelligence children education robot
CN206021605U (en) Intelligent robot point-of-reading system
CN204791614U (en) Juvenile study machine people of intelligence
Lui et al. An affective mood booster robot based on emotional processing unit
Shidujaman et al. “roboquin”: A mannequin robot with natural humanoid movements
Cederborg et al. From language to motor gavagai: unified imitation learning of multiple linguistic and nonlinguistic sensorimotor skills
Lim et al. A Sign Language Recognition System with Pepper, Lightweight-Transformer, and LLM
Şen et al. Artificial intelligence and innovative applications in special education
CN111984161A (en) Control method and device of intelligent robot
Geiger et al. The robot ALIAS as a gaming platform for elderly persons
Maeda et al. Interactive emotion communication between human and robot
Lee et al. Therapeutic behavior of robot for treating autistic child using artificial neural network
Naeem et al. An AI based Voice Controlled Humanoid Robot
Farinelli Design and implementation of a multi-modal framework for scenic actions classification in autonomous actor-robot theatre improvisations
Cangelosi et al. Language and Communication
Weng The living machine initiative
Rohlfing et al. Grounding language in action
Gonzalez-Billandon et al. Cognitive architecture aided by working-memory for self-supervised multi-modal humans recognition
CN110322737A (en) A kind of artificial intelligence educational robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150408