CN106873773A - Robot interactive control method, server and robot

Robot interactive control method, server and robot

Info

Publication number
CN106873773A
CN106873773A (application CN201710013365A; granted as CN106873773B)
Authority
CN
China
Prior art keywords
robot
behavior
data
information
behavior type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710013365.1A
Other languages
Chinese (zh)
Other versions
CN106873773B (en)
Inventor
He Jianqiang (何坚强)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qihoo Technology Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd filed Critical Beijing Qihoo Technology Co Ltd
Priority to CN201710013365.1A priority Critical patent/CN106873773B/en
Publication of CN106873773A publication Critical patent/CN106873773A/en
Application granted granted Critical
Publication of CN106873773B publication Critical patent/CN106873773B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a robot interactive control method, a server and a robot. The method includes the steps of: receiving an audio/video stream of the robot's supervised object, captured by the robot's camera unit and uploaded by the robot, or associated sensor data; matching a corresponding behavior type from a predefined behavior type library according to feature data extracted from the audio/video stream or the sensor data; determining, from a preset correspondence database, a machine behavior instruction corresponding to the behavior type; and sending the machine behavior instruction to the robot to drive the robot to execute the instruction, so that the robot performs a corresponding output behavior as feedback to its supervised object. The method and apparatus provided by the invention greatly improve the intelligence level of robots, making human-robot interaction more convenient and richer in form.

Description

Robot interactive control method, server and robot
Technical field
The present invention relates to the field of computer technology, in particular to robotics, and more particularly to a robot interactive control method, a server and a robot.
Background technology
A robot is a general term for machine devices that can perform work automatically. Robotics involves multiple disciplines and subjects and crystallizes human wisdom; intelligent robots possessing a certain degree of autonomous awareness receive particular attention.
In the prior art, a robot generally has an instruction receiving unit, an information processing unit and an execution unit, and can execute instructions in a preset or autonomous manner. On industrial production lines such robots have long proven their worth, but a robot that can be integrated into people's daily and family life and communicate smoothly with family members has yet to appear, and many technical obstacles remain to be overcome. Robots in the prior art almost all complete preset actions according to preset instructions. When interacting with a user, they must do so through a human-machine interaction component or interface; this component or interface is installed on the robot as a hardware part and completes the corresponding preset action according to the instruction or input information provided by the user. A present-day robot cannot autonomously handle a user's impromptu information or commands, for example simply conversing with a person or processing a temporary instruction. Moreover, because existing robots still respond to action commands in a single, inflexible way, the goal of giving a robot a certain degree of human-like capability in a given application field, so that it can smoothly enter people's homes — for example in fields such as accompanying and caring for children or the elderly — has not yet been well realized.
The content of the invention
In view of this, the primary purpose of the present invention is to solve at least one of the above problems by providing a robot interactive control method, and correspondingly a server and a robot to carry out the method.
To achieve the above purpose of the invention, the following technical solutions are adopted:
A robot interactive control method of the invention comprises the following steps:
receiving an audio/video stream of the robot's supervised object, captured by the robot's camera unit and uploaded by the robot, or associated sensor data;
matching a corresponding behavior type from a predefined behavior type library according to feature data extracted from the audio/video stream or the sensor data;
determining, from a preset correspondence database, a machine behavior instruction corresponding to the behavior type;
sending the behavior instruction to the robot to drive it to execute the machine behavior instruction, so that the robot performs a corresponding output behavior as feedback to its supervised object.
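The four steps above can be sketched as a minimal server-side loop. This is only an illustrative sketch under assumed names: the behavior library contents, the command table and the helper functions below are invented for the example and do not come from the patent.

```python
# Illustrative predefined behavior type library: feature data -> behavior type.
BEHAVIOR_TYPE_LIBRARY = {
    ("wave",): "greeting",
    ("fall", "impact"): "fall_detected",
}

# Illustrative preset correspondence database: behavior type -> machine behavior instruction.
COMMAND_TABLE = {
    "greeting": {"voice": "Hello!", "action": "wave_arm"},
    "fall_detected": {"voice": "Are you OK?", "action": "approach"},
}

def match_behavior_type(features):
    """Match extracted feature data against the predefined behavior type library."""
    for key, behavior in BEHAVIOR_TYPE_LIBRARY.items():
        if set(key) <= set(features):
            return behavior
    return None

def handle_upload(features):
    """Match a behavior type for the uploaded features, look up the machine
    behavior instruction, and return it for dispatch to the robot."""
    behavior = match_behavior_type(features)
    if behavior is None:
        return None
    return COMMAND_TABLE[behavior]
```

In a real deployment both tables would live in the cloud server's data store rather than in code, and feature extraction from the audio/video stream would precede this lookup.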
Further, the step of matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream comprises:
extracting audio feature information or keyword information from the audio data in the audio/video stream;
matching the feature information or keyword information against the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library, and determining accordingly the behavior type corresponding to the audio data.
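As a toy illustration of matching extracted keywords against pre-stored keywords per behavior type, one might proceed as follows; the keyword sets and behavior names are assumptions for the example, not part of the invention:

```python
# Hypothetical pre-stored keyword information, grouped by behavior type.
PRESTORED_KEYWORDS = {
    "fetch_water": {"water", "thirsty"},
    "play_music": {"music", "song"},
}

def behavior_from_keywords(keywords):
    """Return the behavior type whose pre-stored keyword set overlaps the
    keyword information extracted from the audio data, or None if none does."""
    extracted = set(keywords)
    for behavior, stored in PRESTORED_KEYWORDS.items():
        if extracted & stored:
            return behavior
    return None
```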
In one embodiment, the step of matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream comprises:
extracting video feature frame information of the video data in the audio/video stream;
matching the video feature frame information against the pre-stored video feature frame information in the predefined behavior type library, and determining accordingly the behavior type corresponding to the video data.
Further, the video feature frame information is composed of picture information that records a behavior action; a certain number of consecutive pieces of such picture information within a predetermined time constitute the video feature frame information.
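The "certain number of consecutive pictures within a predetermined time" criterion can be sketched as follows, with an invented frame representation and illustrative defaults (ten frames within one second) that are not specified by the patent:

```python
def is_feature_frame_sequence(frames, n_required=10, window=1.0):
    """frames: list of (timestamp_seconds, frame_descriptor) tuples.
    Return True if some run of n_required consecutive frames fits
    inside the predetermined time window."""
    for i in range(len(frames) - n_required + 1):
        t_first = frames[i][0]
        t_last = frames[i + n_required - 1][0]
        if t_last - t_first <= window:
            return True
    return False
```

As the description later notes, a shorter window and a larger frame count make the feature frame information more accurate; both would be tuning parameters.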
In one embodiment, the step of matching a corresponding behavior type from the predefined behavior type library according to the sensor data comprises:
extracting a specific data field or information meaning from the sensor data;
matching the specific data field or information meaning against the pre-stored data fields or pre-stored information meanings in the predefined behavior type library, and determining accordingly the behavior type corresponding to the sensor data.
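One possible, purely illustrative reading of "specific data field or information meaning" for two sensor kinds; the field names, thresholds and behavior types are assumptions for the example:

```python
# Hypothetical pre-stored (field, meaning) pairs mapped to behavior types.
PRESTORED_FIELDS = {
    ("temperature", "high"): "overheat_alert",
    ("distance", "near"): "obstacle_avoid",
}

def classify_sensor_reading(reading):
    """reading: dict like {"sensor": "temperature", "value": 39.5}.
    Extract a coarse information meaning from the raw value, then match it
    against the pre-stored fields; returns a behavior type or None."""
    sensor = reading["sensor"]
    if sensor == "temperature":
        meaning = "high" if reading["value"] > 38.0 else "normal"
    elif sensor == "distance":
        meaning = "near" if reading["value"] < 0.5 else "far"
    else:
        meaning = "unknown"
    return PRESTORED_FIELDS.get((sensor, meaning))
```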
Further, when the sensor data is geographic information sensor data, the geographic information data therein is extracted and matched against pre-stored geographic information data, whereby the behavior type characterized by the geographic information sensor data is determined; the machine behavior instruction corresponding to the behavior type, which includes speed information, direction information and distance information, is determined from the preset correspondence database and sent to the robot to drive it to execute the instruction, so that the robot performs a corresponding output behavior and changes its geographic position relative to its supervised object.
Further, when the sensor data is touch screen sensor data, the touch operation information in the touch screen sensor data is extracted and matched against the pre-stored touch operation information in the predefined behavior type library, whereby the behavior type corresponding to the touch screen sensor data is determined.
In one embodiment, the preset correspondence database includes a preset behavior type data table and a machine behavior instruction table having a mapping relation with it; the step of determining from the preset correspondence database the machine behavior instruction corresponding to the behavior type specifically comprises:
retrieving the preset behavior type data table in the correspondence database with the behavior type as the keyword, determining the matching preset behavior type, and generating the machine behavior instruction according to the mapping relation of that preset behavior type.
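The two-table lookup can be sketched as below; the mapping keys and instruction strings are invented stand-ins for the patent's behavior type data table and machine behavior instruction table:

```python
# Illustrative behavior type data table: behavior type -> mapping key.
BEHAVIOR_TYPE_TABLE = {"greeting": 1, "fall_detected": 2}
# Illustrative machine behavior instruction table, joined via the mapping key.
INSTRUCTION_TABLE = {1: "say:hello", 2: "move:approach"}

def instruction_for(behavior_type):
    """Retrieve the behavior type data table with the behavior type as the
    keyword, then generate the machine behavior instruction via the mapping."""
    key = BEHAVIOR_TYPE_TABLE.get(behavior_type)
    if key is None:
        return None
    return INSTRUCTION_TABLE[key]
```

Splitting type and instruction into two mapped tables lets either side be updated independently, which fits the later statement that the libraries can be updated by the cloud server or the user.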
In one embodiment, the machine behavior instruction includes:
a voice instruction, to drive the robot's voice playback interface to output voice, and/or
a video instruction, to drive the robot's video playback interface to output video, and/or
an action execution instruction, to drive the robot's motion unit to output an action.
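A sketch of dispatching such a composite instruction, where each optional part is routed to a different robot interface; the interface names are illustrative stand-ins, not the patent's API:

```python
def dispatch(instruction):
    """Route each present part of a machine behavior instruction to its
    interface; returns the list of (interface, payload) dispatches so the
    routing can be inspected instead of actually driving hardware."""
    outputs = []
    if "voice" in instruction:
        outputs.append(("voice_playback", instruction["voice"]))
    if "video" in instruction:
        outputs.append(("video_playback", instruction["video"]))
    if "action" in instruction:
        outputs.append(("motion_unit", instruction["action"]))
    return outputs
```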
In one embodiment, before the step of receiving the audio/video stream or associated sensor data of the robot's supervised object captured by its camera unit and uploaded by the robot, the method further comprises: receiving biometric information sensor data uploaded by the robot; when the biometric information sensor data matches pre-stored biometric information, determining the identity permission of the user corresponding to the biometric information sensor data, and constructing a corresponding preset feedback instruction in response to that identity permission.
Further, the biometric information sensor data includes at least one of fingerprint information data, iris information data or voice information data; the identity permission of the user is determined accordingly by comparison with preset identity permission information, and a corresponding preset feedback permission is opened in response to that identity permission.
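A minimal sketch of this identity-permission gate, assuming an invented enrollment store keyed by a biometric identifier (the users, keys and permission levels are examples only):

```python
# Hypothetical enrollment store: biometric key -> (user, preset feedback permission).
ENROLLED = {
    "fingerprint:a1b2": ("alice", "full"),
    "voice:c3d4": ("bob", "child_safe"),
}

def feedback_permission(biometric_key):
    """Compare the biometric sensor data against pre-stored biometrics and
    return the preset feedback permission to open, or None when no
    pre-stored biometric information matches."""
    match = ENROLLED.get(biometric_key)
    return None if match is None else match[1]
```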
In one embodiment, the preset behavior type library and/or the preset correspondence database are set or updated by the cloud server or by a user associated with the cloud server.
In one embodiment, the upload mode is real-time upload.
The present invention also provides a robot interactive control method, comprising the following steps:
acquiring an audio/video stream of the supervised object, captured by the camera unit, or associated sensor data;
matching a corresponding behavior type from a predefined behavior type library according to feature data extracted from the audio/video stream or the sensor data;
determining, from a preset correspondence database, a machine behavior instruction corresponding to the behavior type;
driving the relevant robot interface or unit to produce a corresponding output behavior according to the behavior instruction, as feedback to the supervised object.
Further, the step of matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream comprises:
extracting audio feature information or keyword information from the audio data in the audio/video stream;
matching the feature information or keyword information against the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library, and determining accordingly the behavior type corresponding to the audio data.
In one embodiment, the step of matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream comprises:
extracting video feature frame information of the video data in the audio/video stream;
matching the video feature frame information against the pre-stored video feature frame information in the predefined behavior type library, and determining accordingly the behavior type corresponding to the video data.
In one embodiment, the step of matching a corresponding behavior type from the predefined behavior type library according to the sensor data comprises:
extracting a specific data field or information meaning from the sensor data;
matching the specific data field or information meaning against the pre-stored data fields or pre-stored information meanings in the predefined behavior type library, and determining accordingly the behavior type corresponding to the sensor data.
Further, when the sensor data is geographic information sensor data, the geographic information data therein is extracted and matched against pre-stored geographic information data, whereby the behavior type characterized by the geographic information sensor data is determined; the machine behavior instruction corresponding to the behavior type, which includes speed information, direction information and distance information, is determined from the preset correspondence database and sent to the robot to drive it to execute the instruction, so that the robot performs a corresponding output behavior and changes its geographic position relative to its supervised object.
Further, when the sensor data is touch screen sensor data, the touch operation information in the touch screen sensor data is extracted and matched against the pre-stored touch operation information in the predefined behavior type library, whereby the behavior type corresponding to the touch screen sensor data is determined.
In one embodiment, the preset correspondence database includes a preset behavior type data table and a machine behavior instruction table having a mapping relation with it; the step of determining from the preset correspondence database the machine behavior instruction corresponding to the behavior type specifically comprises:
retrieving the preset behavior type data table in the correspondence database with the behavior type as the keyword, determining the matching preset behavior type, and generating the machine behavior instruction according to the mapping relation of that preset behavior type.
In one embodiment, the machine behavior instruction includes:
a voice instruction, to drive the robot's voice playback interface to output voice, and/or
a video instruction, to drive the robot's video playback interface to output video, and/or
an action execution instruction, to drive the robot's motion unit to output an action.
In one embodiment, before the step of acquiring the audio/video stream or associated sensor data of the supervised object captured by the camera unit, the method further comprises: receiving biometric information sensor data; when the biometric information sensor data matches pre-stored biometric information, determining the identity permission of the user corresponding to the biometric information sensor data, and constructing a corresponding preset feedback instruction in response to that identity permission.
In one embodiment, the biometric information sensor data includes at least one of fingerprint information data, iris information data or voice information data; the identity permission of the user is determined accordingly by comparison with preset identity permission information, and a corresponding preset feedback permission is opened in response to that identity permission.
In one embodiment, the preset behavior type library and/or the preset correspondence database are set by the user or downloaded from the cloud server.
Further, the download is carried out via Bluetooth, a Wi-Fi network, a mobile data network or a wired network.
The invention also provides a robot interactive server, comprising:
a receiving module, for receiving an audio/video stream of the robot's supervised object, captured by the robot's camera unit and uploaded by the robot, or associated sensor data;
a parsing module, for matching a corresponding behavior type from a predefined behavior type library according to feature data extracted from the audio/video stream or the sensor data;
a data generation module, for determining, from a preset correspondence database, a machine behavior instruction corresponding to the behavior type;
a sending module, for sending the behavior instruction to the robot to drive it to execute the machine behavior instruction, so that the robot produces a corresponding output behavior as feedback to its supervised object.
The present invention also provides a robot, comprising:
an acquisition module, for acquiring an audio/video stream of the supervised object, captured by the camera unit, or associated sensor data;
an analysis module, for matching a corresponding behavior type from a predefined behavior type library according to feature data extracted from the audio/video stream or the sensor data, and then determining, from a preset correspondence database, a machine behavior instruction corresponding to the behavior type;
an execution module, for driving the relevant robot interface or unit to produce a corresponding output behavior according to the behavior instruction, as feedback to the supervised object.
The method and apparatus provided by the invention greatly improve the intelligence level of robots, making human-robot interaction more convenient and richer in form. By analyzing, through the robot's own audio, video or sensor devices, an action, an utterance or even a facial expression of its supervised object, the robot can produce corresponding feedback behavior, so that a person can interact with the robot in diverse ways. Moreover, the robot's feedback behavior is also richer: it can respond to a person's behavior not only through sound, video or a combination of the two, but also through actions, so that even children and the elderly can interact conveniently with the robot. The robot can thus accompany and care for children and the elderly, making it possible for robots to truly enter people's homes.
Brief description of the drawings
Fig. 1 is a flow chart of a robot interactive control method according to one embodiment of the invention;
Fig. 2 is a structural schematic diagram of a robot interactive server according to one embodiment of the invention;
Fig. 3 is a flow chart of a robot interactive control method according to another embodiment of the invention;
Fig. 4 is a structural schematic diagram of a robot according to one embodiment of the invention.
Specific embodiment
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the drawings, in which the same or similar reference numbers throughout denote the same or similar elements, or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary, serve only to explain the present invention, and are not to be construed as limiting the claims.
Those skilled in the art will understand that, unless expressly stated otherwise, the singular forms "a", "an", "the" and "said" used herein may also include the plural. It should be further understood that the wording "comprising" used in the specification refers to the presence of the stated features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. When an element is said to be "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present; "connected" or "coupled" as used herein may include wireless connection or wireless coupling. The wording "and/or" used herein includes all or any unit of, and all combinations of, one or more of the associated listed items.
Those skilled in the art will understand that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have a meaning consistent with their meaning in the context of the prior art and, unless specifically defined as here, will not be interpreted in an idealized or overly formal sense.
Those skilled in the art will understand that "terminal" and "terminal device" as used herein include both devices having only a wireless signal receiver with no transmitting capability and devices with receiving and transmitting hardware capable of two-way communication over a bidirectional communication link. Such devices may include: cellular or other communication devices with or without a single-line or multi-line display; PCS (Personal Communications Service) devices, which may combine voice, data processing, fax and/or data communication capabilities; PDAs (Personal Digital Assistants), which may include a radio frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; and conventional laptop and/or palmtop computers or other devices that have and/or include a radio frequency receiver. "Terminal" and "terminal device" as used herein may be portable, transportable, installed in a vehicle (air, sea and/or land), or suitable for and/or configured to operate locally and/or in distributed form at any location on the earth and/or in space. The "terminal" or "terminal device" may also be a communication terminal, an Internet terminal or a music/video playback terminal, for example a PDA, a MID (Mobile Internet Device) and/or a mobile phone, or a device such as a smart TV or set-top box with music/video playback functions.
A robot is typically composed of an executing mechanism, a drive device, a detection device, a control system and complex machinery. It can be automatically controlled, repeatedly programmed and multi-functional, has several degrees of freedom, may be fixed or mobile, and is used in relevant automation systems.
One embodiment of the invention provides a robot interactive control method. As shown in Fig. 1, the interaction control method comprises the following steps:
S110: receiving an audio/video stream of the robot's supervised object, captured by the robot's camera unit and uploaded by the robot, or associated sensor data.
As the first step in realizing interaction, it is necessary to obtain raw information that is as rich as possible. This raw information includes the user's various behaviors or instructions, that is, the various behaviors or instructions of the robot's supervised object, including the audio/video content acquired by the camera installed on the robot, which appears in the machine as an audio/video stream — described anthropomorphically, the sound the robot "hears" and the images it "sees". For example, a microphone installed on the robot picks up the voice of the supervised object, or an installed camera records a continuous sequence of images of the supervised object's actions. The associated sensor data are the data collected by the various sensors installed on the robot, such as distance sensors, temperature sensors, acceleration sensors or smell sensors; these sensors can be selected appropriately according to different use occasions or functions. Accordingly, these sensors are equivalent to the robot's organs of "sensation", and the acquired sensor data are its "sensations". After the robot obtains this raw data, it uploads it to the cloud server for further processing; thanks to the cloud server's powerful data processing capability and big data resources, this processing can be carried out more quickly and accurately. Preferably, the robot uploads data to the cloud server in real time. For a robot in practical use, a good network state can be presumed as a basic environmental factor, and real-time upload provides a fast communication mechanism between robot and cloud server, giving the robot good information processing capability and fast responsiveness.
S120: matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream or sensor data.
After obtaining the audio/video stream or sensor data, the cloud server parses them and determines by analysis the behavior pattern exhibited by the monitored object represented by the audio/video stream or sensor data. Every kind of data information contains basic feature data by which different behaviors can be distinguished; by summarization and generalization within the relevant technical field, these feature data can form a feature database. A given behavior is jointly characterized by one or several pieces of feature data; in some cases two behaviors may even contain feature data of the same kinds, and when the order in which those feature data are arranged differs, the behavior patterns they represent differ as well. Classifying the feature data possessed by each kind of behavior and combining the various behaviors into a single database forms the predefined behavior type library. The cloud server parses out the feature data in the audio/video stream or sensor data, searches for a match in the predefined behavior type library according to these feature data, and finds the behavior corresponding to the matching feature data, whereby the behavior type corresponding to the audio/video stream or sensor data is determined. It should be understood that a behavior action contains more than one piece of feature data, and the more complete the feature data, the more accurately the behavior action is described; the feature data of each behavior in the predefined behavior type library should therefore be continuously updated and perfected, while the cloud server's ability to parse raw data likewise becomes ever more accurate, refined and complete.
Preferably, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the feature data extracted from the audio/video stream is: extracting the audio feature information or keyword information of the audio data in the audio/video stream, and matching it against the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library, whereby the behavior type corresponding to the audio data is determined. A segment of voice information may contain one or more pieces of keyword information such as time, place, person, object or reason, and may also contain audio feature information such as sound frequency, pitch and timbre; those skilled in the art can properly analyze this audio feature information or keyword information from the audio data in the audio/video stream. Once the analysis yields the audio feature information or keyword information, it is compared with the pre-stored data in the cloud server, that is, searched among the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library. Once a match is found, the pre-stored behavior type corresponding to the matched pre-stored audio feature information or keyword information is determined, so that the cloud server also determines the behavior type corresponding to the audio data. This technical solution enables even the elderly or children who do not know how to operate a computer to send instructions to the robot and communicate and interact with it, as long as they can make a sound.
Preferably, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the audio/video stream is: extracting video feature frame information of the video data in the audio/video stream, matching the video feature frame information against the prestored video feature frame information in the predefined behavior type library, and determining the behavior type corresponding to the video data accordingly. A video consists of many static pictures within a certain period of time; in principle each picture is one frame, and consecutive frames form the video. Each picture has its characteristic features, such as the positions of basic figures in it, the shapes of its lines, and the distribution of light and shade; the technology for analyzing picture features is image recognition, and video feature frame information is therefore extracted from the video data by image recognition. A video feature frame consists of picture information that records a behavior action and is used to record various behavior types with characteristic features, such as a person walking, running, standing, sitting down, making a facial expression, or gesturing. Each video feature frame is a picture recording some action in a behavior type and contains computer-recognizable information, such as the positions of basic figures, line shapes, or light-and-shade distribution, representing that action. Such video feature frame information is usually a set of many frames taken consecutively within a predetermined time, for example ten consecutive frames within one second, or ten consecutive frames within 0.1 second; the shorter the time and the larger the number of frames, the more accurate the video feature frame information. The database on the cloud server contains a preset behavior type library holding various behavior types, and under each behavior type directory are several prestored video feature frames (understandably, the more prestored frames, the more accurately the corresponding preset behavior type is described). The prestored video feature frame information is collected in advance from computers or from the network and stored. When the cloud server detects that video feature frame information extracted from the audio/video stream matches the prestored video feature frame information under a certain behavior type directory, the behavior type corresponding to the video data in that audio/video stream is determined. Of course, this matching does not require the video feature information to be identical to the prestored video feature frame information; a certain degree of fuzziness is allowed, and the matching is realized by fuzzy algorithms in computer image recognition, whose specific details can be implemented by technical staff familiar with image recognition using related art methods. With this technical scheme, even an elderly person or a child who does not know how to operate a computer can send instructions to the robot through actions, thereby interacting and communicating with the robot.
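The fuzzy frame matching above can be illustrated by reducing each frame to a small feature vector and allowing a distance tolerance instead of exact equality. The vectors, behavior names, and tolerance below are assumptions made for the sketch; a real system would derive features from image recognition.

```python
import math

# Prestored video feature frames per behavior type directory: each frame
# is reduced here to an invented 3-value feature vector (figure position,
# line shape, light-and-shade distribution).
PRESTORED_FRAMES = {
    "wave_hand": [(0.2, 0.8, 0.5), (0.3, 0.7, 0.5)],
    "sit_down":  [(0.9, 0.1, 0.4), (0.8, 0.2, 0.4)],
}

def fuzzy_match(extracted_frames, tolerance=0.15):
    """Match each extracted frame to the prestored frame at the same
    position; a behavior type matches when every frame is within the
    tolerance, so identity is not required, only closeness."""
    for behavior_type, templates in PRESTORED_FRAMES.items():
        if len(templates) != len(extracted_frames):
            continue
        if all(math.dist(a, b) <= tolerance
               for a, b in zip(extracted_frames, templates)):
            return behavior_type
    return None

print(fuzzy_match([(0.22, 0.79, 0.51), (0.28, 0.72, 0.49)]))  # wave_hand
```

The tolerance parameter is where the "fuzziness" of the matching lives: tightening it demands frames closer to the prestored templates, loosening it accepts rougher matches.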
As another preferred scheme, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the sensor data is: extracting a specific data field or information meaning from the sensor data, matching the specific data field or information meaning against the prestored data fields or prestored information meanings in the predefined behavior type library, and determining the behavior type corresponding to the sensor data accordingly. Sensor data contains many types of data; divided by period, there are idle-period data (when the sensor is in a standby state) and working-period data, and the working-period data necessarily contains time data, detection content data, and intensity data (which measures the degree of the detected content, for example the magnitude of a force or the speed of a motion). The specific data field or information meaning is extracted from the sensor data according to a certain algorithm or processing method, which technical staff familiar with sensor data processing can implement. The predefined behavior type library contains numerous behavior types, and under each behavior type are one or more corresponding prestored data fields or prestored information meanings; the cloud server retrieves, among the prestored data fields or prestored information meanings, the data field or information meaning extracted from the sensor data, and once a match is found the behavior type corresponding to the sensor data can be determined. Different types of sensors are installed on the robot according to different functional requirements. For example, in a household it may be necessary to detect the room temperature or the body temperature of a family member, so a temperature sensor is needed; it may be necessary to detect whether natural gas is leaking in the kitchen, so a gas detection sensor can be installed; and the robot may need to accompany a certain family member to realize a care function, which requires following that family member, so at least a geographic information sensor is needed.
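The field extraction and lookup described above might look like the following sketch. The sensor names, record layout, and thresholds are all illustrative assumptions, not part of the patent.

```python
# Sketch of matching a working-period sensor record (sensor name, time,
# intensity) against prestored fields in a predefined behavior type
# library. Library contents and thresholds are invented for illustration.
PREDEFINED_BEHAVIOR_TYPES = {
    "gas_alarm":   {"sensor": "gas", "min_intensity": 200},
    "fever_alert": {"sensor": "temperature", "min_intensity": 38},
}

def match_sensor_behavior(record):
    """Return the behavior type whose prestored sensor field matches the
    record and whose intensity threshold is reached, else None."""
    for behavior_type, prestored in PREDEFINED_BEHAVIOR_TYPES.items():
        if (record["sensor"] == prestored["sensor"]
                and record["intensity"] >= prestored["min_intensity"]):
            return behavior_type
    return None

reading = {"sensor": "temperature", "time": "12:00", "intensity": 38.6}
print(match_sensor_behavior(reading))  # fever_alert
```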
S130: Determining the machine behavior command corresponding to the behavior type from a preset correspondence database.
After the behavior type corresponding to the audio/video stream or the sensor data has been determined by step S120, a lookup is performed in the cloud server database according to that behavior type; the lookup object is the preset correspondence database. The preset correspondence database contains behavior types and the machine behavior commands corresponding to them. Once a certain behavior type in the preset correspondence database is matched, the corresponding machine behavior command can be obtained and the next step carried out. Preferably, the preset correspondence database contains preset behavior type data; numerous items of preset behavior type data together form a preset behavior type data table, and each item of preset behavior type data corresponds to a machine behavior command table, that is, there is a mapping relation between preset behavior type data and machine behavior commands. Of course, this mapping is not one-to-one: multiple different behavior types may correspond to the same machine behavior command. For example, the speech behavior "take the cup" and a gesture expressing the same meaning may both ultimately yield a machine behavior command realizing "bring the cup". The specific steps of determining the machine behavior command corresponding to the behavior type from the preset correspondence database are: with the behavior type as the keyword, retrieving the preset behavior type data table in the correspondence database, determining the matching preset behavior type, and generating the machine behavior command according to the mapping relation of that preset behavior type. Of course, the determination here may also be computed: through the robot's judgment of the keyword information, the relevant information is processed to obtain the machine behavior command. Taking the behavior "pass me the cup" as an example, the robot knows the specific thing "cup" and its position, knows the meaning of the object "me" and its position, and, through judgment of its own position and the geographic information of the various scene objects in the environment, computes the machine behavior command; the related task can then be completed by executing that machine behavior command. The richer the preset behavior types or the preset correspondence database, the richer the behaviors the robot can feed back; the preset behavior type library and/or the preset correspondence database therefore need to be continually updated and supplemented. The updating and supplementing may be done directly through related settings on the cloud server, or may be set or updated by users associated with the cloud server, so that the data sources of the preset behavior type database and/or the preset correspondence database become richer and closer to reality. This technical scheme allows skilled persons to fully analyze the details of human behavior using Internet big data and continually improve the preset behavior type database, greatly raising the intelligence of the robot and making its degree of personification fuller.
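The many-to-one mapping from behavior types to machine behavior commands can be sketched as a plain lookup table, as below. The table entries are invented examples, not the patent's data.

```python
# Sketch of the preset correspondence database: several behavior types
# (speech, gesture) may map onto the same machine behavior command.
PRESET_CORRESPONDENCE = {
    "speech_fetch_cup":  "bring_cup",
    "gesture_fetch_cup": "bring_cup",
    "speech_play_duck":  "play_duck_audio",
}

def machine_command_for(behavior_type):
    """Retrieve the mapped machine behavior command, with the behavior
    type as the retrieval keyword; None when no mapping exists."""
    return PRESET_CORRESPONDENCE.get(behavior_type)

# Speech and gesture expressing the same meaning yield the same command.
print(machine_command_for("speech_fetch_cup"))   # bring_cup
print(machine_command_for("gesture_fetch_cup"))  # bring_cup
```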
Preferably, the machine behavior command includes: a voice instruction, to drive the robot's voice playback interface to output voice; and/or a video instruction, to drive the robot's video playback interface to output video; and/or an action execution instruction, to drive the robot's motion units to output actions. Facing the various expression ways of humans, the robot should likewise have various behavior expression modes. It may express with sound: for example, when a human asks the robot to play a certain sound and sends the instruction "imitate a duck's cry", the robot receives and uploads this instruction, and after processing by the cloud server a machine instruction "play the cry of a duck" is output; that command drives the robot to open its sound unit and play the corresponding audio. Or the robot captures some expression of a person, such as a smile, and uploads it to the server; the server's analysis yields a corresponding machine instruction that can drive the robot's video playback interface to play a cheerful video or display a smiling-face expression, or yields a corresponding action execution instruction that can drive the robot's motion units, for example the mechanical components associated with the arm and palm, to form a hand-clapping action. Machine instructions of these types may run in isolation or simultaneously. For example, when a human says to the robot "let's dance", the robot uploads that voice information data to the server; after the server's judgment, it sends a machine behavior command that can make the robot's sound system play music, make the video playback interface, such as a display screen, play a corresponding dance picture or show an expression, and drive the robot's relevant motion units, such as the leg motion unit and hand motion unit, to move together with the music. The contents of these machine behavior commands can be realized by persons of ordinary skill in the art through programming.
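The composite "let's dance" case, where one command drives several output channels at once, can be sketched with a command record whose parts are each optional. The channel names and the stubbed dispatcher are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MachineBehaviorCommand:
    """A command may drive any combination of the three output channels."""
    voice: Optional[str] = None          # clip for the voice playback interface
    video: Optional[str] = None          # clip for the video playback interface
    actions: Optional[List[str]] = None  # steps for the motion units

def execute(cmd):
    """Dispatch each present part to its output interface (stubbed as a
    log so the flow is visible without real hardware)."""
    log = []
    if cmd.voice:
        log.append(f"voice:{cmd.voice}")
    if cmd.video:
        log.append(f"video:{cmd.video}")
    for step in cmd.actions or []:
        log.append(f"motion:{step}")
    return log

dance = MachineBehaviorCommand(voice="music.mp3", video="dance.mp4",
                               actions=["legs_step", "arms_swing"])
print(execute(dance))
# ['voice:music.mp3', 'video:dance.mp4', 'motion:legs_step', 'motion:arms_swing']
```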
S140: Sending the behavior command to the robot to drive the robot to execute the machine behavior command, so that the robot performs the corresponding output behavior as feedback to its monitored object.
After the machine behavior command is generated it is still on the cloud server, while the final purpose is to have the robot feed back a certain behavior to its monitored object; the behavior command obtained by the cloud server's processing therefore needs to be transmitted to the robot immediately, and the robot executes the corresponding output behavior according to the machine behavior command, showing the corresponding feedback to its monitored object. The execution of the behavior command may directly follow the detailed code contained in the command, for example a simple command to play a piece of music or a piece of video; it may also rely on the robot's own computing capability, with the built-in computer computing the actual executable code for each component according to the behavior command. For example, when the behavior command is "fetch a book", the robot judges and executes according to computations such as its own position, the position of the bookcase, the layout of the room it is in, and the distances between these objects.
As a preferred scheme, when the sensor data is geographic information sensor data, the cloud server processes it as follows: the geographic information data in it is extracted and matched with the geographic information data prestored on the server; the behavior type characterized by the geographic information data is determined; the machine behavior command corresponding to that behavior type is determined from the preset correspondence database, the machine behavior command including speed information, direction information, and distance information; and the behavior command is sent to the robot to drive it to execute the machine behavior command, so that the robot performs the corresponding output behavior and changes its geographic information relative to its monitored object. For example, when the robot needs to become the companion of a certain family member (usually an elderly person or a child), the robot is set to stay within a certain distance range of its monitored object; when the monitored object moves, the robot moves accordingly and again keeps the set distance range from the monitored object. After the monitored object moves beyond the set distance range, the geographic information sensor carried by the robot obtains the monitored object's geographic information, such as the spacing, the latitude and longitude, or absolute or relative coordinates; after this information is uploaded to the cloud server, the cloud server performs the corresponding processing or computation and obtains the machine behavior commands the robot needs to execute. These behavior commands include content such as the required speed of movement, the direction of movement, and the distance or path of movement; in the end, the robot changes its own geographic information (its position or distance relative to the monitored object) along with the movement of the monitored object, continues to keep its distance from the monitored object within the set range, and completes the accompanying of the monitored object.
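The speed/direction/distance content of such a command can be illustrated with a small geometry sketch. The 2-D coordinates, the fixed speed, and the range value are assumptions made for the sketch.

```python
import math

def follow_command(robot_pos, target_pos, keep_within=2.0, speed=0.5):
    """Return (speed, direction_deg, distance) needed to re-enter the
    set range around the monitored object, or None when the robot is
    still inside the companion range and need not move."""
    dx = target_pos[0] - robot_pos[0]
    dy = target_pos[1] - robot_pos[1]
    separation = math.hypot(dx, dy)
    if separation <= keep_within:
        return None
    move = separation - keep_within          # distance to close the gap
    direction = math.degrees(math.atan2(dy, dx))
    return (speed, direction, move)

print(follow_command((0.0, 0.0), (5.0, 0.0)))  # (0.5, 0.0, 3.0)
```

A real companion robot would also plan a path around obstacles; this sketch covers only the straight-line speed, direction, and distance fields the paragraph names.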
As another preferred scheme, when the sensor data is touch sensor data, the touch operation information in the touch sensor data is extracted and matched with the prestored touch operation information in the predefined behavior type library, and the behavior type corresponding to the touch sensor data is determined accordingly. A touch sensor is set on the robot; today's touch screen, for example, is a very typical touch sensing device. Some common instruction options are set on it, and after the robot's monitored object selects a certain instruction on the touch screen, the robot uploads the selection information to the cloud server; the cloud server extracts the touch operation information represented in it and generates the corresponding behavior type, then determines the corresponding machine behavior command according to that behavior type and hands it to the robot for execution. This technical scheme can reduce the difficulty of issuing instructions, lighten the processing load of the cloud server, make the cloud server process data faster and output instructions more accurately, and ultimately make the robot feed back the monitored object's instruction more quickly.
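Because a touch selection is already a discrete option, the matching step collapses to a direct lookup, which is why this path lightens the server's load. The option ids and behavior names below are invented for the sketch.

```python
# Sketch: touch-screen instruction options mapped straight to behavior
# types; no fuzzy matching is needed for a discrete selection.
TOUCH_OPTIONS = {
    1: "play_music",
    2: "tell_story",
    3: "call_family_member",
}

def behavior_from_touch(option_id):
    """Resolve an uploaded touch selection to a behavior type."""
    return TOUCH_OPTIONS.get(option_id, "unknown_option")

print(behavior_from_touch(2))  # tell_story
```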
Preferably, since a robot is usually a valuable piece of equipment, identity recognition is a function it generally needs to possess. From development to application, a robot comes into contact with many people: for example, in the debugging stage after assembly it must contact developers and commissioning staff, and in use it encounters users of different identities. These persons should have different permission levels relative to the robot, and each permission level can correspondingly carry the instruction modification permissions, system update permissions, or usage permissions of that level. Therefore, before the step of receiving the audio/video stream of its monitored object, acquired by its camera unit and uploaded by the robot, or the relevant sensor data, the method further includes: receiving biometric information sensor data uploaded by the robot; when the biometric information sensor data matches prestored biometric information, determining the identity permission of the user corresponding to the biometric information sensor data; and building a corresponding preset feedback instruction in response to that identity permission. An identity verification operation is thus set before the robot receives the audio/video stream or the relevant sensor data and performs the corresponding operation, so that the robot identifies the identity of its monitored object and carries out subsequent operations only after the corresponding permission has been confirmed. For example, facing a family member, the robot can, according to the identity permission of the prestored biometric information, obey orders and perform actions; for a temporary guest, the robot identifies that its prestored biometric information does not contain the guest's biometric information, and it will not obey the guest's instructions, or will perform actions only according to the instructions corresponding to a temporary guest, such as greeting. Preferably, the biometric information sensor data includes at least one of fingerprint information data, iris information data, or voice information data; the user's identity permission is determined accordingly by comparison with preset identity permission information, and the corresponding preset feedback permission is opened in response to that identity permission. Accurate identity verification can be obtained through biometric information, and the collection of such biometric information is quite convenient.
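The permission gate described above can be sketched as a lookup from a biometric match to a permission tier, with unmatched users falling back to a guest tier that only triggers preset feedback. The user keys, tier names, and the greeting fallback are all assumptions for the sketch.

```python
# Sketch of the identity check run before any command is accepted.
# Prestored users and permission tiers are invented for illustration.
PRESTORED_USERS = {
    "fingerprint:alice": "family",   # full instruction permission
    "voice:bob": "family",
    "iris:carol": "developer",       # may also modify instructions
}

def identity_permission(biometric_key):
    """Return the permission tier for a matched biometric, or 'guest'
    when the prestored data contains no match."""
    return PRESTORED_USERS.get(biometric_key, "guest")

def allowed(biometric_key, action):
    """A guest only triggers preset feedback such as a greeting;
    matched users may issue any instruction in this simplified model."""
    tier = identity_permission(biometric_key)
    if tier == "guest":
        return action == "greet"
    return True

print(allowed("voice:bob", "bring_cup"))      # True
print(allowed("voice:mallory", "bring_cup"))  # False
```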
Adapted to the foregoing method and based on modular thinking, one embodiment of the invention provides a robot interaction server, as shown in Fig. 2, comprising:
A receiver module 110, for receiving the audio/video stream of the robot's monitored object, acquired by its camera unit and uploaded by the robot, or the relevant sensor data. After the robot obtains data through its own audio/video recording channels and its various sensors, the data is uploaded to the cloud server through the receiver module 110.
A parsing module 120, for matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the audio/video stream or the sensor data. After the cloud server obtains the audio/video stream or sensor data through the receiver module 110, it parses and processes the data, and the analysis determines the behavior pattern expressed by the monitored object represented by the audio/video stream or the sensor data. Every kind of data message always contains some most basic characteristic data by which different behaviors are distinguished; based on the summarization and induction of the relevant technical field, these characteristic data can form a related characteristic database. A certain behavior is jointly formed by one or several items of characteristic data; in some cases, behaviors may even contain the same number and kinds of characteristic data, yet with different arrangement orders, and the behavior patterns they represent are then different. The characteristic data possessed by every kind of behavior are sorted out, and the various behaviors are combined into the same database to form the predefined behavior type library. When the cloud server parses out the characteristic data extracted from the audio/video stream or the sensor data, it searches for a match in the predefined behavior type library according to these characteristic data; once the behavior corresponding to the matching characteristic data is found, the behavior type corresponding to the audio/video stream or the sensor data can be determined.
A data generation module 130, for determining the machine behavior command corresponding to the behavior type from the preset correspondence database. After the behavior type corresponding to the audio/video stream or the sensor data is determined by the parsing module 120, the data generation module 130 performs a lookup in the cloud server database according to that behavior type; the lookup object is the preset correspondence database. The preset correspondence database contains behavior types and the machine behavior commands corresponding to them. When a certain behavior type in the preset correspondence database is matched, the data generation module 130 can obtain the corresponding machine behavior command and carry out the next step.
A sending module 140, for sending the behavior command to the robot to drive the robot to execute the machine behavior command, so that the robot makes the corresponding output behavior as feedback to its monitored object. After the machine behavior command is generated it is still on the cloud server, while the final purpose is to have the robot feed back a certain behavior to its monitored object; the behavior command obtained by the cloud server's processing therefore needs to be transmitted to the robot immediately, and the robot then executes the corresponding output behavior according to the machine behavior command, showing the corresponding feedback to its monitored object.
The robot interaction control method and server provided by one embodiment of the invention greatly raise the intelligence level of the robot and make human-machine interaction more convenient: through the robot's own audio, video, or sensor devices, some actions and speech, and even expressions, of its monitored object are analyzed, and the corresponding feedback behavior can be made, so that a person can interact with the robot in diverse forms.
Another embodiment of the present invention provides a robot interaction control method; as shown in Fig. 3, the interaction control method comprises the following steps:
S210: Obtaining the audio/video stream of the monitored object, acquired by the camera unit, or the relevant sensor data.
The development of modern computer technology is astonishing, and chips of small volume and powerful function emerge endlessly, so the robot itself can progressively be made the entity that directly receives information, processes information, and executes, with the cloud server as a supplement, which can greatly raise the efficiency of information processing. In the present embodiment, as the first step of realizing interaction, the robot first needs to obtain raw information as rich as possible. This raw information includes the user's various behaviors or instructions, namely the various behaviors or instructions of the robot's monitored object, including the audio/video content obtained through the camera installed on the robot, which appears in the machine equipment as an audio/video stream; in personified terms, this is the sound the robot "hears" and the images it "sees". For example, a microphone installed on the robot picks up the monitored object's voice, or an installed camera records a sequence of consecutive images of the monitored object's actions. The relevant sensor data are the data collected by the various sensors installed on the robot, such as distance sensors, temperature sensors, acceleration sensors, or smell sensors; these sensors can be selected effectively according to different occasions of use or functions. Correspondingly, these sensing devices are equivalent to the robot's "sensory" organs, and the acquired sensor data are its "sensations". After the robot obtains this raw data, it transmits the data to other processing modules for further processing; with the powerful data processing capability of the robot's own computer, this processing can be carried out quickly and accurately.
S220: Matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the audio/video stream or the sensor data.
After the robot obtains the audio/video stream or sensor data, it parses and processes the data, and the analysis determines the behavior pattern expressed by the monitored object represented by the audio/video stream or the sensor data. Every kind of data message always contains some most fundamental characteristic data by which different behaviors are distinguished; based on the summarization and induction of the relevant technical field, these characteristic data can form a related characteristic database. A certain behavior is jointly formed by one or several items of characteristic data; in some cases, behaviors may even contain the same number and kinds of characteristic data, yet with different arrangement orders, and the behavior patterns they represent are then different. The characteristic data possessed by every kind of behavior are sorted out, and the various behaviors are combined into the same database to form the predefined behavior type library. The robot parses out the characteristic data in the audio/video stream or sensor data, then searches for a match in the predefined behavior type library according to these characteristic data; once the behavior corresponding to the matching characteristic data is found, the behavior type corresponding to the audio/video stream or sensor data can be determined. It should be appreciated that some behavior actions contain more than one item of characteristic data, and the more numerous and complete the characteristic data, the more accurately the behavior action is described; the characteristic data of each behavior in the predefined behavior type library should therefore be continually updated and perfected, while the robot's ability to parse raw data should likewise be continually made more accurate, refined, and complete.
Preferably, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the audio/video stream is: extracting audio feature information or keyword information of the audio data in the audio/video stream, matching the feature information or keyword information against the prestored audio feature information or prestored keyword information in the predefined behavior type library, and determining the behavior type corresponding to the audio data accordingly. A segment of voice information may contain one or more items of keyword information such as time, place, person, object, or reason, and may also contain audio feature information such as sound frequency, pitch, and timbre; those skilled in the art can suitably extract such audio feature information or keyword information from the audio data in the audio/video stream. Once these audio features or keywords have been extracted, they are compared with the prestored data, that is, searched among the prestored audio feature information or prestored keyword information in the predefined behavior type library; once a match is found, the prestored behavior type corresponding to that prestored audio feature information or prestored keyword information is determined, so that the robot determines the behavior type corresponding to the audio data. For example, the robot hears its monitored object say "help me pass me the cup on the desk"; the robot can extract the key message "pass me the cup on the desk", or further the service object "me", the operation object "cup", the position information of the operation object "desk", the operation content "pass", and so on. The actual extraction of audio feature information or keyword information is not limited to this manner and can be carried out more appropriately according to actual needs. The robot then searches the predefined behavior type library according to these keywords and finds the corresponding prestored audio feature information or prestored keyword information, thereby determining the behavior type as "pass the cup to 'me'". With this technical scheme, even an elderly person or a child who does not know how to operate a computer can send instructions to the robot simply by speaking, thereby interacting and communicating with the robot.
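The slot extraction in the "pass me the cup on the desk" example can be sketched with small keyword vocabularies. The vocabularies and slot names are assumptions for the sketch; a real system would use proper speech and language processing.

```python
# Toy slot extraction for the worked example: pick out the service
# object, operation object, location, and operation from an utterance.
SERVICE_OBJECTS = {"me", "him", "her"}
OPERATION_OBJECTS = {"cup", "book", "phone"}
LOCATIONS = {"desk", "table", "shelf"}
OPERATIONS = {"pass", "bring", "take"}

def extract_slots(utterance):
    """Return the first word of the utterance found in each vocabulary."""
    words = utterance.lower().replace(",", "").split()

    def pick(vocab):
        return next((w for w in words if w in vocab), None)

    return {
        "service_object": pick(SERVICE_OBJECTS),
        "operation_object": pick(OPERATION_OBJECTS),
        "location": pick(LOCATIONS),
        "operation": pick(OPERATIONS),
    }

print(extract_slots("help me pass me the cup on the desk"))
# {'service_object': 'me', 'operation_object': 'cup',
#  'location': 'desk', 'operation': 'pass'}
```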
Preferably, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the audio/video stream is: extracting video feature frame information of the video data in the audio/video stream, matching the video feature frame information against the prestored video feature frame information in the predefined behavior type library, and determining the behavior type corresponding to the video data accordingly. A video consists of many static pictures within a certain period of time; in principle each picture is one frame, and consecutive frames form the video. Each picture has its characteristic features, such as the positions of basic figures in it, the shapes of its lines, and the distribution of light and shade; the technology for analyzing picture features is image recognition, so video feature frame information is extracted from the video data by image recognition, and such video feature frame information is usually many items of frame information. The database on the cloud server contains a preset behavior type library holding various behavior types, and under each behavior type directory are several prestored video feature frames (understandably, the more prestored frames, the more accurately the corresponding preset behavior type is described). When the cloud server detects that video feature frame information extracted from the audio/video stream matches the prestored video feature frame information under a certain behavior type directory, the behavior type corresponding to the video data in that audio/video stream is determined. For example, when the robot's camera captures a picture of its monitored object, a child, crying, the robot analyzes this video stream containing the crying content and extracts the video feature frame information in it; that information may include pictures of the child's face dropping tears or showing a crying expression. By comparing it with the prestored video feature frame information in the robot's own database and matching the prestored video feature frames of a child with crying facial characteristics, the robot correspondingly determines that the child is in a crying state. As another example, a certain monitored object makes a gesture toward the robot with the palm horizontal, the fingers together, and pointing to the left; the robot captures this information, analyzes out the video feature frame information of that gesture, searches its prestored behavior type library, and finds prestored video feature frame information with the same features under the prestored behavior type directory "walk left", so that it correspondingly judges the behavior type that the monitored object wishes it to move to the left. In sum, with this technical scheme, even an elderly person or a child who does not know how to operate a computer can send instructions to the robot through some actions of their own, thereby interacting and communicating with the robot.
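The gesture example can be sketched by reducing the extracted frames to discrete feature labels and looking them up under prestored behavior type directories. The labels and directory contents are invented for the sketch.

```python
# Sketch: a gesture reduced to a sequence of discrete frame labels,
# matched against prestored behavior type directories.
GESTURE_DIRECTORY = {
    "walk_left":  ("palm_flat", "fingers_together", "point_left"),
    "walk_right": ("palm_flat", "fingers_together", "point_right"),
    "stop":       ("palm_up", "fingers_spread", "point_forward"),
}

def gesture_behavior(frame_labels):
    """Match the extracted frame label sequence to a prestored gesture
    template; None when no behavior type directory matches."""
    for behavior_type, template in GESTURE_DIRECTORY.items():
        if tuple(frame_labels) == template:
            return behavior_type
    return None

print(gesture_behavior(["palm_flat", "fingers_together", "point_left"]))
# walk_left
```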
As another preferred scheme, the specific method of matching a corresponding behavior type from the predefined behavior type library according to the characteristic data extracted from the sensor data is as follows: extract the specific data fields or information meaning in the sensor data; match the specific data fields or information meaning against the pre-stored data fields or pre-stored information meanings in the predefined behavior type library, and determine the behavior type corresponding to the sensor data accordingly. Sensor data contains multiple types of data: over different periods there are idle-segment data (the sensor in standby) and working-segment data, and working-segment data necessarily includes time data, detection content data and intensity data (which measures the degree of the detected content, such as the magnitude of a force or the speed of a motion). The specific data fields or information meaning are extracted from the sensor data according to a suitable algorithm or processing method; technicians skilled in sensor data processing can implement the specific algorithm or processing method. The predefined behavior type library contains numerous behavior types, and each behavior type includes one or more corresponding pre-stored data fields or pre-stored information meanings. The robot retrieves the data fields or information meanings extracted from the sensor data among the pre-stored data fields or pre-stored information meanings; when a match is found, the behavior type corresponding to the sensor data can be determined. Different types of sensors are installed on the robot according to different functional requirements. For example, in a household it may be necessary to detect the room temperature or a family member's body temperature, which requires a temperature sensor; it may be necessary to detect whether natural gas is leaking in the kitchen, so a gas detection sensor can be installed; the robot may also need to accompany a certain family member and provide care, which requires following that family member, so at least a geographic information sensor is needed.
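The field-extraction-and-match procedure just described might be sketched as follows. The field names, threshold values and behavior-type entries are illustrative assumptions only, not the patent's actual data layout.

```python
# Predefined behavior type library: each behavior type carries the pre-stored
# fields a working-segment sensor record must match (hypothetical entries).
PREDEFINED_BEHAVIOR_TYPES = {
    "high_room_temperature": {"sensor": "temperature", "min_value": 30.0},
    "gas_leak": {"sensor": "gas", "min_value": 0.05},
}

def extract_fields(raw_reading):
    """Keep only the specific data fields relevant for matching."""
    return {"sensor": raw_reading["sensor"], "value": raw_reading["value"]}

def determine_behavior_type(raw_reading):
    """Match the extracted fields against the pre-stored fields."""
    fields = extract_fields(raw_reading)
    for behavior_type, pre_stored in PREDEFINED_BEHAVIOR_TYPES.items():
        if (fields["sensor"] == pre_stored["sensor"]
                and fields["value"] >= pre_stored["min_value"]):
            return behavior_type
    return None
```

A temperature reading above the pre-stored threshold resolves to one behavior type, a gas reading to another, and readings within normal range match nothing.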
S230: Determine the machine behavior command corresponding to the behavior type from the preset correspondence database.
After the behavior type corresponding to the audio/video stream or the sensor data is determined in step S220, a search is made in the robot's database according to that behavior type; the search object is the preset correspondence database. The preset correspondence database includes behavior types and the machine behavior commands corresponding to them. Once a certain behavior type in the preset correspondence database is matched, the corresponding machine behavior command can be obtained and the next step carried out. Preferably, the preset correspondence database includes preset behavior type data; numerous preset behavior type data records are grouped together into a preset behavior type data table, and each preset behavior type data record corresponds to a machine behavior command table. That is, there is a mapping relation between the preset behavior type data and the machine behavior commands. Of course, this mapping is not one-to-one: multiple different behavior types may all correspond to the same machine behavior command. For example, the speech behavior "take the cup" and a gesture expressing the same meaning may both finally yield the machine behavior command "bring the cup". The specific steps of determining the machine behavior command corresponding to the behavior type from the preset correspondence database are: using the behavior type as a keyword, retrieve the preset behavior type data table in the correspondence database, determine the matching preset behavior type, and generate the machine behavior command according to the mapping relation of that preset behavior type. Of course, the determination here can also be computed: the robot judges the keyword information and processes the related information to obtain the machine behavior command. For example, for the behavior "pass me the cup", the robot knows the specific object "cup" and the cup's position, knows the meaning of the object "me" and its position, and, judging from its own position and the geographic information of the various objects in the scene, computes a machine behavior command; it can then execute that machine behavior command and complete the related task. Implementations of greater depth can be realized by those skilled in the art through programming. The richer the preset behavior types or the preset correspondence database, the richer the behaviors the robot can feed back; therefore the preset behavior type library and/or the preset correspondence database need continual updating and supplementing. The updating or supplementing can be performed directly by the user through an interface provided on the robot, or by connecting to a cloud server and downloading updates, so that the data sources of the preset behavior type database and/or the preset correspondence database become richer and closer to reality. This technical scheme allows skilled persons to fully analyze the details of human behavior using internet big data; continually perfecting the preset behavior type database fully raises the robot's intelligence and makes its degree of anthropomorphism richer.
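Both routes just described — keyword retrieval in a preset correspondence table with a many-to-one mapping, and a computed command assembled from recognized keywords and scene positions — can be sketched in miniature. All table entries, object names and coordinates are hypothetical.

```python
# Route 1: table lookup. Several behavior types map to one machine command
# ("take cup" spoken or gestured both yield "bring_cup").
PRESET_BEHAVIOR_TYPES = {
    "speech_take_cup": "bring_cup",
    "gesture_take_cup": "bring_cup",
    "speech_sing_anthem": "play_national_anthem",
}

def machine_command_for(behavior_type):
    """Retrieve the mapped machine behavior command, keyed by behavior type."""
    return PRESET_BEHAVIOR_TYPES.get(behavior_type)

# Route 2: computed command. Resolve keywords to known scene positions and
# emit a fetch-and-deliver command (2-D positions, purely illustrative).
SCENE_OBJECTS = {"cup": (3.0, 1.0), "me": (0.0, 0.0)}

def computed_command(keywords, robot_pos):
    if "cup" in keywords and "me" in keywords:
        return {"action": "fetch_and_deliver",
                "pickup": SCENE_OBJECTS["cup"],
                "dropoff": SCENE_OBJECTS["me"],
                "start": robot_pos}
    return None
```

The lookup route suffices for fixed behaviors; the computed route handles commands like "pass me the cup", where positions must be resolved at run time.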
Preferably, the machine behavior command includes: a voice instruction, to drive the robot's voice playback interface to output voice; and/or a video instruction, to drive the robot's video playback interface to output video; and/or an action execution instruction, to drive the robot's motion units to output actions. Facing the various ways humans express themselves, the robot should likewise have various ways of expressing behavior. It may express itself with sound: for example, when a human asks the robot to play a certain piece of audio, issuing the instruction "let us sing the national anthem together", the robot receives the instruction and, after processing, outputs the machine command "play the national anthem", which drives the robot to open its sound-producing unit and play the corresponding audio. Or the robot captures some expression of a person, such as a smile; after analysis and processing it obtains the corresponding machine command, which can drive the robot's video playback interface to play a cheerful video or display a smiling expression, or it obtains a corresponding action execution instruction that can drive the robot's motion units, such as the hand, arm and palm machine components, to perform a clapping action. These types of machine commands may run in isolation or simultaneously. For example, when a human says to the robot "let us dance together", the robot, after judging this voice information data, issues a machine behavior command that makes the robot's sound system play music, makes the video playback interface (such as a display screen) play a corresponding dance picture or show an expression, and drives the robot's related motion units, such as the leg motion unit and hand motion unit, to move together with the music. The content of these machine behavior commands can be realized by those skilled in the art through programming.
S240: Drive the robot's relevant interfaces or units according to the machine behavior command to make the corresponding output behavior, as feedback to its supervision object.
After the robot determines the machine behavior command, the command still needs to be passed to its various components for execution, such as the sound playback interface, the video playback interface and/or the related motion components (for example, various servo motors). Execution of a behavior command may directly follow the code included in the detailed behavior command, for example a simple command to play a piece of music or a piece of video; or it may rely on the robot's own computing capability, with the built-in computer calculating the actual executable code for each component according to the behavior command. For example, when the behavior command is "go to the desk in the bedroom", the robot calculates, judges and executes according to its own position, the position of the bedroom, the position of the desk, the layout of the whole room it is in, and the distances between these objects.
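A minimal sketch of handing one machine behavior command to several components at once (a command may carry voice, video and action parts that run together) could look like the following; the component names, command format and logging scheme are assumptions for illustration.

```python
class Robot:
    """Toy stand-in for the robot's playback interfaces and motion units."""

    def __init__(self):
        self.log = []  # records what each component was asked to do

    def play_audio(self, clip):
        self.log.append(("audio", clip))

    def play_video(self, clip):
        self.log.append(("video", clip))

    def move(self, unit, motion):
        self.log.append(("motion", unit, motion))

    def execute(self, command):
        """Dispatch each part of the command to its component."""
        if "audio" in command:
            self.play_audio(command["audio"])
        if "video" in command:
            self.play_video(command["video"])
        for unit, motion in command.get("actions", []):
            self.move(unit, motion)
```

A dance-style command would then drive the sound system, the display and the leg and hand motion units in one dispatch.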
As a preferred scheme, when the sensor data is geographic information sensor data, the robot processes it as follows: extract the geographic information data in the sensor data, match it against the geographic information data pre-stored in the robot, determine the behavior type characterized by that geographic information data, and determine the machine behavior command corresponding to the behavior type from the preset correspondence database. The machine behavior command includes speed information, direction information and distance information; the behavior command is sent to the robot to drive it to execute the machine behavior command, making the robot perform the corresponding output behavior and change the robot's geographic information relative to its supervision object. For example, when the robot is needed as the companion of a certain family member (usually an elderly person or a child), the robot is set to stay within a certain distance range of its set supervision object; when the supervision object moves, the robot also moves accordingly, again keeping the set distance range from the supervision object. When the supervision object moves beyond the set distance range, the geographic information sensor carried by the robot obtains the supervision object's geographic information, such as the spacing between them, longitude and latitude, or absolute or relative coordinates. The robot performs the corresponding processing or calculation and obtains the machine behavior commands it needs to execute; these behavior commands include the required speed of movement, direction of movement, and distance or path of movement. Finally, the robot changes its own geographic information (its position or distance relative to the supervision object) with the movement of the supervision object, continuing to keep its distance from the supervision object within a certain range, and thus completing the companionship of the supervision object.
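The distance-keeping behavior just described can be sketched in one dimension. The keep-distance range, the two speeds and the overshoot threshold below are invented for illustration and are not values from the patent.

```python
def follow_command(robot_pos, object_pos, keep_distance=1.5,
                   walk_speed=0.5, run_speed=1.5):
    """Return None while the supervision object is inside the set range,
    otherwise a machine behavior command with speed, direction and distance."""
    gap = object_pos - robot_pos
    if abs(gap) <= keep_distance:
        return None  # already within the companionship range
    overshoot = abs(gap) - keep_distance
    # Move faster when the object has gotten far ahead (e.g. a running child).
    speed = run_speed if overshoot > 2.0 else walk_speed
    return {"direction": 1 if gap > 0 else -1,
            "distance": overshoot,
            "speed": speed}
```

Each sensor update yields either no command (in range) or a speed/direction/distance command that restores the set spacing.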
As another preferred scheme, the sensor data is touch sensor data: extract the touch operation information in the touch sensor data and match it against the pre-stored touch operation information in the predefined behavior type library, and determine the behavior type corresponding to the touch sensor data accordingly. A touch sensor is set on the robot; today's touch screen, for example, is a very typical touch sensing device, on which some commonly used instruction options can be set. After the robot's supervision object selects a certain instruction on the touch screen, the robot extracts the touch operation information it represents and generates the corresponding behavior type, then determines the corresponding machine behavior command according to the behavior type and hands it to the robot's related components for execution. This technical scheme can reduce the difficulty of issuing instructions, lighten the burden of the robot's built-in computer, make the robot's data processing faster and its instruction output more accurate, and finally make the robot's feedback to the monitored object's instructions faster.
Preferably, since a robot is usually a valuable piece of equipment, identity recognition is a function it generally needs to possess. From development through application, a robot comes into contact with many people: in the debugging stage after assembly is finished, it must contact developers, commissioning staff and so on, and in use it will again encounter users of different identities. These persons should have different permission levels with respect to the robot, and within a given permission level they may correspondingly hold the instruction modification authority, system update authority or usage authority of that level. Therefore, before the step of receiving the audio/video stream of the robot's supervision object obtained by its camera unit, or the related sensor data, uploaded by the robot, the method further includes: receiving biometric information sensor data uploaded by the robot; when the biometric information sensor data matches the pre-stored biometric information, determining the identity authority of the user corresponding to that biometric information sensor data; and generating the corresponding preset feedback instruction in response to that identity authority. An identity verification operation is thus set before the robot receives the audio/video stream or related sensor data and performs the corresponding behavior, so that the robot identifies the identity of its supervision object and carries out subsequent operations only after the corresponding authority is confirmed. For example, facing a family member, the robot can obey commands and perform actions according to the identity authority of the pre-stored biometric information; whereas for a temporary guest, the robot recognizes that its pre-stored biometric information does not include the guest's biometric information, and will not obey the guest's instructions, or will perform actions according to preset instructions corresponding to guests, such as greeting them. Preferably, the biometric information sensor data includes at least one of fingerprint information data, iris information data or voice information data, which is compared with preset identity authority information to determine the user's identity authority, and the corresponding preset feedback authority is opened in response to that identity authority. Accurate identity verification operations can be obtained through biometric information, and the collection of such biometric information is quite convenient.
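The identity-gating step might be sketched as follows. The pre-stored biometric records, the permission levels and the fallback greeting behavior are purely illustrative assumptions.

```python
# Pre-stored biometric records: sample -> (user, permission level).
PRESTORED_BIOMETRICS = {
    "fp:1a2b": ("father", "full_control"),
    "iris:9f": ("child", "companion_only"),
}

def identity_authority(biometric_sample):
    """Return (user, permission level), or the guest fallback when unknown."""
    return PRESTORED_BIOMETRICS.get(biometric_sample, ("stranger", "greet_only"))

def feedback_instruction(biometric_sample, requested):
    """Gate a requested command by the identity authority of the requester."""
    user, authority = identity_authority(biometric_sample)
    if authority == "full_control":
        return requested
    if authority == "companion_only" and requested != "system_update":
        return requested
    return "polite_greeting"  # preset instruction for unauthorized requests
```

A family member with full authority can trigger any command, a child cannot trigger system updates, and an unrecognized guest only receives the preset greeting behavior.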
Adapted to the foregoing method and based on a modular design, another embodiment of the present invention provides a robot, as shown in Fig. 4, including:
An acquisition module 210, used to get the audio/video stream of the supervision object obtained by the camera unit, or the related sensor data.
An analysis module 220, used to match a corresponding behavior type from the predefined behavior type library according to characteristic data extracted from the audio/video stream or the sensor data, and then determine the machine behavior command corresponding to the behavior type from the preset correspondence database.
An execution module 230, used to drive the robot's relevant interfaces or units according to the behavior command to make the corresponding output behavior, as feedback to its supervision object.
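The three modules above form a simple acquisition-analysis-execution pipeline, which can be sketched as follows; the behavior table, the data format and the return values are illustrative stand-ins for the actual module interfaces.

```python
class AcquisitionModule:
    """Stands in for reading the camera unit / sensors."""
    def get(self, raw):
        return raw

class AnalysisModule:
    """Maps a recognized behavior type to a machine behavior command."""
    BEHAVIORS = {"crying": "comfort"}  # hypothetical correspondence entry
    def command_for(self, data):
        return self.BEHAVIORS.get(data.get("behavior"))

class ExecutionModule:
    """Drives the relevant interface or unit for the command."""
    def run(self, command):
        return "executed:" + command if command else "no-op"

def interact(raw):
    """One pass through the pipeline of Fig. 4."""
    data = AcquisitionModule().get(raw)
    command = AnalysisModule().command_for(data)
    return ExecutionModule().run(command)
```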
The robot interaction method and robot provided by this further embodiment of the present invention can better realize the interaction between people and robots, ensuring not only that the robot has a higher level of intelligence but also that it handles problems more efficiently, and making the robot's degree of anthropomorphism higher as well. In addition, by making reasonable use of the advantages of cloud big data, the robot continually updates the categories and quantities of data it uses to judge and respond to behavior, so that the problems it can handle are broader, deeper and more accurately addressed, finally realizing good interaction between people and robots.
To make the implementation of the invention easier for those skilled in the art to understand, the following examples narrate how the interaction between people and the robot is completed in actual scenes.
Scene one: in a household with a good network connection, a child says to the robot beside it: "I want to drink water." The robot captures this voice information through the microphone installed on its head and immediately sends it to the cloud server over the network. The cloud server sends the robot a behavior command to fill a cup of water for the child. After receiving it, the robot drives its various motion components to operate according to this behavior command, finally reaching the water dispenser or drinking bottle, getting a cup of water, and delivering it to the child's side.
Scene two: in a household with a good network connection, a child gets off the chair in the bedroom and goes to the sofa in the living room. The robot acting as its companion detects the change in the child's position: its spacing from the child has exceeded a certain range. The robot immediately transmits this information to the cloud server; after processing, the server judges that the child's position has changed and sends a behavior command to the robot, whose components move according to the command to follow the child. If the child ran to the living room, the robot's traveling speed is relatively large; if the child crawled there, its traveling speed is relatively small, until the spacing between the two again falls within the set range.
Scene three: in a household with a good network connection, a guest suddenly arrives at the home one day. The robot, as a member of the household, captures the guest's facial image through its camera; this facial image is immediately uploaded to the server, which identifies the guest's identity by comparison and sends a behavior command to the robot. If the guest is an acquaintance, the behavior command includes a reception action, guiding the guest to rest in a designated area; if the guest is a stranger, the behavior command includes a polite greeting phrase played through the audio playback interface, while simultaneously starting to record a series of the stranger's action information.
Or a process such as the following:
In the household, the robot is the companion of a child. When the child is outdoors, the robot measures the spacing between itself and the child and the child's movement speed, and determines its own movement speed and its spacing from the child. When the child points at a flower and asks "What flower is this?", the microphone carried by the robot receives this voice information, and the robot judges that the child wants to obtain information. It then observes the child's pointing through its camera unit, determines the object the child is asking about, looks up the information related to that object in its own database, and explains to the child, in the form of speech playback, the relevant information about the flower being pointed at, such as its name and species.
Those skilled in the art can appreciate that each block of these structural diagrams and/or block diagrams and/or flow diagrams, and combinations of blocks therein, can be implemented by computer program instructions. Those skilled in the art can appreciate that these computer program instructions can be supplied to the processor of a general-purpose computer, a special-purpose computer or another programmable data processing method to be realized, so that the schemes specified in a block or blocks of the structural diagrams and/or block diagrams and/or flow diagrams disclosed by the present invention are executed by the processor of the computer or other programmable data processing method.

Those skilled in the art can appreciate that the steps, measures and schemes in the various operations, methods and flows discussed in the present invention can be alternated, changed, combined or deleted. Further, other steps, measures and schemes in the various operations, methods and flows discussed in the present invention can also be alternated, changed, rearranged, decomposed, combined or deleted. Further, the steps, measures and schemes in the prior art corresponding to the various operations, methods and flows disclosed in the present invention can also be alternated, changed, rearranged, decomposed, combined or deleted.
The above are only some embodiments of the present invention. It should be noted that, for persons of ordinary skill in the art, several improvements and modifications can also be made without departing from the principles of the invention, and these improvements and modifications should also be regarded as within the protection scope of the present invention. In summary, the technical solutions provided by the present invention are as described below:
A1. A robot interaction control method, comprising the following steps:

receiving the audio/video stream of the robot's supervision object obtained by its camera unit, or related sensor data, uploaded by the robot;

matching a corresponding behavior type from a predefined behavior type library according to characteristic data extracted from the audio/video stream or the sensor data;

determining the machine behavior command corresponding to the behavior type from a preset correspondence database;

sending the machine behavior command to the robot to drive it to execute the machine behavior command, making the robot perform the corresponding output behavior, as feedback to its supervision object.
A2. The robot interaction control method according to A1, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to characteristic data extracted from the audio/video stream includes:

extracting the audio feature information or keyword information of the audio data in the audio/video stream;

matching the audio feature information or keyword information against the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library, and determining the behavior type corresponding to the audio data accordingly.
A3. The robot interaction control method according to A1, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to characteristic data extracted from the audio/video stream includes:

extracting the video feature frame information of the video data in the audio/video stream;

matching the video feature frame information against the pre-stored video feature frame information in the predefined behavior type library, and determining the behavior type corresponding to the video data accordingly.
A4. The robot interaction control method according to A3, wherein the video feature frame information consists of pictorial information recording behavior actions, and a certain number of consecutive pieces of pictorial information within a predetermined time constitute the video feature frame information.
A5. The robot interaction control method according to A1, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to the sensor data includes:

extracting the specific data fields or information meaning in the sensor data;

matching the specific data fields or information meaning against the pre-stored data fields or pre-stored information meanings in the predefined behavior type library, and determining the behavior type corresponding to the sensor data accordingly.
A6. The robot interaction control method according to A5, wherein the sensor data is geographic information sensor data; the geographic information data in the geographic information sensor data is extracted and matched against pre-stored geographic information data, the behavior type characterized by the geographic information sensor data is determined accordingly, and the machine behavior command corresponding to the behavior type is determined from the preset correspondence database; the machine behavior command includes speed information, direction information and distance information; the behavior command is sent to the robot to drive it to execute the machine behavior command, making the robot perform the corresponding output behavior and change the robot's geographic information relative to its supervision object.
A7. The robot interaction control method according to A5, wherein the sensor data is touch sensor data; the touch operation information in the touch sensor data is extracted and matched against the pre-stored touch operation information in the predefined behavior type library, and the behavior type corresponding to the touch sensor data is determined accordingly.
A8. The robot interaction control method according to A1, wherein the preset correspondence database includes a preset behavior type data table, and a machine behavior command table having a mapping relation with the preset behavior type data table; the specific steps of determining the machine behavior command corresponding to the behavior type from the preset correspondence database are:

retrieving the preset behavior type data table in the correspondence database with the behavior type as a keyword, determining the matching preset behavior type, and generating the machine behavior command according to the mapping relation of the preset behavior type.
A9. The robot interaction control method according to A1, wherein the machine behavior command includes:

a voice instruction, to drive the robot's voice playback interface to output voice; and/or

a video instruction, to drive the robot's video playback interface to output video; and/or

an action execution instruction, to drive the robot's motion unit to output an action.
A10. The robot interaction control method according to A1, further including, before the step of receiving the audio/video stream of the robot's supervision object obtained by its camera unit or the related sensor data uploaded by the robot: receiving biometric information sensor data uploaded by the robot; when the biometric information sensor data matches the pre-stored biometric information, determining the identity authority of the user corresponding to the biometric information sensor data, and generating the corresponding preset feedback instruction in response to the identity authority.
A11. The robot interaction control method according to A10, wherein the biometric information sensor data includes at least one of fingerprint information data, iris information data or voice information data, which is compared with preset identity authority information to determine the user's identity authority, and the corresponding preset feedback authority is opened in response to the identity authority.
A12. The robot interaction control method according to A1, wherein the preset behavior type library and/or the preset correspondence database is set or updated by a cloud server or by a user associated with the cloud server.
A13. The robot interaction control method according to A1, wherein the upload is performed in real time.
B14. A robot interaction control method, comprising the following steps:

getting the audio/video stream of the supervision object obtained by the camera unit, or related sensor data;

matching a corresponding behavior type from a predefined behavior type library according to characteristic data extracted from the audio/video stream or the sensor data;

determining the machine behavior command corresponding to the behavior type from a preset correspondence database;

driving the robot's relevant interfaces or units according to the machine behavior command to make the corresponding output behavior, as feedback to its supervision object.
B15. The robot interaction control method according to B14, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to characteristic data extracted from the audio/video stream includes:

extracting the audio feature information or keyword information of the audio data in the audio/video stream;

matching the audio feature information or keyword information against the pre-stored audio feature information or pre-stored keyword information in the predefined behavior type library, and determining the behavior type corresponding to the audio data accordingly.
B16. The robot interaction control method according to B14, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to characteristic data extracted from the audio/video stream includes:

extracting the video feature frame information of the video data in the audio/video stream;

matching the video feature frame information against the pre-stored video feature frame information in the predefined behavior type library, and determining the behavior type corresponding to the video data accordingly.
B17. The robot interaction control method according to B14, wherein the step of matching a corresponding behavior type from the predefined behavior type library according to the sensor data includes:

extracting the specific data fields or information meaning in the sensor data;

matching the specific data fields or information meaning against the pre-stored data fields or pre-stored information meanings in the predefined behavior type library, and determining the behavior type corresponding to the sensor data accordingly.
B18. The robot interaction control method according to B17, wherein the sensor data is geographic information sensor data; the geographic information data in the geographic information sensor data is extracted and matched against pre-stored geographic information data, the behavior type characterized by the geographic information sensor data is determined accordingly, and the machine behavior command corresponding to the behavior type is determined from the preset correspondence database; the machine behavior command includes speed information, direction information and distance information; the behavior command is sent to the robot to drive it to execute the machine behavior command, making the robot perform the corresponding output behavior and change the robot's geographic information relative to its supervision object.
B19. The robot interaction control method according to B17, wherein the sensor data is touch sensor data; the touch operation information in the touch sensor data is extracted and matched against the pre-stored touch operation information in the predefined behavior type library, and the behavior type corresponding to the touch sensor data is determined accordingly.
B20. The robot interaction control method according to B14, wherein the preset correspondence database includes a preset behavior type data table, and a machine behavior command table having a mapping relation with the preset behavior type data table; the specific steps of determining the machine behavior command corresponding to the behavior type from the preset correspondence database are:

retrieving the preset behavior type data table in the correspondence database with the behavior type as a keyword, determining the matching preset behavior type, and generating the machine behavior command according to the mapping relation of the preset behavior type.
B21. The robot interaction control method according to B14, wherein the machine behavior command includes:

a voice instruction, to drive the robot's voice playback interface to output voice; and/or

a video instruction, to drive the robot's video playback interface to output video; and/or

an action execution instruction, to drive the robot's motion unit to output an action.
B22. The robot interaction control method according to B14, further including, before the step of getting the audio/video stream of the supervision object obtained by the camera unit or the related sensor data: receiving biometric information sensor data; when the biometric information sensor data matches the pre-stored biometric information, determining the identity authority of the user corresponding to the biometric information sensor data, and generating the corresponding preset feedback instruction in response to the identity authority.
B23. The robot interaction control method according to B22, wherein the biometric information sensor data includes at least one of fingerprint information data, iris information data or voice information data, which is compared with preset identity authority information to determine the user's identity authority, and the corresponding preset feedback authority is opened in response to the identity authority.
B24. The robot interaction control method according to B14, wherein the preset behavior type library and/or the preset correspondence database is set by the user or downloaded from a cloud server.
B25. The robot interaction control method according to B24, wherein the download is carried out over Bluetooth, a Wi-Fi network, a mobile data network or a wired network.
C26. A robot interaction server, comprising:
a receiving module, for receiving the audio/video stream or related sensor data of the monitored object, acquired by the robot's camera unit and uploaded by the robot;
a parsing module, for matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data;
a data generation module, for determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
a sending module, for sending the behavior instruction to the robot to drive the robot to execute the machine behavior instruction, so that the robot produces a corresponding output behavior as feedback to its monitored object.
D27. A robot, comprising:
an acquisition module, for obtaining the audio/video stream or related sensor data of the monitored object acquired by the camera unit;
an analysis module, for matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data, and then determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
an execution module, for driving the relevant robot interface or unit according to the behavior instruction to produce a corresponding output behavior as feedback to its monitored object.

Claims (10)

1. A robot interaction control method, characterized in that it comprises the following steps:
receiving the audio/video stream or related sensor data of the monitored object, acquired by the robot's camera unit and uploaded by the robot;
matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data;
determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
sending the machine behavior instruction to the robot to drive the robot to execute it, so that the robot produces a corresponding output behavior as feedback to its monitored object.
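The four steps of claim 1 form a server-side pipeline: receive upload, extract features, match a behavior type, look up and return the instruction. The sketch below is a minimal, hypothetical rendering of that flow; the keyword-overlap "feature extraction", the library contents, and all names are illustrative assumptions rather than the patent's actual implementation.

```python
# Hypothetical server-side pipeline for claim 1. The toy feature extraction
# (keywords from a transcript) and the table contents are invented examples.

def handle_upload(payload, behavior_lib, command_db):
    """Return the machine behavior instruction for one uploaded sample."""
    # Step 1: "extract" feature data; here, lowercase keywords from audio text.
    features = set(payload.get("audio_text", "").lower().split())
    # Step 2: match a behavior type from the preset library by keyword overlap.
    behavior = next((b for b, kws in behavior_lib.items() if features & kws), None)
    # Step 3: look up the corresponding machine behavior instruction.
    return command_db.get(behavior)

behavior_lib = {"cry_comfort": {"cry", "crying"}, "greeting": {"hello", "hi"}}
command_db = {
    "cry_comfort": {"voice": "lullaby.wav"},
    "greeting": {"voice": "hello.wav", "action": "wave"},
}

print(handle_upload({"audio_text": "the baby is crying"}, behavior_lib, command_db))
```

Step 4 of the claim, sending the instruction back to the robot, would simply transmit the returned dictionary over the network link.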
2. The robot interaction control method according to claim 1, characterized in that the step of matching a corresponding behavior type from the preset behavior type library according to feature data extracted from the audio/video stream comprises:
extracting audio feature information or keyword information from the audio data in the audio/video stream;
matching the feature information or keyword information against the pre-stored audio feature information or pre-stored keyword information in the preset behavior type library, and thereby determining the behavior type corresponding to the audio data.
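For the audio-feature branch of claim 2, one plausible reading is nearest-neighbor matching of an extracted feature vector against the pre-stored vectors in the behavior type library. The vectors, threshold, and behavior names below are invented for illustration; the patent does not specify a particular feature or distance metric.

```python
import math

# Hypothetical audio-feature matching for claim 2: the extracted feature
# vector is compared against pre-stored vectors, and the nearest one wins
# if it is close enough. Feature values here are made-up 2-D examples.

def match_audio_behavior(feature_vec, prestored, max_dist=1.0):
    """Return the behavior type of the nearest pre-stored audio feature."""
    best, best_d = None, float("inf")
    for behavior, ref in prestored.items():
        d = math.dist(feature_vec, ref)  # Euclidean distance (Python 3.8+)
        if d < best_d:
            best, best_d = behavior, d
    return best if best_d <= max_dist else None

prestored = {"crying": (0.9, 0.1), "laughing": (0.1, 0.9)}
```

The keyword branch of the claim would replace the distance test with a lookup of recognized words against pre-stored keyword sets, as in the claim 1 sketch.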
3. The robot interaction control method according to claim 1, characterized in that the step of matching a corresponding behavior type from the preset behavior type library according to feature data extracted from the audio/video stream comprises:
extracting video feature frame information from the video data in the audio/video stream;
matching the video feature frame information against the pre-stored video feature frame information in the preset behavior type library, and thereby determining the behavior type corresponding to the video data.
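Claim 3's feature-frame matching can be sketched as checking whether any pre-stored key-frame signature appears among the signatures extracted from the incoming video. The frame "hashes" and behavior names below are placeholders; a real system would use perceptual hashes or learned embeddings, which the claim leaves open.

```python
# Hypothetical video feature-frame matching for claim 3: each behavior type
# in the library stores a key-frame signature, and the stream matches a
# behavior when that signature appears among its extracted frame signatures.

def match_video_behavior(frame_sigs, prestored_sigs):
    """Return the behavior whose pre-stored key-frame signature is in the stream."""
    for behavior, ref_sig in prestored_sigs.items():
        if ref_sig in frame_sigs:
            return behavior
    return None

# Placeholder signatures; real ones would be perceptual hashes or embeddings.
prestored_sigs = {"waving": "a3f1", "falling": "9c07"}
```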
4. The robot interaction method according to claim 1, characterized in that the step of matching a corresponding behavior type from the preset behavior type library according to the sensor data comprises:
extracting specific data fields or information meanings from the sensor data;
matching the specific data fields or information meanings against the pre-stored data fields or pre-stored information meanings in the preset behavior type library, and thereby determining the behavior type corresponding to the sensor data.
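One way to read claim 4's field matching is as range rules over named sensor fields: each behavior type in the library pre-stores a field name and the value range that signifies it. The field names, ranges, and behavior types below are illustrative assumptions only.

```python
# Hypothetical sensor-field matching for claim 4: a reading matches a
# behavior type when a pre-stored field falls inside that type's pre-stored
# range. Rule contents are invented for this example.

def match_sensor_behavior(reading, rules):
    """Map a sensor reading to a behavior type via pre-stored field ranges."""
    for behavior, (field, lo, hi) in rules.items():
        value = reading.get(field)
        if value is not None and lo <= value <= hi:
            return behavior
    return None

rules = {
    "fever_alert": ("temperature_c", 38.0, 42.0),
    "sleeping": ("motion_level", 0.0, 0.1),
}
```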
5. The robot interaction method according to claim 1, characterized in that the preset correspondence database comprises a preset behavior type data table and a machine behavior instruction table having mapping relations with the preset behavior type data table; the step of determining the machine behavior instruction corresponding to the behavior type from the preset correspondence database specifically comprises:
retrieving the preset behavior type data table in the correspondence database using the behavior type as a keyword, determining the matching preset behavior type, and generating the machine behavior instruction by mapping according to the mapping relations of the preset behavior type.
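The two-table layout of claim 5 can be sketched as a behavior-type table whose rows map, via a type id, into a machine-behavior instruction table. The ids, table contents, and function name are invented for illustration; the patent only specifies that the two tables stand in a mapping relation.

```python
# Hypothetical two-table correspondence database for claim 5. The behavior
# type table is keyed by behavior type; its rows map (via a type id) into
# the machine behavior instruction table. All contents are made-up examples.

behavior_type_table = {"cry_comfort": "T01", "greeting": "T02"}  # type -> id
instruction_table = {
    "T01": {"voice": "lullaby.wav"},
    "T02": {"voice": "hello.wav", "action": "wave"},
}

def generate_instruction(behavior_type):
    """Keyword-retrieve the type table, then follow the mapping relation."""
    type_id = behavior_type_table.get(behavior_type)
    return instruction_table.get(type_id) if type_id else None
```

In a relational store, the same lookup would be a join between the two tables on the type id.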
6. A robot interaction control method, characterized in that it comprises the following steps:
obtaining the audio/video stream or related sensor data of the monitored object acquired by the camera unit;
matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data;
determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
driving the relevant robot interface or unit according to the machine behavior instruction to produce a corresponding output behavior as feedback to its monitored object.
7. The robot interaction control method according to claim 6, characterized in that the preset correspondence database comprises a preset behavior type data table and a machine behavior instruction table having mapping relations with the preset behavior type data table; the step of determining the machine behavior instruction corresponding to the behavior type from the preset correspondence database specifically comprises:
retrieving the preset behavior type data table in the correspondence database using the behavior type as a keyword, determining the matching preset behavior type, and generating the machine behavior instruction by mapping according to the mapping relations of the preset behavior type.
8. The robot interaction control method according to claim 6, characterized in that, before the step of obtaining the audio/video stream or related sensor data of the monitored object acquired by the camera unit, it further comprises: receiving biometric sensor data; when the biometric sensor data matches pre-stored biometric information, determining the identity authority of the user corresponding to the biometric sensor data, and constructing the corresponding preset feedback instruction in response to that identity authority.
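Claim 8's pre-step gates the interaction pipeline on a biometric match and derives the user's identity authority from it. The sketch below uses exact equality as a stand-in for real biometric matching, which is fuzzy; the enrolled templates, authority labels, and function name are all assumptions made for illustration.

```python
# Hedged sketch of claim 8's biometric gate: the received biometric sample
# is matched against pre-stored templates, and the matching user's identity
# authority is returned. Exact string equality is a placeholder for the
# fuzzy matching a real fingerprint/iris/voiceprint system would use.

def authorize(sample, enrolled):
    """Return the identity authority of the matching enrolled user, else None."""
    for user, (template, authority) in enrolled.items():
        if sample == template:  # placeholder for fuzzy biometric matching
            return authority
    return None

enrolled = {"alice": ("fp_a1", "guardian"), "bob": ("fp_b2", "child")}
```

The returned authority would then select which preset feedback instructions the method of claim 6 is permitted to construct for that user.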
9. A robot interaction server, characterized in that it comprises:
a receiving module, for receiving the audio/video stream or related sensor data of the monitored object, acquired by the robot's camera unit and uploaded by the robot;
a parsing module, for matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data;
a data generation module, for determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
a sending module, for sending the behavior instruction to the robot to drive the robot to execute the machine behavior instruction, so that the robot produces a corresponding output behavior as feedback to its monitored object.
10. A robot, characterized in that it comprises:
an acquisition module, for obtaining the audio/video stream or related sensor data of the monitored object acquired by the camera unit;
an analysis module, for matching a corresponding behavior type from a preset behavior type library according to feature data extracted from the audio/video stream or the sensor data, and then determining the machine behavior instruction corresponding to the behavior type from a preset correspondence database;
an execution module, for driving the relevant robot interface or unit according to the behavior instruction to produce a corresponding output behavior as feedback to its monitored object.
CN201710013365.1A 2017-01-09 2017-01-09 Robot interaction control method, server and robot Active CN106873773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710013365.1A CN106873773B (en) 2017-01-09 2017-01-09 Robot interaction control method, server and robot


Publications (2)

Publication Number Publication Date
CN106873773A true CN106873773A (en) 2017-06-20
CN106873773B CN106873773B (en) 2021-02-05

Family

ID=59164756

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710013365.1A Active CN106873773B (en) 2017-01-09 2017-01-09 Robot interaction control method, server and robot

Country Status (1)

Country Link
CN (1) CN106873773B (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107221332A (en) * 2017-06-28 2017-09-29 上海与德通讯技术有限公司 The exchange method and system of robot
CN107463626A (en) * 2017-07-07 2017-12-12 深圳市科迈爱康科技有限公司 A kind of voice-control educational method, mobile terminal, system and storage medium
CN107770271A (en) * 2017-10-20 2018-03-06 南方电网科学研究院有限责任公司 Cluster robot cloud control method, device and system
CN107807734A (en) * 2017-09-27 2018-03-16 北京光年无限科技有限公司 A kind of interaction output intent and system for intelligent robot
CN107908429A (en) * 2017-08-10 2018-04-13 广州真诺电子科技有限公司 Human-computer interaction and programing system applied to robot software engineer
CN108124008A (en) * 2017-12-20 2018-06-05 山东大学 A kind of old man under intelligent space environment accompanies and attends to system and method
CN108491790A (en) * 2018-03-20 2018-09-04 上海乐愚智能科技有限公司 A kind of determination method, apparatus, storage medium and the robot of object
CN108780361A (en) * 2018-02-05 2018-11-09 深圳前海达闼云端智能科技有限公司 Human-computer interaction method and device, robot and computer readable storage medium
CN109165057A (en) * 2017-06-28 2019-01-08 华为技术有限公司 A kind of method and apparatus that smart machine executes task
CN109166386A (en) * 2018-10-25 2019-01-08 重庆鲁班机器人技术研究院有限公司 Children's logical thinking supplemental training method, apparatus and robot
CN109189363A (en) * 2018-07-24 2019-01-11 上海常仁信息科技有限公司 A kind of robot of human-computer interaction
CN110135317A (en) * 2019-05-08 2019-08-16 深圳达实智能股份有限公司 Behavior monitoring and management system and method based on cooperated computing system
CN110415688A (en) * 2018-04-26 2019-11-05 杭州萤石软件有限公司 Information interaction method and robot
CN110505309A (en) * 2019-08-30 2019-11-26 苏州博众机器人有限公司 Network communication method, device, equipment and storage medium
CN110497404A (en) * 2019-08-12 2019-11-26 安徽云探索网络科技有限公司 A kind of robot bionic formula intelligent decision system
CN110695989A (en) * 2019-09-20 2020-01-17 浙江树人学院(浙江树人大学) Audio-visual interaction system for intelligent robot and interaction control method thereof
WO2020077631A1 (en) * 2018-10-19 2020-04-23 深圳配天智能技术研究院有限公司 Method for controlling robot, server, storage medium and cloud service platform
CN111079116A (en) * 2019-12-29 2020-04-28 钟艳平 Identity recognition method and device based on simulation cockpit and computer equipment
CN111860231A (en) * 2020-07-03 2020-10-30 厦门欧准卫浴有限公司 Universal water module based on household occasions
CN112017030A (en) * 2020-09-01 2020-12-01 中国银行股份有限公司 Intelligent service method and device for bank outlets
CN113625662A (en) * 2021-07-30 2021-11-09 广州玺明机械科技有限公司 Rhythm dynamic control system for data acquisition and transmission of beverage shaking robot
TWI759039B (en) * 2020-03-31 2022-03-21 大陸商北京市商湯科技開發有限公司 Methdos and apparatuses for driving interaction object, devices and storage media

Citations (9)

Publication number Priority date Publication date Assignee Title
CN105320872A (en) * 2015-11-05 2016-02-10 上海聚虹光电科技有限公司 Robot operation authorization setting method based on iris identification
CN105468145A (en) * 2015-11-18 2016-04-06 北京航空航天大学 Robot man-machine interaction method and device based on gesture and voice recognition
CN105912128A (en) * 2016-04-29 2016-08-31 北京光年无限科技有限公司 Smart robot-oriented multimodal interactive data processing method and apparatus
CN106055105A (en) * 2016-06-02 2016-10-26 上海慧模智能科技有限公司 Robot and man-machine interactive system
CN106200962A (en) * 2016-07-08 2016-12-07 北京光年无限科技有限公司 Exchange method and system towards intelligent robot
CN106228978A (en) * 2016-08-04 2016-12-14 成都佳荣科技有限公司 A kind of audio recognition method
CN106239511A (en) * 2016-08-26 2016-12-21 广州小瓦智能科技有限公司 A kind of robot based on head movement moves control mode
CN106250400A (en) * 2016-07-19 2016-12-21 腾讯科技(深圳)有限公司 A kind of audio data processing method, device and system
US9552056B1 (en) * 2011-08-27 2017-01-24 Fellow Robots, Inc. Gesture enabled telepresence robot and system

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US9552056B1 (en) * 2011-08-27 2017-01-24 Fellow Robots, Inc. Gesture enabled telepresence robot and system
CN105320872A (en) * 2015-11-05 2016-02-10 上海聚虹光电科技有限公司 Robot operation authorization setting method based on iris identification
CN105468145A (en) * 2015-11-18 2016-04-06 北京航空航天大学 Robot man-machine interaction method and device based on gesture and voice recognition
CN105912128A (en) * 2016-04-29 2016-08-31 北京光年无限科技有限公司 Smart robot-oriented multimodal interactive data processing method and apparatus
CN106055105A (en) * 2016-06-02 2016-10-26 上海慧模智能科技有限公司 Robot and man-machine interactive system
CN106200962A (en) * 2016-07-08 2016-12-07 北京光年无限科技有限公司 Exchange method and system towards intelligent robot
CN106250400A (en) * 2016-07-19 2016-12-21 腾讯科技(深圳)有限公司 A kind of audio data processing method, device and system
CN106228978A (en) * 2016-08-04 2016-12-14 成都佳荣科技有限公司 A kind of audio recognition method
CN106239511A (en) * 2016-08-26 2016-12-21 广州小瓦智能科技有限公司 A kind of robot based on head movement moves control mode

Non-Patent Citations (1)

Title
XIONG Genliang et al.: "Research and Development Status of Physical Human–Robot Interaction", Optics and Precision Engineering *

Cited By (31)

Publication number Priority date Publication date Assignee Title
CN109165057A (en) * 2017-06-28 2019-01-08 华为技术有限公司 A kind of method and apparatus that smart machine executes task
CN107221332A (en) * 2017-06-28 2017-09-29 上海与德通讯技术有限公司 The exchange method and system of robot
CN109165057B (en) * 2017-06-28 2021-03-30 华为技术有限公司 Method and device for executing task by intelligent equipment
CN107463626A (en) * 2017-07-07 2017-12-12 深圳市科迈爱康科技有限公司 A kind of voice-control educational method, mobile terminal, system and storage medium
CN107908429B (en) * 2017-08-10 2021-07-23 广州真诺电子科技有限公司 Human-computer interaction and programming system applied to robot software engineer
CN107908429A (en) * 2017-08-10 2018-04-13 广州真诺电子科技有限公司 Human-computer interaction and programing system applied to robot software engineer
CN107807734B (en) * 2017-09-27 2021-06-15 北京光年无限科技有限公司 Interactive output method and system for intelligent robot
CN107807734A (en) * 2017-09-27 2018-03-16 北京光年无限科技有限公司 A kind of interaction output intent and system for intelligent robot
CN107770271A (en) * 2017-10-20 2018-03-06 南方电网科学研究院有限责任公司 Cluster robot cloud control method, device and system
CN108124008A (en) * 2017-12-20 2018-06-05 山东大学 A kind of old man under intelligent space environment accompanies and attends to system and method
CN108780361A (en) * 2018-02-05 2018-11-09 深圳前海达闼云端智能科技有限公司 Human-computer interaction method and device, robot and computer readable storage medium
WO2019148491A1 (en) * 2018-02-05 2019-08-08 深圳前海达闼云端智能科技有限公司 Human-computer interaction method and device, robot, and computer readable storage medium
CN108491790A (en) * 2018-03-20 2018-09-04 上海乐愚智能科技有限公司 A kind of determination method, apparatus, storage medium and the robot of object
CN110415688B (en) * 2018-04-26 2022-02-08 杭州萤石软件有限公司 Information interaction method and robot
CN110415688A (en) * 2018-04-26 2019-11-05 杭州萤石软件有限公司 Information interaction method and robot
CN109189363A (en) * 2018-07-24 2019-01-11 上海常仁信息科技有限公司 A kind of robot of human-computer interaction
CN111630475B (en) * 2018-10-19 2024-02-27 深圳配天机器人技术有限公司 Method for controlling robot, server, storage medium and cloud service platform
WO2020077631A1 (en) * 2018-10-19 2020-04-23 深圳配天智能技术研究院有限公司 Method for controlling robot, server, storage medium and cloud service platform
CN111630475A (en) * 2018-10-19 2020-09-04 深圳配天智能技术研究院有限公司 Method for controlling robot, server, storage medium and cloud service platform
CN109166386A (en) * 2018-10-25 2019-01-08 重庆鲁班机器人技术研究院有限公司 Children's logical thinking supplemental training method, apparatus and robot
CN110135317A (en) * 2019-05-08 2019-08-16 深圳达实智能股份有限公司 Behavior monitoring and management system and method based on cooperated computing system
CN110497404A (en) * 2019-08-12 2019-11-26 安徽云探索网络科技有限公司 A kind of robot bionic formula intelligent decision system
CN110505309A (en) * 2019-08-30 2019-11-26 苏州博众机器人有限公司 Network communication method, device, equipment and storage medium
CN110505309B (en) * 2019-08-30 2022-02-25 苏州博众机器人有限公司 Network communication method, device, equipment and storage medium
CN110695989A (en) * 2019-09-20 2020-01-17 浙江树人学院(浙江树人大学) Audio-visual interaction system for intelligent robot and interaction control method thereof
CN111079116B (en) * 2019-12-29 2020-11-24 钟艳平 Identity recognition method and device based on simulation cockpit and computer equipment
CN111079116A (en) * 2019-12-29 2020-04-28 钟艳平 Identity recognition method and device based on simulation cockpit and computer equipment
TWI759039B (en) * 2020-03-31 2022-03-21 大陸商北京市商湯科技開發有限公司 Methdos and apparatuses for driving interaction object, devices and storage media
CN111860231A (en) * 2020-07-03 2020-10-30 厦门欧准卫浴有限公司 Universal water module based on household occasions
CN112017030A (en) * 2020-09-01 2020-12-01 中国银行股份有限公司 Intelligent service method and device for bank outlets
CN113625662A (en) * 2021-07-30 2021-11-09 广州玺明机械科技有限公司 Rhythm dynamic control system for data acquisition and transmission of beverage shaking robot

Also Published As

Publication number Publication date
CN106873773B (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN106873773A (en) Robot interactive control method, server and robot
US8126832B2 (en) Artificial intelligence system
JP5866728B2 (en) Knowledge information processing server system with image recognition system
US11511436B2 (en) Robot control method and companion robot
CN107272607A (en) A kind of intelligent home control system and method
CN110168530A (en) Electronic equipment and the method for operating the electronic equipment
CN106773766A (en) Smart home house keeper central control system and its control method with learning functionality
US8103382B2 (en) Method and system for sharing information through a mobile multimedia platform
CN106202165A (en) The intellectual learning method and device of man-machine interaction
CN106791565A (en) Robot video calling control method, device and terminal
WO2018033066A1 (en) Robot control method and companion robot
KR102309682B1 (en) Method and platform for providing ai entities being evolved through reinforcement machine learning
CN108476258A (en) Method and electronic equipment for electronic equipment control object
CN107942695A (en) emotion intelligent sound system
CN107666536A (en) A kind of method and apparatus for finding terminal, a kind of device for being used to find terminal
JP2010224715A (en) Image display system, digital photo-frame, information processing system, program, and information storage medium
CN110209777A (en) The method and electronic equipment of question and answer
CN111491123A (en) Video background processing method and device and electronic equipment
KR20190076870A (en) Device and method for recommeding contact information
CN109074329A (en) Information processing equipment, information processing method and program
CN105388786B (en) A kind of intelligent marionette idol control method
CN106325113A (en) Robot control engine and system
CN104584004B (en) Keep search context
CN106777066B (en) Method and device for image recognition and media file matching
CN117573822A (en) Man-machine interaction method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant