CN108732943A - Expression robot man-machine interaction method - Google Patents
Expression robot man-machine interaction method
- Publication number
- CN108732943A (application CN201710254658.9A)
- Authority
- CN
- China
- Prior art keywords
- expression
- rule
- module
- robot
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
- B25J11/0015—Face robots, animated artificial faces for imitating human expressions
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Feedback Control In General (AREA)
Abstract
The invention discloses an expression-robot human-machine interaction method. The expression robot collects information through various sensors and saves the collected data into dedicated memory addresses. A timer is then used to retrieve the data from each sensing module; after the information is analyzed and fused, it is compared against the system knowledge base and a preset rule base to determine the robot's output task. A control module then sets the relevant servo parameters to produce the robot's facial expression.
Description
Technical field
The invention belongs to the field of human-computer interaction technology, and in particular relates to a robot-based interaction method.
Background technology
A robot is an automated device that performs work. It can accept human commands, run pre-programmed routines, and act according to principles formulated with artificial-intelligence techniques. Its task is to assist or replace human labor, for example in manufacturing, construction, or hazardous work. As technology improves, robots become increasingly intelligent and are stepping into people's daily work and life.
In teleoperation systems, humans interact with a remote robot and its environment through a human-machine interface and a communication network to manipulate the remote robot. On the one hand, operator intervention compensates for the robot's limited recognition and decision-making abilities under the constraints of current artificial-intelligence and sensing technology; on the other hand, the robot's advantages in execution precision, complex task allocation, and path planning are exploited. Combining human intelligence with the robot's precision lightens the operator's workload while ensuring that the remote robot autonomously performs high-precision operations. In a human-machine interaction system, achieving a smooth, fluent interaction in which human and robot mutually adapt, coordinate, and each contribute their intelligence has always been regarded by researchers as a main goal of the human-computer interaction field.
For example, patent application 201680001753.1 discloses a method for generating robot interaction content, comprising: obtaining the user's expression information; obtaining the user's textual emotion information; determining the user's intent from the expression and textual emotion information; and generating robot interaction content from the expression information, textual emotion information, and user intent in combination with the robot's current life-time axis. By adding the robot's life-time axis to interaction-content generation, that invention makes the robot behave more like a person during interaction, giving it a human-like lifestyle; the method improves the anthropomorphism of generated interaction content, enhances the interaction experience, and increases intelligence.
However, the above method requires multiple devices or tools to cooperate before robot positioning can be achieved; the positioning is complex and error-prone, and it is difficult to make it accurate and reliable.
Invention content
In view of the above problems, an object of the present invention is to provide an expression-robot human-machine interaction method that comprehensively considers factors such as the functional requirements of the interaction software, the system framework, and the human-machine interface, establishes a series of user-based interaction rules, and finally achieves harmonious interaction between a person and a humanoid expression robot.
Another object of the present invention is to provide an expression-robot human-machine interaction method in which the robot itself acquires features of the surrounding environment and performs rule-based judgment to interact, without relying on external devices or tools, making it easy to implement.
To achieve the above objects, the technical scheme of the invention is:
An expression-robot human-machine interaction method, characterized in that the expression robot collects information through various sensors and saves the collected data into dedicated memory addresses; a timer is then used to retrieve the data from each sensing module; after the information is analyzed and fused, it is compared against the system knowledge base and a preset rule base to determine the robot's output task; and a control module sets the relevant servo parameters to complete the robot's facial expression.
Further, the method first establishes a database comprising a knowledge base and a rule base. The knowledge base stores the user's expression data and odor data, stored by category; the rule base stores preset rules that drive the expression robot to produce the corresponding expressions.
Further, the information collected by the sensors includes expression signals and gas types. The expression signal is the user's expression captured by a sensor, stored by the expression module, and then transferred to the control module. The gas type is obtained by transferring odor information collected by a sensor to the smell module, which runs the corresponding recognition algorithm to identify and confirm the gas type.
Further, the control flow of the smell module is:
101, initialize the smell module;
102, set the sampling interval (system default 0.5 s; it can also be set according to actual conditions);
103, check whether sampled data is available; if so, proceed to the next step, otherwise repeat this step;
104, read the sampled data, update it in real time, and display it;
105, check whether the sampled data exceeds the threshold; if it does, treat it as an error and return to step 103 to re-acquire data, otherwise proceed;
106, perform gas recognition to determine which gas is present;
107, set a flag and retain the gas recognition result.
Further, the control module mainly comprises two subsystems: perception-information processing and motion control. The perception-information processing subsystem mainly comprises a visual recognition module connected to the expression module and an odor recognition module connected to the smell module; within this subsystem the perception information of each sensing channel is analyzed, organized, and fused to obtain user information. The motion-control subsystem is mainly responsible for controlling the robot's expression output; since expressions are realized by motor drive, the main task of motion control is to control the multi-channel servo combination through a motion-control card.
Further, the data in the robot's knowledge base are organized using a production-rule representation, each rule being described as:
R(i): If RLS then RRS (i = 1, 2, ..., n)
where R(i) denotes the i-th rule in the rule system; RLS (Rule Left Side) is the condition part of the i-th rule, comprising arithmetic or truth-valued items that may be an AND/OR logical combination of multiple condition clauses; and RRS (Rule Right Side) is the action part of the i-th rule, referring to the robot action to execute.
Further, the method first determines the expression-recognition result and the odor-type information, and constructs different rule instructions, i.e. the execution commands output to the robot, from the different combinations of the two recognition results.
Further, the execution order of the interaction rules is controlled: a suitable control strategy is chosen based on the relevant interaction knowledge, the current facts are tested and matched against rule premises, new facts are inferred, and the rule bodies are interpreted and executed. The reasoning flow is:
201, obtain the perception recognition result parameters;
202, match the goal condition against the data in the database;
203, check whether the match succeeds; if so, proceed to the next step, otherwise return to step 202 and match again;
204, take a rule from the rule base and match its premise against the data in the current database;
205, check whether the match succeeds; if so, execute the next step, otherwise return to step 204 and continue matching;
206, execute the action content specified by the rule.
Further, the method uses dynamic link libraries (.dll) to call data and execute programs; a dynamic link library contains a large number of code segments and data and enables resource sharing among multiple programs.
Further, the called files of the method include header files (.h), static link libraries (.lib), and dynamic link libraries (.dll); the files of each module are as follows:
(1) visual recognition module: DSStream.h, DSStream.lib, DSStream.dll
(2) odor recognition module: Senser.h, Senser.lib, Senser.dll
(3) serial communication module: Mscomm.h, Mscomm.lib, Mscomm.dll.
By collecting and analyzing environmental features, and by considering factors such as the functional requirements of the interaction software, the system framework, and the human-machine interface, the present invention establishes a series of user-based interaction rules and finally achieves harmonious interaction between a person and a humanoid expression robot, bringing more services and convenience and greatly improving people's quality of life.
Description of the drawings
Fig. 1 is the control flow chart of the smell module implemented by the present invention.
Fig. 2 is the correspondence/constraint diagram between the first part of the interaction-rule conditions and rule conclusions.
Fig. 3 is the correspondence/constraint diagram between the second part of the interaction-rule conditions and rule conclusions.
Fig. 4 is the reasoning flow chart selected by the human-machine interaction system of the present invention.
Fig. 5 is the control flow chart of the control module implemented by the present invention.
Specific implementation mode
To make the purpose, technical scheme, and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only illustrate the invention and do not limit it.
The human-machine interaction method of the present invention mainly realizes three basic functions: environment perception, interface communication, and control-algorithm execution. To realize these system functions effectively, the interaction system is designed in modules. Modular design means that, instead of typing machine code or computer instructions one by one, the structure and main flow of the interaction software — the main program, subprograms, and subprocesses — are laid out as frames in advance, and the input/output relations between the frames are defined and debugged. Stepwise refinement yields a series of program descriptions in units of function blocks; this process is called modularization. Compared with conventional programming, modular design has significant advantages: it raises the reuse rate of system source code and saves memory, and, following the design principle of weak coupling and strong cohesion, it reduces the error rate of compilation and execution and improves system reliability. In short, modular design reduces program complexity and eases writing, modification, and later optimization.
Environment perception generally includes the acquisition of expression signals and of surrounding odor signals, which involves two designs:
1. Expression module design
The main function of the expression module is to capture and store the user's expression data with a sensor, which is used to judge the user's state.
2. Smell module design
The tasks of the smell module mainly cover two aspects: gas-type recognition and real-time curve display on the host computer. Before gas recognition, the sampling interval (system default 0.5 s) is set in the gas-recognition child window of the main interaction interface. After gas sampling (performed by the sensor), the serial data are read and displayed in real time, and gas recognition is finally completed with the corresponding recognition algorithm. A CDataShow class is designed so that the curve of collected data scrolls from left to right. The main flow is shown in Fig. 1.
101, initialize the smell module;
102, set the sampling interval (system default 0.5 s; it can also be set according to actual conditions);
103, check whether sampled data is available; if so, proceed to the next step, otherwise repeat this step;
104, read the sampled data, update it in real time, and display it;
105, check whether the sampled data exceeds the threshold; if it does, treat it as an error and return to step 103 to re-acquire data, otherwise proceed;
106, perform gas recognition to determine which gas is present;
107, set a flag and retain the gas recognition result.
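The smell-module flow above (steps 101-107) can be sketched as a polling loop. The patent's implementation is VC++ driving real sensor hardware; the Python sketch below is illustrative only, and the threshold value, the `read_sample` callable, and the toy `classify_gas` rule are assumptions, not the patent's actual algorithm.

```python
import time

THRESHOLD = 1000  # assumed maximum valid sensor reading (step 105)

def classify_gas(sample):
    """Toy stand-in for the recognition algorithm of step 106."""
    if sample > 600:
        return "methane"
    if sample > 300:
        return "alcohol"
    return "cigarette"

def smell_module(read_sample, interval=0.5, max_polls=100):
    """Poll the sensor and return (flag, gas_type) per steps 101-107."""
    result = None                          # 101: initialise
    for _ in range(max_polls):
        sample = read_sample()
        if sample is None:                 # 103: no sampled data yet
            time.sleep(interval)           # 102: sampling interval (default 0.5 s)
            continue
        if sample > THRESHOLD:             # 105: over threshold -> error, re-acquire
            continue
        result = classify_gas(sample)      # 106: gas recognition
        break
    return (result is not None, result)    # 107: set flag, keep result
```

A fake sensor can be supplied as any callable that returns successive readings, e.g. `smell_module(iter([None, 450]).__next__, interval=0)`.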
3. Control module design
The control module is the modularization of the control system; it is the indispensable link that governs the expression robot's hardware and coordinates target actions. The control module mainly comprises two subsystems: perception-information processing and motion control. The perception-information processing subsystem mainly comprises a visual recognition module connected to the expression module and an odor recognition module connected to the smell module; within this subsystem the perception information of each sensing channel is analyzed, organized, and fused to obtain user information. The motion-control subsystem is mainly responsible for controlling the robot's expression output; since expressions are realized by motor drive, the main task of motion control is to control the multi-channel servo combination through a motion-control card.
A robot does not have autonomous thinking like a human and cannot spontaneously make ideal responses to environmental stimuli. The premise of harmonious robot-human communication is therefore to help the robot increase its "amount of knowledge" so that it possesses a systematized set of information based on the general profile of interpersonal communication in daily life. A common implementation is to add a series of knowledge-based interaction rules to the interaction system: when the input supplied by the user satisfies a rule's premise, the robot outputs the corresponding conclusion behavior.
The present invention organizes the data in the robot's knowledge base using a production-rule representation. Each rule can be described as:
R(i): If RLS then RRS (i = 1, 2, ..., n)
where R(i) denotes the i-th rule in the rule system; RLS (Rule Left Side) is the condition part of the i-th rule, mainly comprising arithmetic or truth-valued items, and may be an AND/OR logical combination of multiple condition clauses; RRS (Rule Right Side) is the action part of the i-th rule, mainly referring to the robot action to execute, and may be a single operation or conclusion or a logical combination of several operations or conclusions.
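As a sketch (not the patent's VC++ implementation), the production-rule form R(i): If RLS then RRS can be encoded with the condition part RLS as a predicate over a fact dictionary, so that AND/OR combinations of condition clauses compose directly; the fact names and rule contents below are hypothetical examples.

```python
class Rule:
    """One production rule R(i): If RLS then RRS."""
    def __init__(self, name, rls, rrs):
        self.name = name  # e.g. "R(1)"
        self.rls = rls    # condition part: callable(facts) -> bool
        self.rrs = rrs    # action part: the robot command to execute

    def fires(self, facts):
        return self.rls(facts)

# RLS as an AND combination and as an OR combination of clauses,
# as the representation above allows:
r1 = Rule("R(1)",
          lambda f: f.get("gas") == "alcohol" and f.get("expression") == "smile",
          "smile")
r2 = Rule("R(2)",
          lambda f: f.get("gas") == "methane" or f.get("gas") == "cigarette",
          "surprise")

facts = {"gas": "alcohol", "expression": "smile"}
fired = [r.rrs for r in (r1, r2) if r.fires(facts)]  # -> ["smile"]
```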
Based on the content of the human-machine interaction, the antecedents of the interaction rules of the present invention mainly comprise the facial-expression recognition result and the odor-type information; different rule consequents — the execution commands output to the robot — are constructed from the different combinations of the two recognition results. For example, with only the gas recognition result as the stimulus signal: the robot likes the smell of alcohol and dislikes cigarette smoke, so when it smells cigarette smoke it shows an annoyed expression or rolls its eyes to show disgust; when it smells alcohol around the user, it responds with a smile; and when it "smells" methane in the surrounding environment — a danger signal indicating that flammable gas may be nearby — it responds with a surprised expression. With the expression recognition result and the gas recognition result together as the stimulus signal: if alcohol is detected around the user and the facial expression is recognized as a smile, the user is presumably in a good mood and drinking for pleasure, so the robot responds with a smile; but if the facial expression is recognized as sad or angry, the user may be in a bad mood and drowning sorrows in drink, so the robot returns a sad expression to show sympathy.
The correspondence/constraint relations between interaction-rule conditions and rule conclusions are briefly enumerated in Fig. 2 and Fig. 3. The first column is human-machine intimacy: at present the interaction system defaults to treating any detected facial-expression input as the owner; later, the robot can identify a specific person and set that person as the owner, treating other expression inputs as guests, which helps improve the autonomous intelligence of the interaction system. The second column is the gas recognition result, the third column is the facial-expression recognition result (this column is absent when no video signal is input), and the fourth column is the robot's response expression type.
Some embodiments of the interaction rules in the human-machine interaction software system of the present invention are given below, where x is the facial-expression recognition result and y is the gas recognition result:
If x = none and y = alcohol then ctrl_command = smile;
If x = none and y = cigarette then ctrl_command = disgust;
If x = none and y = methane then ctrl_command = surprise;
If x = smile and y = cigarette then ctrl_command = smile;
If x = anger or surprise and y = methane then ctrl_command = surprise;
If x = disgust or sadness and y = methane then ctrl_command = sad;
Given the many possible combinations of the condition parts, a large amount of initial data and interaction rules must be entered as the basis for matching interaction conditions. An interaction-system rule base must therefore be established, and the interaction rules must be systematically managed and planned to ensure the completeness and operational reliability of the interaction software system.
4. Establishment of the database
The database is the collection of descriptions of the stored data involved inside the interaction system; it converts input information from the hardware into internal system symbols that the host computer can understand, providing data support for establishing effective inference rules. The database contents mainly include the definitions of data names, data types, and lengths.
Under normal conditions, the knowledge base and rule base described above are contained in the database.
5. Control strategy of the reasoning
To guarantee the effective execution of the production interaction rules, the execution order of the interaction rules must be controlled: a suitable control strategy is chosen based on the relevant interaction knowledge, the current facts are tested and matched against rule premises, new facts are inferred, and the rule bodies are interpreted and executed, completing the premise-action interaction process. The human-machine interaction system of the present invention selects forward reasoning; the reasoning process is shown in Fig. 4.
201, obtain the perception recognition result parameters;
202, match the goal condition against the data in the database;
203, check whether the match succeeds; if so, proceed to the next step, otherwise return to step 202 and match again;
204, take a rule from the rule base and match its premise against the data in the current database;
205, check whether the match succeeds; if so, execute the next step, otherwise return to step 204 and continue matching;
206, execute the action content specified by the rule.
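A minimal sketch of the forward-reasoning flow (steps 201-206), assuming rules are stored as (premise, action) pairs and the database is a dictionary of recognition results; the rule contents are illustrative examples in the style of the text, not the patent's actual rule base, and the implementation language differs from the patent's VC++.

```python
def forward_chain(perception, rule_base):
    """Match perception results against the rule base (steps 201-206)."""
    database = dict(perception)            # 201-202: load recognition results
    for premise, action in rule_base:      # 204: take a rule from the rule base
        # 205: match the premise against the current database contents
        if all(database.get(k) == v for k, v in premise.items()):
            return action                  # 206: execute the rule's action
    return None                            # no rule matched

# Hypothetical rules: x = expression recognition, y = gas recognition.
RULES = [
    ({"x": "smile", "y": "alcohol"}, "smile"),
    ({"x": "sad", "y": "alcohol"}, "sad"),
    ({"y": "methane"}, "surprise"),
]

command = forward_chain({"x": "smile", "y": "alcohol"}, RULES)  # -> "smile"
```

Rules are scanned in order, so a more specific premise should precede a more general one (the methane rule, with no expression clause, acts as a fallback here).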
(5) Main control flow of the module
After the program starts, the human-machine interaction system is first initialized. The expression robot collects information through various sensors and saves the collected data into dedicated memory addresses; the control module uses the system timer to retrieve the data from each sensing module; after the information is analyzed and fused, it is compared against the system knowledge base and the preset rule base to determine the robot's output task; the motion-control subsystem then sets the relevant parameters to complete the robot's expression. The main control flow of the control module is shown in Fig. 5.
301, the system starts and is initialized;
302, obtain the recognition result and check whether the 50 ms timer has fired; if so, continue to the next step, otherwise keep looping;
303, compare with the previous recognition result and check for an update; if there is one, continue, otherwise return to step 302;
304, save this recognition result and clear the flag bit;
305, generate the output control data according to the output rules;
306, send the control command.
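The main control flow (steps 301-306) amounts to a change-detection loop: on each 50 ms timer event the new recognition result is compared with the last one, and a command is generated and sent only on an update. The sketch below models the timer ticks as an iterable; the `OUTPUT_RULE` mapping and function names are illustrative assumptions, since the patent's implementation is VC++ with a hardware timer and serial output.

```python
# Hypothetical output rule: gas recognition result -> expression command.
OUTPUT_RULE = {"alcohol": "smile", "cigarette": "disgust", "methane": "surprise"}

def control_loop(ticks, sent):
    """ticks: iterable of recognition results, one per 50 ms timer event.
    Appends a control command to `sent` only when the result changes."""
    last = None                                  # 301: initialise
    for result in ticks:                         # 302: 50 ms timer fires
        if result == last:                       # 303: no update -> keep looping
            continue
        last = result                            # 304: save this result, clear flag
        command = OUTPUT_RULE.get(result)        # 305: generate output control data
        if command is not None:
            sent.append(command)                 # 306: send the control command
    return sent
```

Suppressing repeats keeps the serial link from being flooded with identical commands every 50 ms while the sensed state is unchanged.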
The present invention mainly uses serial communication as the data-exchange means between the computer and peripherals, using the MSComm communication control of VC++. This control provides a large number of standard communication command interfaces, making it easy to develop serial-port programs.
The expression-robot human-machine interaction system comprises multiple modules with relatively independent functions; the modules exchange and fuse information with each other and jointly coordinate the robot's output tasks. A common source file that shares information among modules can therefore be called repeatedly. If every function module integrated all of its source code into the .exe executable, it would not only occupy a large amount of memory and waste system resources, but would also be error-prone during program linking and would increase system complexity; mutual calls between modules are therefore highly desirable.
The present invention uses dynamic link libraries (.dll) to call data and execute programs; a dynamic link library contains a large number of code segments and data for sharing resources among multiple programs. Using DLLs facilitates the modular realization of the interaction system. For example, the information-processing system can be divided into function modules such as vision and smell; the processing and analysis of collected data can be done per module, and the generated module .dll files are loaded into the main interaction program, realizing the whole interaction flow and improving the efficiency of the interaction system. In addition, the resource-sharing property of DLLs allows modifications and updates of the software system to be applied conveniently and efficiently in the relevant module without affecting the other parts of the system program. For example, if the video capture card is replaced, the video-acquisition code in the vision module must be modified and rewritten; the video-acquisition DLL confines this change to its own interior, avoiding regeneration of the entire program.
The called files of the present invention include header files (.h), static link libraries (.lib), and dynamic link libraries (.dll). The files of each module are as follows:
(1) visual recognition module: DSStream.h, DSStream.lib, DSStream.dll
(2) odor recognition module: Senser.h, Senser.lib, Senser.dll
(3) serial communication module: Mscomm.h, Mscomm.lib, Mscomm.dll.
Thus, by collecting and analyzing environmental features and considering factors such as the functional requirements of the interaction software, the system framework, and the human-machine interface, the present invention establishes a series of user-based interaction rules and finally achieves harmonious interaction between a person and a humanoid expression robot, bringing more services and convenience and greatly improving people's quality of life.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the invention; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the protection scope of the invention.
Claims (10)
1. An expression-robot human-machine interaction method, characterized in that the expression robot collects information through various sensors and saves the collected data into dedicated memory addresses; a timer is then used to retrieve the data from each sensing module; after the information is analyzed and fused, it is compared against the system knowledge base and a preset rule base to determine the robot's output task; and a control module sets the relevant servo parameters to complete the robot's facial expression.
2. The expression-robot human-machine interaction method of claim 1, characterized in that the method first establishes a database comprising a knowledge base and a rule base; the knowledge base stores the user's expression data and odor data, stored by category; the rule base stores preset rules that drive the expression robot to produce the corresponding expressions.
3. The expression-robot human-machine interaction method of claim 1, characterized in that the information collected by the sensors includes expression signals and gas types; the expression signal is the user's expression captured by a sensor, stored by the expression module, and then transferred to the control module; the gas type is obtained by transferring odor information collected by a sensor to the smell module, which runs the corresponding recognition algorithm to identify and confirm the gas type.
4. The expression-robot human-machine interaction method of claim 3, characterized in that the control flow of the smell module is:
101, initialize the smell module;
102, set the sampling interval (system default 0.5 s; it can also be set according to actual conditions);
103, check whether sampled data is available; if so, proceed to the next step, otherwise repeat this step;
104, read the sampled data, update it in real time, and display it;
105, check whether the sampled data exceeds the threshold; if it does, treat it as an error and return to step 103 to re-acquire data, otherwise proceed;
106, perform gas recognition to determine which gas is present;
107, set a flag and retain the gas recognition result.
5. The expression-robot human-machine interaction method of claim 1, characterized in that the control module mainly comprises two subsystems: perception-information processing and motion control; the perception-information processing subsystem mainly comprises a visual recognition module connected to the expression module and an odor recognition module connected to the smell module; within this subsystem the perception information of each sensing channel is analyzed, organized, and fused to obtain user information; the motion-control subsystem is mainly responsible for controlling the robot's expression output, and since expressions are realized by motor drive, the main task of motion control is to control the multi-channel servo combination through a motion-control card.
6. The expression robot human-machine interaction method according to claim 2, characterized in that the knowledge base uses production rules to organize the data in the robot's knowledge base, each production rule being described as:
R(i): If RLS then RRS (i = 1, 2, ..., n)
where Rule is the keyword; R(i) denotes the i-th rule in the rule system; RLS (Rule Left Side) is the condition part of the i-th rule, comprising arithmetic or factual condition items, and may be an and/or logical combination of multiple condition clauses; RRS (Rule Right Side) is the action part of the i-th rule, i.e., the action the robot executes.
7. The expression robot human-machine interaction method according to claim 6, characterized in that the method first determines the expression recognition result and the odor type information, then constructs different rule instructions according to the various combinations of the two recognition results, i.e., controls the robot to output and execute the instructions.
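Constructing instructions from combinations of the two recognition results amounts to a lookup keyed on (expression, odor) pairs. The table entries below are illustrative assumptions only.

```python
# Hypothetical rule table keyed by (expression result, odor type) combinations.
RULE_TABLE = {
    ("happy", "none"):  "smile",
    ("happy", "smoke"): "frown_and_alarm",
    ("sad",   "none"):  "comfort",
}

def instruction_for(expression, odor):
    # Combinations not covered by a rule fall back to a neutral instruction.
    return RULE_TABLE.get((expression, odor), "neutral")
```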
8. The expression robot human-machine interaction method according to claim 7, characterized in that the execution order of the interaction rules is controlled: a suitable control strategy is selected according to the relevant interaction knowledge, the facts of the current case are tested and matched against the rule premises, new information is thereby inferred, and the rule body is interpreted and executed. The reasoning flow is:
201, obtain the perception recognition result parameters;
202, match the goal condition against the data in the database;
203, judge whether the match succeeds; if so, proceed to the next step, otherwise return to step 202 and match again;
204, take a rule from the rule base and match its premise against the data in the current database;
205, judge whether the match succeeds; if so, execute the next step, otherwise return to step 204 and continue matching;
206, execute the action content specified by the rule.
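Steps 201 to 206 describe a simple forward-chaining match cycle. A minimal sketch, with facts modeled as strings in a set and rule premises as fact sets (all names assumed):

```python
def infer(goal, database, rule_base):
    """Simplified steps 201-206: check the goal against the database, then
    fire the first rule whose premise is satisfied by the known facts."""
    if goal not in database:           # steps 202-203: goal condition must match
        return None
    for premise, action in rule_base:  # steps 204-205: scan rules in order
        if premise <= database:        # premise is a subset of the known facts
            return action              # step 206: execute the rule body
    return None
```

A fuller engine would loop, adding each fired rule's conclusion back into the database so later rules can match the inferred facts.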
9. The expression robot human-machine interaction method according to claim 1, characterized in that the method uses a dynamic link library (.dll) to call data and execute programs; the dynamic link library (.dll) contains a large number of code segments and data, so that resources can be shared among multiple programs.
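The run-time loading mechanism the claim relies on can be demonstrated with `ctypes`. On Windows the same call would load a .dll such as the modules named in claim 10; the standard math library is used here only as a runnable stand-in on a POSIX system.

```python
import ctypes
import ctypes.util

# Locate and load a shared library at run time; on Windows this would be
# e.g. ctypes.CDLL("Senser.dll") for the patent's odor identification module.
name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(name)

# Declare the signature of an exported function before calling it.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
```

Because the library is resolved at load time rather than linked statically, several processes can map the same code segments, which is the resource sharing the claim refers to.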
10. The expression robot human-machine interaction method according to claim 9, characterized in that the called files include header files (.h), static link libraries (.lib), and dynamic link libraries (.dll), where the file formats of the modules are as follows:
(1) visual identification module: DSStream.h, DSStream.lib, DSStream.dll
(2) odor identification module: Senser.h, Senser.lib, Senser.dll
(3) serial communication module: Mscomm.h, Mscomm.lib, Mscomm.dll.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710254658.9A CN108732943A (en) | 2017-04-18 | 2017-04-18 | Expression robot man-machine interaction method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108732943A true CN108732943A (en) | 2018-11-02 |
Family
ID=63923973
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710254658.9A Pending CN108732943A (en) | 2017-04-18 | 2017-04-18 | Expression robot man-machine interaction method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108732943A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112497217A (en) * | 2020-12-02 | 2021-03-16 | 深圳市香蕉智能科技有限公司 | Robot interaction method and device, terminal equipment and readable storage medium |
WO2021120684A1 (en) * | 2019-12-16 | 2021-06-24 | 苏宁云计算有限公司 | Human-computer interaction device and method for intelligent apparatus |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101458778A (en) * | 2008-12-26 | 2009-06-17 | 哈尔滨工业大学 | Artificial head robot with facial expression and multiple perceptional functions |
CN101618280A (en) * | 2009-06-30 | 2010-01-06 | 哈尔滨工业大学 | Humanoid-head robot device with human-computer interaction function and behavior control method thereof |
CN106096716A (en) * | 2016-06-01 | 2016-11-09 | 安徽声讯信息技术有限公司 | A kind of facial expression robot multi-channel information emotional expression mapping method |
Non-Patent Citations (1)
Title |
---|
Liu Xiaona: "Research on Human-Computer Interaction Application of Expression Robots", China Master's Theses Full-text Database, Information Science and Technology Series * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20181102 |