CN101618280A - Humanoid-head robot device with human-computer interaction function and behavior control method thereof - Google Patents


Info

Publication number
CN101618280A
Authority
CN
China
Prior art keywords
robot
humanoid
emotion
human
facial
Prior art date
Legal status
Granted
Application number
CN200910072405A
Other languages
Chinese (zh)
Other versions
CN101618280B (en)
Inventor
吴伟国 (Wu Weiguo)
孟庆梅 (Meng Qingmei)
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN2009100724055A priority Critical patent/CN101618280B/en
Publication of CN101618280A publication Critical patent/CN101618280A/en
Application granted granted Critical
Publication of CN101618280B publication Critical patent/CN101618280B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention relates to a humanoid-head robot and a behavior control method thereof, in particular to a humanoid-head robot device with a human-computer interaction function and its behavior control method. It solves the problems that existing humanoid-head robots cannot fully reproduce human facial expressions, have limited perceptual functions, and lack an artificial emotion model and a human-computer interaction function. The behavior control method comprises the following steps: a sensor perception system outputs the perceived information to a main control computer for processing; the control system software in the robot behavior control system obtains the control quantities of the corresponding motors according to the artificial emotion model and executes motion control instructions, whereby the motion control card outputs PWM pulses that drive the corresponding motors to the appointed positions, realizing the human-computer interaction function and the various emotional reactions of the robot; and the sensor perception system perceives external emotion signals, recognizes the corresponding emotion signals, and uses the artificial emotion model to realize the behavior control of the robot. The invention reproduces human facial expressions and provides humanlike multi-modal perception, such as smell, touch, and vision.

Description

Humanoid-head robot device with human-computer interaction function and behavior control method thereof
Technical field
The present invention relates to a humanoid-head robot device and a behavior control method thereof, and belongs to the field of robot applications.
Background art
Research on humanoid robots began in the 1960s and, after more than fifty years of development, has become one of the main research directions in the field of robotics. It integrates disciplines such as mechanics, electronics, computing, materials, sensing, and control technology, and represents a country's level of high-technology development. "Humanoid" means that the robot has humanlike perception, decision-making, behavior, and interaction capabilities. The humanoid-head robot is an important direction for realizing human-machine emotional interaction within humanoid-robot research. Emotion can improve the convenience and credibility of a robot, and can simultaneously provide the user with feedback on the robot's internal state, goals, and intentions. In human-computer interaction, a machine designed with emotion establishes a friendly interactive interface with people, giving it the ability to participate in social affairs and communicate with humans, and making it more readily accepted by them. Endowing the machine with "life" gives the thinking (or behavior) of its agent a clear purpose and directionality, thereby markedly improving the efficiency and speed of that thinking (or behavior). In dynamic, unpredictable, and potentially "dangerous" environments, creativity in the machine's mind and awareness in its behavior improve its ability to adapt to the environment.
At present, some humanoid-head robots under development have no multi-modal perception, and their realization of basic facial expressions is limited to a single expression. A literature search finds Chinese patent publication No. CN 201088839, application No. 200720189947.7, entitled "Robot laughing expression muscle actuating mechanism". This mechanism comprises a head frame and a rubber skin, with the rubber skin attached to the head frame. Its advantage is that a simple mechanism can realize various laughing expressions; its shortcoming is that it cannot realize other facial expressions. Chinese patent publication No. CN 101020315A, application No. 200710038295.1, is entitled "Head system of anthropomorphic robot". This system comprises a six-degree-of-freedom serial mechanism and a scalable processor network with a DSP (F2812) as the main control node. The six-degree-of-freedom serial mechanism body is driven by six servos and is used to simulate the motion of the eyes, neck, and chin. The scalable processor network consists of a video processor, a speech processor, interface modules, and the DSP (F2812) control circuit, and can meet the motion-control and computation requirements of the human-computer interaction process. However, this robot head system has no elastic facial skin, so it cannot reproduce human facial expressions, and it lacks humanlike multi-modal perception such as smell, touch, and vision. Moreover, the robots in the above patents have neither an artificial emotion model nor a human-computer interaction function.
Summary of the invention
In view of the above state of the art, the purpose of the present invention is to provide a humanoid-head robot device with a human-computer interaction function and a behavior control method, so as to solve the problems that existing humanoid-head robots cannot fully reproduce human facial expressions, have limited perceptual functions, and lack an artificial emotion model and a human-computer interaction function.
The technical scheme adopted by the present invention to solve the above technical problems is as follows:
The humanoid-head robot device with a human-computer interaction function of the present invention is composed of three parts: a humanoid-head robot body, a robot behavior control system, and a sensor perception system. The humanoid-head robot body comprises an eye movement unit, an upper-and-lower-jaw movement unit, an artificial lung device, a facial expression and lip-shape driving mechanism, a front support plate, a rear support plate, a frame, a facial housing, and an elastic facial skin. The eye movement unit is composed of two eyeballs, an eyeball transmission mechanism, two eyeball servomotors, two eyelids, an eyelid transmission mechanism, two eyelid servomotors, and a servomotor. The upper-and-lower-jaw movement unit is composed of an upper jaw, a lower jaw, a motor, and a rotating shaft. The artificial lung device is composed of a flexible pipe, a cylinder, a piston, a nut, a gas-drive guide shaft, a gas-drive lead screw, a drive motor, and an air inlet pipe. The facial expression and humanoid lip-shape driving mechanism is composed of a first, second, third, fourth, fifth, and sixth driving servomotor and several facial-skin driving rope-pulley mechanisms. The robot behavior control system comprises control system hardware and control system software; the control system hardware comprises a main control computer and a motion control card, and the control system software implements the behavior control method. The sensor perception system comprises two miniature CCD sensors, a speech-recognition single-chip microcomputer, and an olfactory sensor. The front support plate and the rear support plate are arranged in parallel and fixed together with the frame to form the head skeleton of the humanoid-head robot, and the eye movement unit and the upper-and-lower-jaw movement unit are installed in the frame from top to bottom. A miniature CCD sensor is embedded in each eyeball to form the binocular vision of the robot; each eyelid is arranged above the corresponding eyeball; the two eyeball servomotors drive the two eyeballs to rotate left and right through the eyeball transmission mechanism; the two eyelid servomotors drive the two eyelids through the eyelid transmission mechanism; and the servomotor drives the two eyeballs to move up and down together. The upper jaw is arranged above the lower jaw, and the motor drives the rotating shaft to move the lower jaw. The olfactory sensor is installed in the frame between the eye movement unit and the upper-and-lower-jaw movement unit; one end of the flexible pipe is connected with the olfactory sensor and the other end with the cylinder; the motor is rotationally connected with the gas-drive lead screw, on which the nut is installed; the nut moves along the gas-drive guide shaft and drives the piston connected to it, realizing the olfactory function of the robot; one end of the air inlet pipe is connected with the cylinder. The first to sixth driving servomotors are installed on the head skeleton formed by the front support plate, rear support plate, and frame, and are connected through the facial-skin driving rope-pulley mechanisms to the corresponding control points of the elastic facial skin. The facial housing and the elastic facial skin conform to the contour of a human face, and are installed, from inside to outside, on the front of the eye movement unit and the upper-and-lower-jaw movement unit to form the outer contour of the humanoid-head robot device. The motion control card is installed on the rear support plate, and the speech-recognition single-chip microcomputer is installed on the upper end of the front support plate. The sensor perception system outputs the perceived information to the main control computer for processing; the control system software in the robot behavior control system obtains the control quantities of the corresponding motors according to the artificial emotion model, and motion control instructions cause the motion control card (motor controller) to output PWM pulses that drive the corresponding motors to the appointed positions, thereby realizing the human-computer interaction function and the various emotional reactions of the robot.
The behavior control method of the above humanoid-head robot device with a human-computer interaction function is realized according to the following steps:
Step 1: perceive external emotion signals through the sensor perception system (CCD sensors and microphone), and analyze, extract features from, and recognize the perceived emotion signals;
Step 2: pass the recognized emotion signals to the artificial emotion model. The artificial emotion model mainly comprises three parts: emotion generation, emotion interaction, and emotional expression of the robot. Emotion generation mainly comprises four modules: stimulus-signal collection, emotion definition, emotion drive, and emotion conversion. Two thresholds α and β are defined in the emotion drive: α is the activation threshold of an emotion, and β is its saturation threshold. The conversion of an affective state must consider three aspects: external factors, internal factors, and the previous affective state;
On the basis of clearly identifying each state of the artificial emotion model and fully determining the transition relations and trigger events between states, state transitions between emotions are realized with an extended finite state machine: a variable-attribute set is added to the state definition, yielding an extended finite state machine (EFSM). After the emotion interaction model is analyzed with the EFSM, the variable composition of each function in the artificial emotion model can be determined exactly and variable-definition conflicts can be effectively avoided, providing the basis for the next step, robot behavior control;
Step 3: according to the artificial emotion model, calculate the control quantities (corresponding rotation angles) of the respective drive motors and drive the facial expression and humanoid lip shape of the robot, thereby obtaining the behavior through which the robot expresses its emotion. A minimal sketch of this three-step loop is given below.
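To make the control flow concrete, the following minimal Python sketch walks through the perceive → emotion model → motor command loop of steps 1-3. All function names, the signal encoding, and the angle values are illustrative assumptions, not details taken from the patent.

```python
# A minimal sketch of the three-step behavior-control loop (names and values assumed).
def perceive() -> dict:
    """Step 1 stand-in: CCD + microphone feature extraction and recognition."""
    return {"expression": "happy", "speech": "hello"}

def update_emotion(signal: dict, state: dict) -> dict:
    """Step 2 stand-in: update the affective state from the recognized signal."""
    if signal["expression"] == "happy" or signal["speech"] == "hello":
        state["emotion"] = "happy"
    return state

def motor_commands(state: dict) -> dict:
    """Step 3 stand-in: map the affective state to drive-motor angles (degrees)."""
    table = {"happy": {35: 20.0, 34: 15.0}, "neutral": {35: 0.0, 34: 0.0}}
    return table.get(state["emotion"], table["neutral"])

state = {"emotion": "neutral"}
state = update_emotion(perceive(), state)
print(motor_commands(state))  # -> {35: 20.0, 34: 15.0}
```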
The present invention has the following beneficial technical effects. It reproduces human facial expressions and also provides humanlike multi-modal perception such as smell, touch, and vision. Through the elastic facial skin, the invention realizes the basic facial expressions and a dynamic lip shape for the robot. A behavior control method based on an emotion model is adopted to realize behavior control of the robot: emotional stimuli from the outside world are perceived by the sensor perception system of the robot device, computed and recognized through the artificial emotion model, and the control system realizes the robot's various behavioral reactions, giving the robot a human-computer interaction function. The most distinctive feature of the invention is that the humanoid-head robot device is designed at 1:1 scale with an adult human head and has a compact structure.
Description of drawings
Fig. 1a is a perspective view of the robot of the present invention, and Fig. 1b is a perspective view of the robot with the facial housing, elastic facial skin, etc. omitted for ease of illustration; Fig. 2a is a perspective view of the upper-and-lower-jaw movement unit of the robot, and Fig. 2b is another perspective view of the same unit; Fig. 3 is a perspective view of the eyeball movement unit of the robot; Fig. 4a is a schematic diagram of the positions of the skin feature points defined on the facial skin of the robot, and Fig. 4b is a schematic diagram of the formation principle of the robot's facial expressions and lip shapes; Fig. 5 is a schematic diagram of the human-machine interaction principle of the robot; Fig. 6 is a block diagram of the behavior-control hardware of the robot; Fig. 7a is a flow chart of the behavior-control software of the robot, Fig. 7b-1 is the speech signal of the robot when expressing "Nice to see you", and Fig. 7b-2 is the rotation angle of the lower-jaw drive motor when expressing "Nice to see you"; Figs. 8a-8c are schematic diagrams of the fuzzy-neural-network method for facial expression recognition by the robot; Fig. 9 is a schematic diagram of the speech recognition method of the robot; Fig. 10 is a structural diagram of the artificial emotion model of the robot; Fig. 11 is a schematic diagram of the finite-state-machine-based emotion interaction model of the robot; Fig. 12a is a group of pictures of some basic lip shapes the robot can express, Fig. 12b is a group of expression pictures at successive time points while the robot expresses "Nice to see you", and Fig. 12c is a group of photographs from the human-machine interaction experiments with the robot.
Specific embodiments
Specific embodiment one: as shown in Fig. 1a, Fig. 1b, Fig. 2a, Fig. 2b, Fig. 3, Fig. 4a, Fig. 4b, and Fig. 6, the humanoid-head robot device with a human-computer interaction function of this embodiment is composed of three parts: a humanoid-head robot body, a robot behavior control system, and a sensor perception system. The humanoid-head robot body comprises an eye movement unit 1, an upper-and-lower-jaw movement unit 61, an artificial lung device 71, a facial expression and lip-shape driving mechanism 81, a front support plate 7, a rear support plate 6, a frame 51, a facial housing 17, and an elastic facial skin 18. The eye movement unit 1 is composed of two eyeballs 12, an eyeball transmission mechanism, two eyeball servomotors 14, two eyelids 13, an eyelid transmission mechanism, two eyelid servomotors 16, and a servomotor 29. The upper-and-lower-jaw movement unit 61 is composed of an upper jaw 8, a lower jaw 9, a motor 27, and a rotating shaft 28. The artificial lung device 71 is composed of a flexible pipe 19, a cylinder 20, a piston 21, a nut 22, a gas-drive guide shaft 23, a gas-drive lead screw 25, a drive motor 26, and an air inlet pipe 92. The facial expression and humanoid lip-shape driving mechanism 81 is composed of a first driving servomotor 35, a second driving servomotor 34, a third driving servomotor 33, a fourth driving servomotor 40, a fifth driving servomotor 41, a sixth driving servomotor 43, and several facial-skin driving rope-pulley mechanisms.
The robot behavior control system comprises control system hardware and control system software; the control system hardware comprises a main control computer 91 and a motion control card 5, and the control system software implements the behavior control method.
The sensor perception system comprises two miniature CCD sensors 3, a speech-recognition single-chip microcomputer 4, and an olfactory sensor 24.
The front support plate 7 and the rear support plate 6 are arranged in parallel and fixed together with the frame 51 to form the head skeleton of the humanoid-head robot, and the eye movement unit 1 and the upper-and-lower-jaw movement unit 61 are installed in the frame 51 from top to bottom. A miniature CCD sensor 3 is embedded in each eyeball 12 to form the binocular vision of the robot; each eyelid 13 is arranged above the corresponding eyeball 12; the two eyeball servomotors 14 drive the two eyeballs 12 to rotate left and right through the eyeball transmission mechanism; the two eyelid servomotors 16 drive the two eyelids 13 through the eyelid transmission mechanism; and the servomotor 29 drives the two eyeballs 12 to move up and down together. The upper jaw 8 is arranged above the lower jaw 9, and the motor 27 drives the rotating shaft 28 to move the lower jaw 9. The olfactory sensor 24 is installed in the frame 51 between the eye movement unit 1 and the upper-and-lower-jaw movement unit 61; one end of the flexible pipe 19 is connected with the olfactory sensor 24 and the other end with the cylinder 20; the motor 26 is rotationally connected with the gas-drive lead screw 25, on which the nut 22 is installed; the nut 22 moves along the gas-drive guide shaft 23 and drives the piston 21 connected to it, realizing the olfactory function of the robot; one end of the air inlet pipe 92 is connected with the cylinder 20. The first driving servomotor 35, second driving servomotor 34, third driving servomotor 33, fourth driving servomotor 40, fifth driving servomotor 41, and sixth driving servomotor 43 are installed on the head skeleton formed by the front support plate 7, rear support plate 6, and frame 51, and these six driving servomotors are connected through the facial-skin driving rope-pulley mechanisms to the corresponding control points of the elastic facial skin 18. The facial housing 17 and the elastic facial skin 18 conform to the contour of a human face, and are installed, from inside to outside, on the front of the eye movement unit 1 and the upper-and-lower-jaw movement unit 61 to form the outer contour of the humanoid-head robot device. The motion control card 5 is installed on the rear support plate 6, and the speech-recognition single-chip microcomputer 4 is installed on the upper end of the front support plate 7. The sensor perception system outputs the perceived information to the main control computer 91 for processing; the control system software in the robot behavior control system obtains the control quantities of the corresponding motors according to the artificial emotion model, and motion control instructions cause the motion control card 5 (motor controller) to output PWM pulses that drive the corresponding motors to the appointed positions, thereby realizing the human-computer interaction function and the various emotional reactions of the robot.
The humanoid-head robot device of this embodiment is 162 mm long, 156 mm wide, and 184 mm high, weighs 2.8 kg, and has 14 degrees of freedom. The humanlike motion of the facial organs is realized by the eye movement unit 1 and the upper-and-lower-jaw movement unit 61; the motor 27 drives the rotating shaft 28 to move the lower jaw 9, realizing the mouth motion of the robot. In the artificial lung device, the motor 26 drives the gas-drive lead screw 25 to rotate, the nut 22 moves along the gas-drive guide shaft 23 and thereby drives the piston 21 connected to it, realizing the olfactory function of the robot. Smells such as alcohol, cigarette smoke, and ammonia can be identified, and the artificial lung device can be laid out at an appropriate location in the chest of a humanoid robot. Fig. 3 shows the eye movement drive unit 1: eyeball motion has 2 degrees of freedom and eyelid motion has 1 degree of freedom. A miniature CCD sensor 3 is embedded in each eyeball 12 to constitute the binocular vision of the robot; the eyeballs can move at up to 500 deg/s and the eyelids at up to 900 deg/s. Considering the space constraints inside the head, the behavior control of the robot adopts eyeball servomotors 14, eyelid servomotors 16, and a servomotor 29 that are small, deliver relatively large output torque, and are easy to position-control; a synchronous belt drive is adopted as the motor transmission mechanism; and, considering the support of the motor drive shafts, a bearing seat 15 is designed to strengthen the rigidity of the mechanism. The eye movement mechanism of the robot is symmetrical. Each eyeball servomotor 14 drives an eyeball 12 to rotate left and right through a corresponding first rope pulley 31 and second rope pulley 11; each servomotor 16 drives the corresponding eyelid 13 through a corresponding third rope pulley 30 and fourth rope pulley 33; and the servomotor 29 drives the two eyeballs 12 to move up and down together.
Considering the communication between the servo control card and the main control computer, the reasonable allocation and effective use of the motion control card's hardware resources, and the hardware integration of the humanoid-head robot system, the SSC-32 motion control card is selected as motion control card 5; it can coordinate up to 32 servomotors. The motion-control hardware is the SSC-32 board 5 itself. This servo controller communicates with the PC over an RS232 serial port: the host software running on the PC transmits control instruction signals to the controller, and the controller outputs PWM signals of given duty cycles to control multiple servomotors individually or simultaneously. The control instructions of the card are simple, the control resolution is 0.09° (per 1 μs of pulse width), and both position control and speed control of the servos can be performed. A sketch of this serial command interface follows.
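As an illustration of this command interface, the sketch below sends one SSC-32 positioning command over a serial port with Python and the pyserial package. The command grammar `#<ch>P<pulse>T<time><cr>` follows the SSC-32 documentation; the port name, baud rate, channel, and pulse values are assumptions for the example, not values from the patent.

```python
# Sketch: commanding one servo channel on the SSC-32 over RS232 (pyserial assumed).
import serial

def move_servo(port: serial.Serial, channel: int, pulse_us: int, time_ms: int) -> None:
    """Send an SSC-32 positioning command of the form '#<ch>P<pulse>T<time>' + CR."""
    port.write(f"#{channel}P{pulse_us}T{time_ms}\r".encode("ascii"))

if __name__ == "__main__":
    # Port name and baud rate are hypothetical; adjust for the actual setup.
    with serial.Serial("COM1", 115200, timeout=1) as ssc32:
        move_servo(ssc32, channel=0, pulse_us=1500, time_ms=800)  # center channel 0
```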
The speech-recognition single-chip microcomputer 4 is an SPCE061A. The humanoid-head robot uses the main control computer combined with the SPCE061A single-chip microcomputer 4 as its auditory system. The SPCE061A, released by Sunplus, is a 16-bit microprocessor with 32 K words of embedded flash memory and a high processing speed, and it can conveniently complete speech recognition and speech-signal processing. The system communication circuits mainly comprise a 32-bit I/O communication circuit and a universal asynchronous serial interface (UART) circuit; this system realizes data transfer between the single-chip microcomputer and the main control computer through the UART circuit. The voice prompts used during speech recognition are realized by means of the API voice functions provided by Sunplus. The template-matching algorithm used in recognition is the Viterbi algorithm, a forward-search dynamic-programming algorithm that, given an observation sequence, finds the optimal state sequence of the model, i.e., selects the template with the maximal output probability as the recognition result. The computation required for template training is large and is completed on the main control computer platform: speech samples are collected by the SPCE061A system and sent to the main control computer over the RS232 communication module for storage, which guarantees the consistency of the speech features used for training and recognition and reduces errors caused by inconsistent hardware. A sketch of the Viterbi decoding step follows.
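The following is a compact sketch of the Viterbi decoding step described above: given an HMM's log-probabilities, it returns the best state sequence and its score, so that the template with the maximal output probability can be selected. The array shapes and any inputs are assumptions for illustration.

```python
# Sketch: Viterbi decoding over HMM log-probabilities (NumPy assumed).
import numpy as np

def viterbi(log_A: np.ndarray, log_B: np.ndarray, log_pi: np.ndarray):
    """log_A: NxN transition log-probs; log_B: TxN per-frame emission
    log-likelihoods; log_pi: N initial log-probs. Returns (score, path)."""
    T, N = log_B.shape
    delta = log_pi + log_B[0]            # best log-prob ending in each state
    psi = np.zeros((T, N), dtype=int)    # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_A  # scores[i, j]: come from i, land in j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_B[t]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):        # trace the backpointers
        path.append(int(psi[t, path[-1]]))
    return float(delta.max()), path[::-1]
```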
The olfactory sensor 24 is an FIS-series gas sensor.
Specific embodiment two: the sensor perception system of this embodiment further comprises a touch sensor arranged at the middle of the forehead. The other components and connections are identical to those of specific embodiment one.
Specific embodiment three: the sensor perception system of this embodiment further comprises two temperature sensors arranged on the left and right sides of the forehead, respectively. The other components and connections are identical to those of specific embodiment one or two.
Specific embodiment four: the miniature CCD sensor 3 performs video capture using DirectShow, the COM-based streaming-media processing toolkit that Microsoft released as the successor to ActiveMovie and Video for Windows. The image capture card is installed on the motherboard via PCI, and the vision of the humanoid-head robot is established on the Windows software development platform with the miniature CCD sensors 3 to complete recognition of the external environment, which here mainly means recognition of human facial expressions (a capture sketch follows the function list below). The main functions of the miniature CCD sensor 3 are:
(1) static and dynamic image acquisition, including parameter setting of the image capture card, reading in and storing bitmap images, multiplexed acquisition control, display, and switching;
(2) dynamic and static image analysis algorithms and the facial-expression-recognition algorithm. The other features are identical to those of specific embodiment one.
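As a rough stand-in for the DirectShow capture path, the sketch below grabs and stores one frame through OpenCV's DirectShow backend (`cv2.CAP_DSHOW`). The patent's own implementation programs the DirectShow SDK directly, so treat this as a substitute, with the device index and file name as assumptions.

```python
# Sketch: grab one frame for the expression-recognition pipeline (OpenCV assumed).
import cv2

cap = cv2.VideoCapture(0, cv2.CAP_DSHOW)  # open the first camera via DirectShow
if not cap.isOpened():
    raise RuntimeError("camera not available")
ok, frame = cap.read()                    # one BGR frame
if ok:
    cv2.imwrite("frame.bmp", frame)       # bitmap read-in/storage, as in function (1)
cap.release()
```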
Specific embodiment five: as shown in Fig. 1a, Fig. 1b, Fig. 4a, Fig. 4b, Fig. 12a, and Fig. 12b, this embodiment realizes the facial expressions and humanoid lip shapes by the following method:
Step 1: define skin feature points on the elastic facial skin 18: a, a-1, b, b-1, c, c-1, d, d-1, e, e-1, g; each of these points is a motion control point;
Step 2: set a slider at each motion control point; the slider is linked to the corresponding control point of the elastic facial skin 18, and each slider is slidably connected with a corresponding guide groove arranged on the facial housing 17, the direction of the guide groove determining the direction of the force applied at the corresponding motion control point; one end of the rope of a facial-skin driving rope-pulley mechanism is connected with the slider, and the other end is connected with the corresponding driving servomotor;
Step 3: realize the different basic facial expressions and robot lip shapes through combinations of, and displacement changes at, the expression control points.
To realize expression control, a skin-motion guiding bent plate 10 provided with guide grooves can also be arranged at the corresponding control points. To realize the facial expressions and lip shapes of the robot, skin feature points (a, a-1, b, b-1, c, c-1, d, d-1, e, e-1, g) are defined on the facial skin 18. The round dots in the figure represent the control points set on the robot skin, and the arrows show the directions of control-point travel. In the design of the robot's facial expressions, the basic facial expressions and lip shapes are realized by controlling the combination, displacement, and direction of motion of these control points, simulating the bidirectional movement of human muscles. In the actual device, the facial-skin driving rope groups are connected with the control points, and different basic facial expressions are realized through combinations of, and displacement changes at, the expression control points. Table 1 gives the assignment of each control point to its driving servomotor and facial-skin driving rope pulleys; a sketch of this mapping as a lookup follows the table. To use the space effectively, the motion of the control points is symmetrical (the fourth driving servomotor 40 and the fifth driving servomotor 41 are arranged symmetrically; 38 is arranged symmetrically with 38-1, and 39 with 39-1).
Table 1. Control points, driving servomotors, and facial-skin driving rope pulleys

| Skin control points | Driving servomotor | Facial-skin driving rope pulleys |
|---|---|---|
| a, a-1 | 35 | 2-1, 2-2 |
| b, b-1 | 34 | 37-1, 37-2 |
| c, c-1 | 33 | 36-1, 36-2 |
| d, d-1 | 40 | 38, 39 |
| e, e-1 | 41 | 38-1, 39-1 |
| g | 43 | 42 |
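The mapping of Table 1 can be read as a simple lookup from control points to driving servomotors, as in the sketch below; the displacement-to-angle gain is a hypothetical calibration constant, not a value from the patent.

```python
# Sketch: Table 1 as a lookup plus a linear cable model (gain assumed).
CONTROL_POINT_TO_SERVO = {
    ("a", "a-1"): 35, ("b", "b-1"): 34, ("c", "c-1"): 33,
    ("d", "d-1"): 40, ("e", "e-1"): 41, ("g",): 43,
}

def servo_angle(displacement_mm: float, gain_deg_per_mm: float = 4.0) -> float:
    """Convert a skin control-point displacement into a servomotor angle."""
    return gain_deg_per_mm * displacement_mm

print(CONTROL_POINT_TO_SERVO[("d", "d-1")], servo_angle(2.5))  # -> 40 10.0
```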
Specific embodiment six: as shown in Fig. 1a, Fig. 1b, Fig. 5, Fig. 6, Fig. 7a, Fig. 7b-1, Fig. 7b-2, Fig. 10, Fig. 11, Fig. 12a, Fig. 12b, and Fig. 12c, the behavior control method of the humanoid-head robot device with a human-computer interaction function of this embodiment is realized according to the following steps:
Step 1: perceive external emotion signals through the sensor perception system (CCD sensors and microphone), and analyze, extract features from, and recognize the perceived emotion signals. The emotion signals comprise basic human facial expressions and speech signals; basic human facial expressions are recognized with a fuzzy neural network structure, and speech signals are recognized with a CHMM speech-recognition model structure;
Step 2: pass the recognized emotion signals to the artificial emotion model. The artificial emotion model mainly comprises three parts: emotion generation, emotion interaction, and emotional expression of the robot. Emotion generation mainly comprises four modules: stimulus-signal collection, emotion definition, emotion drive, and emotion conversion. Two thresholds α and β are defined in the emotion drive: α is the activation threshold of an emotion, and β is its saturation threshold. The conversion of an affective state must consider three aspects: external factors, internal factors, and the previous affective state. In emotion generation, human perceptual functions are realized by sensors, which are used to experience external events; since vision and hearing can perceive most of the information in the external environment, the human perceptual functions are realized in the humanoid-head robot by a vision sensor and a hearing sensor. Emotional expression comprises facial-expression expression and speech expression.
Emotion interaction is established on the basis of a finite state machine: once each state of the emotion model is clearly identified and the transition relations and trigger events between states are fully determined, state transitions between emotions are realized with the finite state machine. The purpose of establishing emotion interaction is to control the behavior of the humanoid-head robot so that it makes the corresponding behavioral reactions according to its affective state. Therefore, once the affective state machine is established, its main functions and the variables they use should be determined exactly. To this end, the basic concept of the finite state machine is expanded by adding a variable-attribute set to the state definition, yielding an extended finite state machine (EFSM). After the emotion interaction model is analyzed with the EFSM, the variable composition of each function in the emotion model can be determined exactly and variable-definition conflicts can be effectively avoided, providing the basis for the next step, robot behavior control. In the emotion interaction model of Fig. 11, E1 represents the individual's initial affective state and "condition" represents the input state; according to the current affective state and the input state, the affective state of the emotion carrier changes and the corresponding behavior is produced. The variable set V attached to a state expresses the sound and facial expression of that behavior (a minimal EFSM sketch is given below);
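The sketch below illustrates the EFSM idea: a state machine whose states carry a variable set V, with the activation threshold α and the saturation threshold β gating transitions. State names, the stimulus encoding, and the threshold values are illustrative assumptions.

```python
# Sketch: extended finite state machine for emotion transitions (values assumed).
from dataclasses import dataclass, field

@dataclass
class EmotionEFSM:
    alpha: float = 0.3            # activation threshold of an emotion
    beta: float = 1.0             # saturation threshold of an emotion
    state: str = "neutral"        # current affective state
    intensity: float = 0.0
    variables: dict = field(default_factory=dict)  # variable set V on the state

    def stimulate(self, emotion: str, strength: float) -> str:
        """Apply one external stimulus: switch state only when the stimulus
        crosses alpha, and never let the intensity exceed beta."""
        if emotion == self.state:
            self.intensity = min(self.intensity + strength, self.beta)
        elif strength >= self.alpha:
            self.state, self.intensity = emotion, min(strength, self.beta)
        # The variables attached to the state drive expression and voice output.
        self.variables = {"expression": self.state, "speak": self.intensity > 0.5}
        return self.state

efsm = EmotionEFSM()
print(efsm.stimulate("happy", 0.6))  # a recognized greeting -> "happy"
```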
Step 3: according to the artificial emotion model, calculate the control quantities (corresponding rotation angles) of the respective drive motors and drive the facial expression and humanoid lip shape of the robot, thereby obtaining the behavior through which the robot expresses its emotion.
Specific embodiment seven: as shown in Figs. 8a, 8b, and 8c, the detailed process by which this embodiment recognizes basic human facial expressions with a fuzzy neural network structure is as follows. The network has 6 input-layer nodes, namely the facial expression feature values {θ1, θ2, θ3, θ4, L1, L2}; the number of output-layer nodes is 7, one for each basic facial expression (happiness, surprise, sadness, anger, disgust, fear, and neutral). The network is expected to output 1 at the i-th output node and 0 at the remaining nodes; the actual outputs are concrete values near these expected values. Selection is by competition: an input sample is classified into the class of the output node with the maximal value in the actual network output, and a rejection decision is made if several output nodes share the maximum. Because of individual differences and variation in expressions, the relative positions of the feature points are not fixed but somewhat variable, and factors such as changes in the distance between the person and the camera during image acquisition change the feature values; therefore dimensionless feature values are adopted for facial expression recognition, where θ̄i = θi/Δθ and L̄i = Li/ΔL. The other features are identical to those of specific embodiment six. A sketch of the competitive selection rule follows.
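The competitive selection rule with rejection can be sketched as follows; the network itself is reduced to a stub output vector, and the feature normalization mirrors the dimensionless values θ̄i = θi/Δθ and L̄i = Li/ΔL. All numbers are illustrative.

```python
# Sketch: competitive class selection with tie rejection (NumPy assumed).
from typing import Optional
import numpy as np

EXPRESSIONS = ["happy", "surprised", "sad", "angry", "disgusted", "afraid", "neutral"]

def classify(outputs: np.ndarray) -> Optional[str]:
    """Pick the class of the maximal output node; reject (None) on ties."""
    winners = np.flatnonzero(outputs == outputs.max())
    return EXPRESSIONS[int(winners[0])] if winners.size == 1 else None

features = np.array([0.8, 1.1, 0.9, 1.0, 0.7, 1.2])        # normalized {θi, Li}
outputs = np.array([0.1, 0.05, 0.1, 0.1, 0.05, 0.1, 0.9])  # stub network output
print(classify(outputs))  # -> "neutral"
```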
Specific embodiment eight: as shown in Fig. 9, the detailed process by which this embodiment recognizes speech signals with a CHMM speech-recognition model structure is as follows. A speech signal can usually be regarded as a sequence of observations produced from a series of HMM states, each observation being one frame of MFCC parameters. In the recognition process, endpoint detection of the speech signal is realized through short-time average energy and short-time zero-crossing rate; the classical Baum-Welch algorithm is adopted for HMM parameter estimation, and the dynamic-programming Viterbi algorithm is adopted in recognition. The other features are identical to those of specific embodiment six. A sketch of the endpoint-detection step follows.
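The endpoint-detection step can be sketched as below, framing the signal and thresholding short-time energy and zero-crossing rate; the frame length and thresholds are illustrative assumptions.

```python
# Sketch: speech endpoint detection by short-time energy and ZCR (NumPy assumed).
import numpy as np

def voiced_frames(signal: np.ndarray, frame: int = 256,
                  e_thresh: float = 0.01, z_thresh: float = 0.25) -> list:
    """Return indices of frames judged to contain speech."""
    voiced = []
    for k, i in enumerate(range(0, len(signal) - frame + 1, frame)):
        w = signal[i:i + frame]
        energy = float(np.mean(w ** 2))                        # short-time energy
        zcr = float(np.mean(np.abs(np.diff(np.sign(w)))) / 2)  # zero-crossing rate
        if energy > e_thresh or (energy > e_thresh / 4 and zcr > z_thresh):
            voiced.append(k)
    return voiced
```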
Principle of the method of the invention: emotion signals from the outside world are perceived through the CCD sensors and microphone, and the corresponding emotion signals are recognized. According to elements of the variable set V in the extended finite state machine of the emotion model, such as the facial-expression control points and the basic lip shapes, the behavior through which the robot expresses its emotion is obtained. Some of the basic lip shapes the robot can express are shown in Fig. 12a; each lip shape in the figure corresponds to the pronunciation of a Chinese pinyin syllable. According to the artificial-emotion-model theory, environmental stimulus signals affect the behavior of the robot; in the experiments, the external stimulus signals comprise speech signals and visual signals. According to the different environmental stimulus signals, the robot carries out different emotional interactions with a person, where emotional interaction mainly means the interaction of speech and of the robot's facial expressions. Driven by the emotion model, the robot makes the corresponding reactions: expressions and voice replies. This control method realizes the "humanlike" behavior of the robot through the interaction of three parts: emotion, drive, and behavior; in this system, "drive" decides what to do, and "emotion" decides how to do it. Fig. 7a is the software flow chart of the behavior control system of the humanoid-head robot device. After the robot fuses the facial-expression information and speech information collected by the sensors, the control software obtains the control quantities of the drive motors according to the artificial emotion model and controls the behavioral expression of the robot. Figs. 7b-1 and 7b-2 show the rotation angle of the lower-jaw drive motor while the robot expresses "Nice to see you"; in the figure, seconds 0-11.8 correspond to the human greeting "hello" addressed to the robot. The device of the invention recognizes external speech signals through the speech-recognition single-chip microcomputer and makes the corresponding triggered reply; when replying, the response time of each isolated word and its corresponding lip shape are derived from the reply sentence, from which the corresponding rotation angles of the drive motors are obtained. A sketch of this word-to-jaw-angle timing follows.
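The final mapping from a reply sentence to timed lower-jaw motor angles, as described above (each isolated word gets a response time and a lip shape), can be sketched like this; the per-lip-shape angle table and the timing are hypothetical.

```python
# Sketch: reply lip shapes -> timed lower-jaw angles (angles and timing assumed).
JAW_ANGLE_DEG = {"a": 18, "o": 12, "e": 8, "i": 4, "u": 6}  # angle per lip shape

def jaw_trajectory(lip_shapes: list, word_ms: int = 300) -> list:
    """Return (time_ms, angle_deg) waypoints for the lower-jaw drive motor."""
    traj, t = [], 0
    for shape in lip_shapes:
        traj.append((t, JAW_ANGLE_DEG.get(shape, 10)))  # open to the shape's angle
        traj.append((t + word_ms // 2, 0))              # close between words
        t += word_ms
    return traj

print(jaw_trajectory(["i", "a", "o"]))  # e.g. for a short greeting reply
```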

Claims (10)

1. A humanoid-head robot device with a human-computer interaction function, the humanoid-head robot device being composed of three parts: a humanoid-head robot body, a robot behavior control system, and a sensor perception system; characterized in that:
The humanoid-head robot body comprises an eye movement unit (1), an upper-and-lower-jaw movement unit (61), an artificial lung device (71), a facial expression and lip-shape driving mechanism (81), a front support plate (7), a rear support plate (6), a frame (51), a facial housing (17), and an elastic facial skin (18); the eye movement unit (1) is composed of two eyeballs (12), an eyeball transmission mechanism, two eyeball servomotors (14), two eyelids (13), an eyelid transmission mechanism, two eyelid servomotors (16), and a servomotor (29); the upper-and-lower-jaw movement unit (61) is composed of an upper jaw (8), a lower jaw (9), a motor (27), and a rotating shaft (28); the artificial lung device (71) is composed of a flexible pipe (19), a cylinder (20), a piston (21), a nut (22), a gas-drive guide shaft (23), a gas-drive lead screw (25), a drive motor (26), and an air inlet pipe (92); the facial expression and humanoid lip-shape driving mechanism (81) is composed of a first driving servomotor (35), a second driving servomotor (34), a third driving servomotor (33), a fourth driving servomotor (40), a fifth driving servomotor (41), a sixth driving servomotor (43), and several facial-skin driving rope-pulley mechanisms;
The robot behavior control system comprises control system hardware and control system software; the control system hardware comprises a main control computer (91) and a motion control card (5); the control system software implements the behavior control method;
The sensor perception system comprises two miniature CCD sensors (3), a speech-recognition single-chip microcomputer (4), and an olfactory sensor (24);
The front support plate (7) and the rear support plate (6) are arranged in parallel and fixed together with the frame (51) to form the head skeleton of the humanoid-head robot, and the eye movement unit (1) and the upper-and-lower-jaw movement unit (61) are installed in the frame (51) from top to bottom; a miniature CCD sensor (3) is embedded in each eyeball (12) to form the binocular vision of the robot; each eyelid (13) is arranged above the corresponding eyeball (12); the two eyeball servomotors (14) drive the two eyeballs (12) to rotate left and right through the eyeball transmission mechanism; the two eyelid servomotors (16) drive the two eyelids (13) through the eyelid transmission mechanism; and the servomotor (29) drives the two eyeballs (12) to move up and down together; the upper jaw (8) is arranged above the lower jaw (9), and the motor (27) drives the rotating shaft (28) to move the lower jaw (9); the olfactory sensor (24) is installed in the frame (51) between the eye movement unit (1) and the upper-and-lower-jaw movement unit (61); one end of the flexible pipe (19) is connected with the olfactory sensor (24) and the other end with the cylinder (20); the motor (26) is rotationally connected with the gas-drive lead screw (25), on which the nut (22) is installed; the nut (22) moves along the gas-drive guide shaft (23) and drives the piston (21) connected to it, realizing the olfactory function of the robot; one end of the air inlet pipe (92) is connected with the cylinder (20); the first driving servomotor (35), second driving servomotor (34), third driving servomotor (33), fourth driving servomotor (40), fifth driving servomotor (41), and sixth driving servomotor (43) are installed on the head skeleton formed by the front support plate (7), rear support plate (6), and frame (51), and the six driving servomotors are connected through the facial-skin driving rope-pulley mechanisms to the corresponding control points of the elastic facial skin (18); the facial housing (17) and the elastic facial skin (18) conform to the contour of a human face, and are installed, from inside to outside, on the front of the eye movement unit (1) and the upper-and-lower-jaw movement unit (61) to form the outer contour of the humanoid-head robot device; the motion control card (5) is installed on the rear support plate (6), and the speech-recognition single-chip microcomputer (4) is installed on the upper end of the front support plate (7); the sensor perception system outputs the perceived information to the main control computer (91) for processing; the control system software in the robot behavior control system obtains the control quantities of the corresponding motors according to the artificial emotion model, and motion control instructions cause the motion control card (5) to output PWM pulses that drive the corresponding motors to the appointed positions, thereby realizing the human-computer interaction function and the various emotional reactions of the robot.
2. The humanoid-head robot device with a human-computer interaction function according to claim 1, characterized in that the sensor perception system further comprises a touch sensor arranged at the middle of the forehead.
3. The humanoid-head robot device with a human-computer interaction function according to claim 1 or 2, characterized in that the sensor perception system further comprises two temperature sensors arranged on the left and right sides of the forehead, respectively.
4. The humanoid-head robot device with a human-computer interaction function according to claim 1, characterized in that the miniature CCD sensor (3) performs video capture using DirectShow, the COM-based streaming-media processing toolkit that Microsoft released as the successor to ActiveMovie and Video for Windows; the image capture card is installed on the motherboard via PCI, and the vision of the humanoid-head robot is established on the Windows software development platform with the miniature CCD sensors (3) to complete recognition of the external environment, here mainly meaning recognition of human facial expressions; the main functions of the miniature CCD sensor (3) are:
(1) static and dynamic image acquisition, including parameter setting of the image capture card, reading in and storing bitmap images, multiplexed acquisition control, display, and switching;
(2) dynamic and static image analysis algorithms and the facial-expression-recognition algorithm.
5. The humanoid-head robot device with a human-computer interaction function according to claim 1, characterized in that the method for realizing the facial expressions and humanoid lip shapes is:
Step 1: define skin feature points on the elastic facial skin (18): a, a-1, b, b-1, c, c-1, d, d-1, e, e-1, g, each of these points being a motion control point;
Step 2: set a slider at each motion control point, the slider being linked to the corresponding control point of the elastic facial skin (18), each slider being slidably connected with a corresponding guide groove arranged on the facial housing (17), the direction of the guide groove determining the direction of the force applied at the corresponding motion control point; one end of the rope of a facial-skin driving rope-pulley mechanism is connected with the slider, and the other end is connected with the corresponding driving servomotor;
Step 3: realize the different basic facial expressions and robot lip shapes through combinations of, and displacement changes at, the expression control points.
6. A behavior control method for the humanoid-head robot device with a human-computer interaction function of claim 1, characterized in that the method is realized according to the following steps:
Step 1: perceive external emotion signals through the sensor perception system, and analyze, extract features from, and recognize the perceived emotion signals;
Step 2: pass the recognized emotion signals to the artificial emotion model; the artificial emotion model mainly comprises three parts: emotion generation, emotion interaction, and emotional expression of the robot; emotion generation mainly comprises four modules: stimulus-signal collection, emotion definition, emotion drive, and emotion conversion; two thresholds α and β are defined in the emotion drive, α being the activation threshold of an emotion and β being its saturation threshold; the conversion of an affective state considers three aspects: external factors, internal factors, and the previous affective state;
On the basis of clearly identifying each state of the artificial emotion model and fully determining the transition relations and trigger events between states, state transitions between emotions are realized with an extended finite state machine: a variable-attribute set is added to the state definition, yielding an extended finite state machine (EFSM); after the emotion interaction model is analyzed with the EFSM, the variable composition of each function in the artificial emotion model can be determined exactly and variable-definition conflicts can be effectively avoided, providing the basis for the next step, robot behavior control;
Step 3: according to the artificial emotion model, calculate the control quantities of the respective drive motors and drive the facial expression and humanoid lip shape of the robot, thereby obtaining the behavior through which the robot expresses its emotion.
7. The behavior control method for the humanoid-head robot device with a human-computer interaction function according to claim 6, characterized in that the emotion signals comprise basic human facial expressions and speech signals.
8. The behavior control method for the humanoid-head robot device with a human-computer interaction function according to claim 7, characterized in that basic human facial expressions are recognized with a fuzzy neural network structure, and speech signals are recognized with a CHMM speech-recognition model structure.
9. The behavior control method for the humanoid-head robot device with a human-computer interaction function according to claim 8, characterized in that the detailed process of recognizing basic human facial expressions with the fuzzy neural network structure is: the network has 6 input-layer nodes, namely the facial expression feature values {θ1, θ2, θ3, θ4, L1, L2}; the number of output-layer nodes is 7, one for each basic facial expression: happiness, surprise, sadness, anger, disgust, fear, and neutral; the network is expected to output 1 at the i-th output node and 0 at the remaining nodes, the actual outputs being concrete values near these expected values; selection is by competition, an input sample being classified into the class of the output node with the maximal value in the actual network output, and a rejection decision being made if several output nodes share the maximum; because of individual differences and variation in expressions, the relative positions of the feature points are not fixed but somewhat variable, and factors such as changes in the distance between the person and the camera during image acquisition change the feature values, so dimensionless feature values are adopted for facial expression recognition, where θ̄i = θi/Δθ and L̄i = Li/ΔL.
10. The behavior control method for the humanoid-head robot device with a human-computer interaction function according to claim 8, characterized in that the detailed process of recognizing speech signals with the CHMM speech-recognition model structure is: a speech signal can usually be regarded as a sequence of observations produced from a series of HMM states, each observation being one frame of MFCC parameters; in the recognition process, endpoint detection of the speech signal is realized through short-time average energy and short-time zero-crossing rate; the classical Baum-Welch algorithm is adopted for HMM parameter estimation, and the dynamic-programming Viterbi algorithm is adopted in recognition.
CN2009100724055A 2009-06-30 2009-06-30 Humanoid-head robot device with human-computer interaction function and behavior control method thereof Active CN101618280B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100724055A CN101618280B (en) 2009-06-30 2009-06-30 Humanoid-head robot device with human-computer interaction function and behavior control method thereof


Publications (2)

Publication Number Publication Date
CN101618280A true CN101618280A (en) 2010-01-06
CN101618280B CN101618280B (en) 2011-03-23

Family

ID=41511797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100724055A Active CN101618280B (en) 2009-06-30 2009-06-30 Humanoid-head robot device with human-computer interaction function and behavior control method thereof

Country Status (1)

Country Link
CN (1) CN101618280B (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1262844A1 (en) * 2001-06-01 2002-12-04 Sony International (Europe) GmbH Method for controlling a man-machine-interface unit
JP2004174692A (en) * 2002-11-29 2004-06-24 Mitsubishi Heavy Ind Ltd Man-machine robot and control method of man machine robot
CN200998593Y (en) * 2007-02-14 2008-01-02 杨建良 Face act device of the robot
CN100460167C (en) * 2007-03-22 2009-02-11 上海交通大学 Head system of anthropomorphic robot
CN101373380B (en) * 2008-07-14 2011-06-15 浙江大学 Humanoid robot control system and robot controlling method

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101885181A (en) * 2010-05-24 2010-11-17 上海电气集团股份有限公司 Robot for playing cucurbit flute and control method thereof
CN101885181B (en) * 2010-05-24 2011-08-10 上海电气集团股份有限公司 Robot for playing cucurbit flute and control method thereof
CN102825604A (en) * 2012-09-18 2012-12-19 广西玉林正方机械有限公司 Motion control programming system of six-DOF (degree of freedom) robot
CN103413113A (en) * 2013-01-15 2013-11-27 上海大学 Intelligent emotional interaction method for service robot
CN103358310A (en) * 2013-07-04 2013-10-23 上海大学 Mouth movement mechanism of humanoid robot
CN105940446B (en) * 2013-10-01 2020-03-13 奥尔德巴伦机器人公司 Method for machine-to-human interaction, computer storage medium, and humanoid robot
CN105940446A (en) * 2013-10-01 2016-09-14 奥尔德巴伦机器人公司 Method for dialogue between a machine, such as a humanoid robot, and a human interlocutor; computer program product; and humanoid robot for implementing such a method
CN103853071A (en) * 2014-01-20 2014-06-11 南京升泰元机器人科技有限公司 Human-computer facial expression interaction system based on biological signal
CN103853071B (en) * 2014-01-20 2016-09-28 南京升泰元机器人科技有限公司 Man-machine facial expression interactive system based on bio signal
CN104091370A (en) * 2014-08-01 2014-10-08 哈尔滨工业大学 Human head imitation portrait robot capable of changing facial form and face organs as well as mathematical modeling method and control method thereof
CN104091370B (en) * 2014-08-01 2017-02-15 哈尔滨工业大学 Human head imitation portrait robot capable of changing facial form and face organs as well as mathematical modeling method and control method thereof
WO2016206643A1 (en) * 2015-06-26 2016-12-29 北京贝虎机器人技术有限公司 Method and device for controlling interactive behavior of robot and robot thereof
WO2016206642A1 (en) * 2015-06-26 2016-12-29 北京贝虎机器人技术有限公司 Method and apparatus for generating control data of robot
CN106325228A (en) * 2015-06-26 2017-01-11 北京贝虎机器人技术有限公司 Method and device for generating control data of robot
CN106325228B (en) * 2015-06-26 2020-03-20 北京贝虎机器人技术有限公司 Method and device for generating control data of robot
CN105244042B (en) * 2015-08-26 2018-11-13 安徽建筑大学 A kind of speech emotional interactive device and method based on finite-state automata
CN105244042A (en) * 2015-08-26 2016-01-13 安徽建筑大学 FSA (Finite State Automaton) based voice emotion interaction device and method
CN105843118A (en) * 2016-03-25 2016-08-10 北京光年无限科技有限公司 Robot interacting method and robot system
CN105843118B (en) * 2016-03-25 2018-07-27 北京光年无限科技有限公司 A kind of robot interactive method and robot system
CN106203344A (en) * 2016-07-12 2016-12-07 北京光年无限科技有限公司 A kind of Emotion identification method and system for intelligent robot
CN106371583A (en) * 2016-08-19 2017-02-01 北京智能管家科技有限公司 Control method and apparatus for intelligent device
CN106426196A (en) * 2016-08-31 2017-02-22 佛山博文机器人自动化科技有限公司 Service robot head
WO2018095041A1 (en) * 2016-11-28 2018-05-31 深圳光启合众科技有限公司 Robot, and action control method and device therefor
CN108732943A (en) * 2017-04-18 2018-11-02 深圳市丰巨泰科电子有限公司 Expression robot man-machine interaction method
CN107813294A (en) * 2017-10-31 2018-03-20 梅其珍 A kind of nonmetal flexible anthropomorphic robot
CN108577866A (en) * 2018-04-03 2018-09-28 中国地质大学(武汉) A kind of system and method for multidimensional emotion recognition and alleviation
CN108714902A (en) * 2018-06-28 2018-10-30 香港中文大学(深圳) Apery expression robot head construction and robot head control system
CN108563138A (en) * 2018-07-04 2018-09-21 深圳万发创新进出口贸易有限公司 A kind of intelligent domestic system
CN110103234A (en) * 2019-04-30 2019-08-09 广东工业大学 A kind of robot with humanoid facial expression
CN110103234B (en) * 2019-04-30 2024-05-24 广东工业大学 Humanoid facial expression robot
CN112825014A (en) * 2019-11-21 2021-05-21 王炼 Artificial intelligence brain
CN112991886A (en) * 2021-03-09 2021-06-18 湖北工业大学 Barrier-free communication learning auxiliary system for deaf-mutes

Also Published As

Publication number Publication date
CN101618280B (en) 2011-03-23

Similar Documents

Publication Publication Date Title
CN101618280B (en) Humanoid-head robot device with human-computer interaction function and behavior control method thereof
CN101458778B (en) Control method of artificial head robot
CN106985137B (en) Multi-modal exchange method and system for intelligent robot
CN101474481B (en) Emotional robot system
CN103853071B (en) Man-machine facial expression interactive system based on bio signal
CN106919251A (en) A kind of collaborative virtual learning environment natural interactive method based on multi-modal emotion recognition
CN104493827A (en) Intelligent cognitive robot and cognitive system thereof
CN110774285A (en) Humanoid robot and method for executing dialogue between humanoid robot and user
JP2004237022A (en) Information processing device and method, program and recording medium
CN109773807B (en) Motion control method and robot
CN205721625U (en) A kind of expression robot interactive system
CN108009573A (en) A kind of robot emotion model generating method, mood model and exchange method
CN112454390B (en) Humanoid robot facial expression simulation method based on deep reinforcement learning
CN110443309A (en) A kind of electromyography signal gesture identification method of combination cross-module state association relation model
CN110134863A (en) The method and device that application program is recommended
Silva et al. Mirroring and recognizing emotions through facial expressions for a RoboKind platform
Al-Aubidy et al. Wheelchair neuro fuzzy control using brain computer interface
Korayem et al. Design and implementation of the voice command recognition and the sound source localization system for human–robot interaction
Costantini et al. Multi-agent system engineering for emphatic human-robot interaction
Xu et al. Turn-Taking Prediction for Human–Robot Collaborative Assembly Considering Human Uncertainty
Weiguo et al. Development of the humanoid head portrait robot system with flexible face and expression
CN214955998U (en) Voice interaction equipment based on deep learning
Kirandziska et al. Human-robot interaction based on human emotions extracted from speech
CN115227246A (en) Driver voice emotion recognition method for intelligent driving
CN209607230U (en) A kind of two-way sign language translation device of intelligence

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant