CN102103707B - Emotion engine, emotion engine system and control method of electronic device - Google Patents


Info

Publication number
CN102103707B
Authority
CN
China
Prior art keywords
behavior
behavior pattern
emotion
sensitive information
control module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200910258052.8A
Other languages
Chinese (zh)
Other versions
CN102103707A (en)
Inventor
吴立伟
Current Assignee
Phison Electronics Corp
Original Assignee
Phison Electronics Corp
Priority date
Filing date
Publication date
Application filed by Phison Electronics Corp
Priority to CN200910258052.8A
Publication of CN102103707A
Application granted
Publication of CN102103707B
Legal status: Active
Anticipated expiration

Landscapes

  • Toys (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to an emotion engine, an emotion engine system, and a control method of an electronic device, where the emotion engine is adapted to the electronic device. The emotion engine system comprises a behavior control unit, a sensing unit, a time unit, and a behavior database. The behavior control unit provides a first behavior mode and a second behavior mode. When the sensing unit is enabled, it generates trigger sensing information or initial sensing information for the behavior control unit; the time unit generates time information for the behavior control unit; and the behavior database stores a plurality of behavior data, with the first behavior mode and the second behavior mode each corresponding to at least one entry among the behavior data. The behavior control unit then decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode. With the emotion engine, emotion engine system, and control method provided by the invention, the electronic device has a low manufacturing cost and can describe and express a robot's personality in a fine-grained way.

Description

Emotion engine, emotion engine system, and control method of an electronic device
Technical field
The present invention relates to a system and a control method thereof, and more particularly to an emotion engine, an emotion engine system, and a control method of an electronic device.
Background technology
In recent years, as electronic devices with artificial intelligence have evolved rapidly, the demand for emotion technology has gradually gained attention. For an electronic device such as a robot, emotion technology (Emotion Technology) has always been a core technology within robotics (Robotics), integrating information engineering and control engineering.
Specifically, the most central concept in emotion technology is the realization of an emotion space (Emotion Space). Its appeal is that, through algorithms and life-mimicking structural design, the control system can endow a robot with emotions and interactive abilities approximating those of a living being. This distinguishes it from a traditional robot, which can express only a mimicked emotion through a cold shell or a simple, static interaction mode. The characteristic of emotion technology is therefore that it gives an otherwise cold mechanism more flexible and more meaningful interactive ability and lets it express an inner emotional state, so that the mechanism does not appear stiff during interaction.
However, the emotion technology of robots has long faced technical bottlenecks. In the control system, the control unit must not only integrate a large amount of information but also rely on high-order artificial-intelligence algorithms for recognition. Under low-cost constraints, building an emotion engine in the conventional way that meets current market demand is therefore difficult.
Summary of the invention
An object of embodiments of the present invention is to provide an emotion engine that gives an electronic device a low manufacturing cost and that, when the electronic device is applied as a robot, can describe and express the robot's personality in a fine-grained way.
An object of embodiments of the present invention is to provide an emotion engine system that gives an electronic device a low manufacturing cost and that, when the electronic device is applied as a robot, can describe and express the robot's personality in a fine-grained way.
An object of embodiments of the present invention is to provide a control method of an electronic device that, when the electronic device is applied as a robot, can describe and express the robot's personality in a fine-grained way.
An embodiment of the present invention provides an emotion engine adapted to an electronic device. The emotion engine comprises a behavior control unit, which includes an emotion simulation unit and receives time information and trigger sensing information in order to provide a first behavior mode and a second behavior mode. The behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode, where the first behavior mode and the second behavior mode each correspond to at least one entry among a plurality of behavior data.
Another embodiment of the present invention provides an emotion engine system adapted to an electronic device. The emotion engine system comprises a behavior control unit, a sensing unit, a time unit, and a behavior database. The behavior control unit includes an emotion simulation unit and provides a first behavior mode and a second behavior mode. The sensing unit is connected to the behavior control unit and, when enabled, generates trigger sensing information or initial sensing information for it. The time unit is connected to the behavior control unit and generates time information for it. The behavior database is connected to the behavior control unit and stores a plurality of behavior data, with the first behavior mode and the second behavior mode each corresponding to at least one entry. The behavior control unit then decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode.
In an embodiment of the present invention, the emotion simulation unit further generates a random signal, and the behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, the first behavior mode, and the random signal.
In an embodiment of the present invention, the behavior control unit decides the behavior data corresponding to the first behavior mode according to at least one of the time information and the initial sensing information.
In an embodiment of the present invention, the emotion engine system further comprises a component driver unit, and the initial sensing information is a power-on signal. After receiving the initial sensing information, the behavior control unit provides the first behavior mode and drives the component driver unit to perform it.
In an embodiment of the present invention, after the behavior control unit decides the behavior data corresponding to the second behavior mode, it drives the component driver unit to perform the second behavior mode.
In an embodiment of the present invention, the component driver unit comprises at least one of a motor control module and a multimedia voice control module, and the sensing unit comprises at least one of a touch sensing module, a sound sensing and positioning module, and a shake sensing module.
In an embodiment of the present invention, the time information comprises either time-length information measured from receiving the initial sensing information to receiving the trigger sensing information, or system age information of the electronic device.
In an embodiment of the present invention, the emotion simulation unit generates an emotion point and a plurality of situation event points in a virtual space. Each situation event point has a corresponding coordinate and a corresponding behavior mode. The emotion point moves from an old coordinate to a new coordinate according to the time information, the trigger sensing information, and the first behavior mode, and the emotion simulation unit finds the situation event point corresponding to the new coordinate in order to decide the behavior data corresponding to the second behavior mode.
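The lookup in this embodiment amounts to a nearest-neighbor search in the virtual space. The following Python sketch illustrates the idea; the function name, the 2-D axes, and the coordinates are invented for illustration and are not taken from the patent:

```python
import math

def nearest_situation_point(emotion_point, situation_points):
    """Return the situation event point closest to the emotion point.

    Both arguments are plain coordinate tuples in the virtual emotion
    space; each situation event point maps to one behavior mode.
    """
    return min(
        situation_points,
        key=lambda q: math.dist(emotion_point, q),
    )

# Hypothetical 2-D emotion space with four situation event points.
points = [(1.0, 1.0), (-1.0, 1.0), (-1.0, -1.0), (1.0, -1.0)]
print(nearest_situation_point((0.8, 0.6), points))  # → (1.0, 1.0)
```

The same search works unchanged in a three- or multi-dimensional virtual space, since `math.dist` accepts coordinates of any length.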
Another embodiment of the present invention provides a control method of an electronic device with the following steps. First, a first behavior mode is provided. Then, when the sensing unit of the electronic device is enabled, trigger sensing information is generated. Afterwards, time information is generated. Finally, the behavior data corresponding to a second behavior mode is decided according to the time information, the trigger sensing information, and the first behavior mode.
In an embodiment of the present invention, the control method further comprises generating a random signal, and the behavior data corresponding to the second behavior mode is decided according to the time information, the trigger sensing information, the first behavior mode, and the random signal.
In an embodiment of the present invention, the control method further comprises deciding the behavior data corresponding to the first behavior mode according to initial sensing information.
In an embodiment of the present invention, the time information comprises either time-length information measured from receiving the initial sensing information to receiving the trigger sensing information, or system age information of the electronic device.
In an embodiment of the present invention, the control method further comprises analyzing the type of the trigger sensing information.
In an embodiment of the present invention, the step of providing the first behavior mode comprises providing and performing the first behavior mode after a power-on signal is received.
In an embodiment of the present invention, after the behavior data corresponding to the second behavior mode is decided, a component driver unit of the electronic device is driven to perform the second behavior mode.
In an embodiment of the present invention, the step of deciding the behavior data corresponding to the second behavior mode comprises generating an emotion point and a plurality of situation event points in a virtual space. The emotion point moves according to the time information, the trigger sensing information, and the first behavior mode, and the behavior data corresponding to the second behavior mode corresponds to the situation event point nearest to the emotion point.
In an embodiment of the present invention, the control method further comprises the following steps. First, a corresponding emotion vector is obtained from the first behavior mode, the time information, and the trigger sensing information. Then, the coordinate of the emotion point in the virtual space is corrected according to the emotion vector. Afterwards, the situation event point at the shortest distance from the corrected emotion point is obtained, and the behavior data corresponding to the second behavior mode is decided according to that situation event point.
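These steps — derive an emotion vector, correct the emotion point by it, then pick the behavior data of the nearest situation event point — can be sketched as follows. The coordinates and behavior labels are hypothetical, chosen only to make the flow concrete:

```python
import math

def second_behavior(emotion_point, emotion_vector, situation_points):
    """Shift the emotion point by the emotion vector, then return the
    behavior data attached to the nearest situation event point.

    situation_points maps coordinate tuples to behavior data.
    """
    corrected = tuple(p + v for p, v in zip(emotion_point, emotion_vector))
    nearest = min(situation_points, key=lambda q: math.dist(corrected, q))
    return situation_points[nearest]

# Two hypothetical situation event points with their behavior data.
behaviors = {(1.0, 1.0): "joyful", (-1.0, -1.0): "sad"}
print(second_behavior((0.0, 0.0), (0.6, 0.7), behaviors))  # → joyful
```

In a full system the emotion vector itself would be computed from the first behavior mode, the time information, and the trigger sensing information; here it is passed in directly.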
In an embodiment of the present invention, the control method further comprises providing a convergence point, and the step of correcting the coordinate of the emotion point according to the emotion vector comprises providing a return force that moves the emotion point toward the convergence point.
In an embodiment of the present invention, the coordinate corresponding to the convergence point changes as the time information changes.
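One way to read these two embodiments: on every update the emotion point is pulled a fraction of the way back toward the convergence point, so the mood gradually settles when no stimulus arrives. A minimal sketch, with the pull strength as an assumed parameter:

```python
def apply_return_force(emotion_point, convergence_point, strength=0.1):
    """Move the emotion point toward the convergence point by a fixed
    fraction of the remaining distance, modelling the return force F."""
    return tuple(
        p + strength * (c - p)
        for p, c in zip(emotion_point, convergence_point)
    )

# Pulls (1.0, 0.0) a tenth of the way toward the origin.
print(apply_return_force((1.0, 0.0), (0.0, 0.0)))
```

To model the time-varying convergence point of the second embodiment, the caller would recompute `convergence_point` from the time information before each update.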
In an embodiment of the present invention, the virtual space is a multi-dimensional space with a plurality of coordinate axes, the behavior control unit obtains the situation event point corresponding to the new coordinate through a vector operation, and each coordinate axis of the virtual space represents a different mental or behavioral state of the electronic device.
In an embodiment of the present invention, the step of deciding the behavior data corresponding to the second behavior mode according to the obtained situation event point comprises providing that behavior data according to system age information together with the obtained situation event point.
Based on the above, the emotion engine and emotion engine system provided by embodiments of the present invention give an electronic device a low manufacturing cost; when the electronic device is applied as a robot, they can describe and express its personality in a fine-grained way, and their architecture is modular rather than tied to the shape or design of a specific electronic device. In addition, the control method of the electronic device provided by embodiments of the present invention uses real-time emotion computation and the concept of an emotion point, so that when the electronic device is applied as a robot, the method can describe and express the robot's personality in a fine-grained way while meeting current market demand under low-cost constraints.
To make the above features and advantages of the present invention more apparent, embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a schematic diagram of an emotion engine system according to an embodiment of the invention.
Fig. 2 is a detailed schematic diagram of the emotion engine system of Fig. 1.
Fig. 3 is a schematic diagram of the operation of the robot's emotional response in the virtual space.
Fig. 4 is a flowchart of the robot control method according to an embodiment of the invention.
Fig. 5 is a detailed flowchart of step S407 in Fig. 4.
Description of main element symbols:
100: emotion engine system; 110: behavior control unit;
111: tense control module; 113: emotion simulation unit;
112: initial state; 114: child tense module;
116: growth/transition interface; 118: adult tense module;
114a, 118a: general mode; 114b, 118b: active event mode;
114c, 118c: passive event mode; 114d, 118d: sleep mode;
114e, 118e: wake-up mode; 114f, 118f: random behavior mode;
120: sensing unit; 130: component driver unit;
130a: motor control module; 130b: multimedia voice control module;
120a: touch sensing module; 120b: sound sensing and positioning module;
120c: shake sensing module; 140: behavior database;
140a: behavior data of the child tense module; 140b: behavior data of the adult tense module;
140c: index search unit; 150: time unit;
152: physiological clock record table; S: emotion point;
Q: situation event point; T: motion track;
V: emotion vector; P: convergence point;
R: interval radius; F: return force;
S401, S403, S405, S407, S409: steps;
S501, S503, S505, S507, S509, S511: steps.
Embodiment
Fig. 1 is a schematic diagram of an emotion engine system according to an embodiment of the invention. Referring to Fig. 1, the emotion engine system 100 of the present embodiment comprises a behavior control unit 110, a sensing unit 120, a time unit 150, a behavior database 140, and a component driver unit 130, where the behavior control unit 110 comprises a tense control module 111 and an emotion simulation unit 113. The sensing unit 120, the time unit 150, and the behavior database 140 are each connected to the behavior control unit 110 and assist it in providing a first behavior mode and a second behavior mode.
In the present embodiment, the emotion engine system 100 is suited to an electronic device with emotional responses, such as a robot or an electronic pet (not shown). A robot is taken as the example here, but the invention is not limited to it. The robot's emotional response may be a reaction mechanism produced by interaction with the user, a behavior scheduled by the time unit 150, or a corresponding behavior produced by the demands of the external situation.
The operation of the emotion engine system 100 is described in more detail below.
In the present embodiment, the tense control module 111 and the emotion simulation unit 113 can be implemented as firmware in one or more control circuits. For example, a plurality of control instructions can be burned into a program memory (for example, a read-only memory (ROM)) connected to a microprocessor, forming the behavior control unit 110 with the tense control module 111 and the emotion simulation unit 113. When the emotion engine system 100 operates, the microprocessor executes the control instructions that form the tense control module 111 and the emotion simulation unit 113, thereby carrying out the emotion engine management mechanism of the embodiments of the present invention.
When the behavior control unit 110 receives initial sensing information, it provides a first behavior mode and drives the component driver unit 130 to perform it; the initial sensing information may be a power-on signal or another actuating signal. The behavior mode the robot exhibits may be one of a general mode, an active event mode, a passive event mode, a sleep mode, a wake-up mode, and a random behavior mode, although the invention is not limited to these. For example, when the robot's power is switched on, the behavior control unit 110 provides the wake-up mode to express, in a fine-grained way, the corresponding behavior or state a living individual would exhibit.
In the present embodiment, the sensing unit 120 is connected to the behavior control unit 110. When the robot receives an external stimulus, the sensing unit 120 is enabled and generates trigger sensing information for the behavior control unit 110. For example, during interaction with the user, the sensing unit 120 generates trigger sensing information according to the interaction. When the robot is stroked, patted, shaken, or struck, the behavior control unit 110 judges the type of trigger sensing information received from the sensing unit 120 and, together with other reference information, makes the robot exhibit the corresponding emotional reaction, such as joy, anger, or sadness.
The time unit 150 is also connected to the behavior control unit 110. In the present embodiment, the time unit 150 may comprise a physiological clock record table 152 that records the robot's system age information and periodically generates time information for the behavior control unit 110, so that the behavior control unit 110 produces different interaction demands toward the user at the robot's different life stages. In addition, after the behavior control unit 110 has received initial sensing information and then trigger sensing information, the time unit 150 also provides it with time-length information: the time difference between receiving the initial sensing information and receiving the trigger sensing information, which assists the behavior control unit 110 in deciding the behavior data corresponding to the second behavior mode.
The behavior control unit 110 of the emotion engine system 100 can therefore express the robot's behavior according to the robot's physiological clock and life stage, finely portraying the corresponding behavior or state a living individual would exhibit at different stages of life.
In the present embodiment, the behavior database 140 stores a plurality of behavior data, and each behavior mode of the robot corresponds to at least one entry among them. In emotion technology, these behavior data can correspond to arbitrary coordinate points in a virtual space, and the virtual space can be a realization of the emotion space. The virtual space may be two-dimensional (2-D), three-dimensional (3-D), or multi-dimensional, and the behavior data stored in the behavior database 140 may comprise image data, audio data, and motor behavior data, although the invention is not limited to these.
In the present embodiment, the behavior control unit 110 decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode. For example, at noon the time unit 150 sends time information so that the robot produces a behavior expressing a feeding demand toward the user. If the user feeds the robot five minutes later, the sensing unit 120 generates trigger sensing information indicating that feeding has been received, so that the behavior control unit 110 makes the robot exhibit the corresponding emotional reaction: if the amount of feeding is insufficient, the behavior control unit 110 produces an angry reaction; otherwise, the robot produces a satisfied one. Here, the noon time information sent by the time unit 150 is initial sensing information, and the feeding-demand behavior the robot produces corresponds to the first behavior mode. The five-minute wait is also time information, and the feeding signal is trigger sensing information. After receiving the feeding signal, the behavior control unit 110 therefore judges, from the feeding demand, the waiting time, and the amount fed, whether to produce a satisfied reaction, an angry reaction, or a renewed feeding demand; the resulting emotional reaction corresponds to the second behavior mode.
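The feeding example amounts to a small decision rule combining the feed amount and the waiting time. The thresholds and labels in this sketch are invented for illustration; the patent does not specify concrete values:

```python
def feeding_reaction(amount_fed, amount_needed, wait_minutes):
    """Combine the feed amount and the waiting time into the robot's
    second behavior mode (hypothetical rule and thresholds)."""
    if amount_fed == 0 and wait_minutes > 30:
        return "demand feeding again"   # still hungry, renew the demand
    if amount_fed < amount_needed:
        return "angry"                  # fed, but not enough
    return "satisfied"

print(feeding_reaction(1, 3, 5))  # → angry
```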
Another example: the robot is struck once at 11 p.m. and struck again one minute later; it then expresses a very angry emotional reaction. But if the second strike instead comes an hour later, the robot's mood has calmed because of the long interval since the first strike, and its physiological clock moreover shows that it is sleep time, so the reaction it produces may be only a slightly angry one.
As these examples show, the time information (the time-length information from receiving the initial sensing information to receiving the trigger sensing information), the trigger sensing information, and the first behavior mode all influence the decision of the behavior data corresponding to the second behavior mode.
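The strike example amounts to scaling the reaction by the elapsed time and the physiological clock. A sketch with invented thresholds and labels, not values taken from the patent:

```python
def strike_reaction(seconds_since_last_strike, is_sleep_time=False):
    """Scale the reaction to a repeated strike by the time-length
    information and the physiological clock (hypothetical thresholds)."""
    if seconds_since_last_strike < 300:  # struck again within 5 minutes
        return "very angry"
    if is_sleep_time:
        return "slightly angry"          # mood calmed; body clock says sleep
    return "angry"

print(strike_reaction(60))  # → very angry
```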
In addition, in the present embodiment, the behavior control unit 110 can also provide the first behavior mode when it receives a power-on signal; for example, it may keep the robot still or make it perform a greeting. In other embodiments, the behavior control unit 110 can also decide the behavior data corresponding to the first behavior mode according to time information or trigger sensing information and drive the robot to perform the first behavior mode. That is, the first behavior mode is not limited to the default behavior data after power-on; it may also be decided from time information or trigger sensing information.
Specifically, in the present embodiment, the behavior control unit 110 comprises the emotion simulation unit 113.
The emotion simulation unit 113 generates a virtual space containing an emotion point and a plurality of situation event points, each corresponding to a coordinate point in that space. The emotion point moves in response to the time information and the initial sensing information, and the behavior control unit 110 finds the situation event point nearest the emotion point's new position to decide the first behavior mode. The emotion point may then move to another coordinate because of the first behavior mode, the time information, and the trigger sensing information, and the behavior control unit 110 again finds the nearest situation event point to decide the behavior data corresponding to the second behavior mode. The behavior data stored in the behavior database 140 correspond to the different situation event points in the virtual space; each situation event point corresponds to a coordinate point, and the emotion point moves continuously through the virtual space according to each item of information. The emotion simulation unit 113 can run a mathematical algorithm to obtain the situation event point closest to the emotion point and thereby decide the behavior data corresponding to the second behavior mode, as further described in the embodiments below.
In another embodiment, the emotion simulation unit 113 also generates a random signal so that the emotion point can change because of it as well. The robot's mood is then not only stimulated by external signals but also reflects indeterminate inner changes of its own, producing more varied and more human-like behavior modes. Viewed this way, in the present embodiment the behavior control unit 110 can decide the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, the first behavior mode, and the random signal; likewise, it can decide the behavior data corresponding to the first behavior mode according to the time information, the initial sensing information, and the random signal.
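The random signal can be modelled as a small random drift added to the emotion point on each update, so the mood changes even without external stimulus. The function name and drift magnitude below are assumptions for illustration:

```python
import random

def perturb_emotion_point(emotion_point, magnitude=0.05, rng=None):
    """Add a small random drift to each coordinate of the emotion
    point (magnitude is an assumed value, not from the patent)."""
    rng = rng or random.Random()
    return tuple(
        p + rng.uniform(-magnitude, magnitude) for p in emotion_point
    )

drifted = perturb_emotion_point((0.0, 0.0))
print(drifted)  # two small values, each within [-0.05, 0.05]
```

After the drift, the nearest-situation-point lookup proceeds as before, so a large enough random drift can change the selected behavior data on its own.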
In yet another embodiment, the behavior control unit can further comprise a tense control module 111 that adjusts, according to the time information provided by the time unit 150, the behavior data corresponding to the first or second behavior mode. For example, the time information can indicate whether the robot is in its adult phase or its juvenile phase, and the reaction to the same trigger sensing information is not necessarily identical in different phases: an adult-phase robot that is patted may react with slight anger, while a juvenile-phase robot reacts with anger and sadness.
Fig. 2 is a detailed schematic diagram of the emotion engine system of Fig. 1. Referring to Fig. 2, in the present embodiment the tense control module 111 comprises a child tense module 114 and an adult tense module 118, each containing a plurality of behavior modes. The child tense module 114 comprises a general mode 114a, an active event mode 114b, a passive event mode 114c, a sleep mode 114d, a wake-up mode 114e, and a random behavior mode 114f. Similarly, the adult tense module 118 at least comprises the corresponding modes: a general mode 118a, an active event mode 118b, a passive event mode 118c, a sleep mode 118d, a wake-up mode 118e, and a random behavior mode 118f.
Specifically, the tense control module 111 can express the robot's behavior in different tense modules according to the robot's system age, finely portraying the corresponding behavior or state a living individual would exhibit at different stages of life. For example, after the emotion engine system 100 is initialized, the robot's life stage starts from the initial state 112 according to the system age recorded in the physiological clock record table 152 of the time unit 150 and enters the child tense module 114 to perform the corresponding behavior modes. Then, as the robot's system age grows, the tense control module 111 switches to the adult tense module 118 through the growth/transition interface 116.
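The switch between tense modules reduces to comparing the recorded system age against a growth threshold. A minimal sketch; the threshold value is an assumption, since the patent does not fix one:

```python
def select_tense_module(system_age_days, adult_threshold_days=365):
    """Pick the tense module from the system age recorded in the
    physiological clock record table (threshold is an assumed value)."""
    if system_age_days >= adult_threshold_days:
        return "adult tense module"
    return "child tense module"

print(select_tense_module(30))   # → child tense module
print(select_tense_module(400))  # → adult tense module
```

In a fuller model, the growth/transition interface 116 would run once at the moment the threshold is crossed, migrating the robot's state from one module to the other.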
In the child tense module 114, the tense control module 111 can control the behavior of the robot randomly or according to the time information received from the time unit 150. For example, when performing the active event mode 114b, the tense control module 111 can make the robot raise an active demand toward the user and ask for it to be satisfied. In the present embodiment, such an event may be the robot, after a scheduled time or at random, feeling hungry and demanding feeding, or demanding to urinate and defecate after feeding. The active event mode 114b may also be the tense control module 111 making the robot perform a book- or newspaper-reading action.
In addition, when executing the passive event mode 114c, the temporal control module 111 can control the robot to give a corresponding performance in response to a passively received behavior state. For example, when the robot is stroked, dropped, or patted by the user, the temporal control module 111 controls the robot to express pleasure, pain, or happiness according to the system age recorded in the physiological clock record table 152, in coordination with the situation points of the emotional simulation unit 113. Furthermore, in the random behavior mode 114f, the temporal control module 111 can control the robot to perform various random behaviors, for example a game mode or a roaming mode. In the present embodiment, when the temporal control module 111 is switched to the child temporal module 114, it can also control the robot to execute the general mode 114a, the sleep mode 114d, and the wake-up mode 114e according to the physiological clock recorded in the physiological clock record table 152.
Afterwards, as the robot's system age grows, the temporal module of the temporal control module 111 is switched to the adult temporal module 118 via the growth/transition interface 116. Similarly, in the adult temporal module 118, the temporal control module 111 can, independently or in coordination with the emotional simulation unit 113, drive the element driver unit 130 to execute the corresponding performances of the various behavior modes. It should be noted that in the adult temporal module 118, the performances of the various behavior modes executed by the robot may differ from those executed in the child temporal module 114. For example, when executing the sleep mode 118d, the robot may need a shorter sleep time than it needs in the sleep mode 114d. Likewise, when executing the wake-up mode 118e, the robot may need less time lingering in bed than it needs in the wake-up mode 114e. Of course, the invention is not limited thereto. In other embodiments, the behavior modes in the different temporal modules can be slightly adjusted according to the corresponding behaviors or states that a living individual exhibits at different life stages.
Therefore, the temporal control module 111 can produce the emotional reactions of different life stages through a common virtual space, and control the peripheral systems through the element driver unit 130. In this way, the temporal control module 111 can express the robot's behavior in different temporal modules according to its system age, finely depicting the corresponding behaviors or states that a living individual exhibits at different life stages.
Referring again to Fig. 2, in the present embodiment, the time unit 150 comprises a physiological clock record table 152, which records the robot's system age and physiological clock, and can switch the temporal module of the temporal control module 111 through the time information. In addition, the physiological clock record table 152 records a simulated life cycle and a physiological clock cycle. The simulated life cycle simulates the corresponding behaviors or states that a living individual exhibits at different life stages, while the physiological clock cycle simulates the daily routine of a living individual. In the present embodiment, the simulated life cycle and the physiological clock cycle can be preset in the physiological clock record table 152, but the invention is not limited thereto.
Therefore, the time unit 150 can switch the temporal module of the temporal control module 111 according to the relation between the recorded system age and the simulated life cycle. For example, as the system age grows, the time unit 150 can switch the temporal module of the temporal control module 111 from the child temporal module 114 to the adult temporal module 118. Specifically, after the emotion engine system 100 is initialized, the system age recorded in the physiological clock record table 152 may be 0 years old. Afterwards, as time passes, the physiological clock record table 152 keeps recording the robot's system age. For example, after one real day has elapsed, the recorded system age is 1 year old; after two real days, the system age is 2 years old, and so on. Of course, the invention is not limited thereto.
The time unit 150 can then switch the temporal module of the temporal control module 111 according to the relation between the recorded system age and the simulated life cycle. For example, if the simulated life cycle presets that the adult stage begins at a system age of 20 years old, then after 20 real days have elapsed, the time unit 150 will switch the temporal module of the temporal control module 111 to the adult temporal module 118. Therefore, as the robot's system age grows, the temporal module of the temporal control module 111 can be switched to the adult temporal module 118 via the growth/transition interface 116.
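The age bookkeeping described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names are hypothetical, and only the one-day-per-year mapping and the age-20 threshold come from the text.

```python
# Hypothetical sketch: one elapsed real day corresponds to one system year,
# and the temporal module switches to the adult module at a preset system age.

ADULT_AGE_YEARS = 20  # preset by the simulated life cycle in the example above

def system_age(elapsed_days: float) -> int:
    """One elapsed real day advances the system age by one year."""
    return int(elapsed_days)

def temporal_module(elapsed_days: float) -> str:
    """Select the temporal module from the system age."""
    return "adult" if system_age(elapsed_days) >= ADULT_AGE_YEARS else "child"
```

Under these assumptions, the switch via the growth/transition interface would occur once 20 real days have passed.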
In addition, upon receiving the time information from the time unit 150, the temporal control module 111 can also control the execution of the corresponding one of the above behavior modes according to the relation between the physiological clock and the physiological clock cycle. Specifically, if the physiological clock cycle presets the wake-up mode at 7 a.m., then when the real time reaches 7 a.m., the time unit 150 provides the time information to the temporal control module 111 so that it controls the execution of the wake-up mode. Similarly, if the physiological clock cycle presets the sleep mode at 7 p.m., then when the real time reaches 7 p.m., the temporal control module 111 controls the execution of the sleep mode according to the time information. In the same manner, the user can preset the physiological clock cycle so that the robot executes the corresponding one of the above behavior modes at a specific time.
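This scheduling can be sketched as a simple lookup. Only the 7 a.m. wake-up and 7 p.m. sleep entries come from the text; the table form, the mode names, and the default mode are illustrative assumptions.

```python
# Hypothetical physiological-clock-cycle table: hour of day -> behavior mode.
PHYSIOLOGICAL_CLOCK_CYCLE = {7: "wake_up", 19: "sleep"}

def mode_for_hour(hour: int) -> str:
    """Return the behavior mode preset for the given real-time hour,
    falling back to a general mode when no entry is preset."""
    return PHYSIOLOGICAL_CLOCK_CYCLE.get(hour, "general")
```

A user presetting the cycle, as described above, would amount to editing this table.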
Therefore, the emotion engine system 100 can finely express the robot's behavior in the different temporal modules according to the relation between the system age and the simulated life cycle, and between the physiological clock and the physiological clock cycle, and can make the robot execute preset behavior modes at specific times, just as a living individual exhibits different states at different life stages and performs the corresponding physiological behaviors at different times.
In the present embodiment, the sensing unit 120 detects or senses different enable signals, and accordingly provides the corresponding initial or triggering sensing information to the behavior control module 110. The sensing unit 120 comprises a touch sensing module 120a, a sound localization module 120b, and a shake sensing module 120c.
The element driver unit 130 is controlled by the behavior control module 110 to drive the peripheral devices to perform the corresponding behaviors. The element driver unit 130 comprises a motor control module 130a and a multimedia voice control module 130b.
For example, when the user claps near the robot, the sound localization module 120b of the sensing unit 120 is enabled to detect the direction of the clapping user, and produces sensing information for the behavior control module 110. The behavior control module 110 can thereby drive the motor control module 130a to make the robot move toward the detected direction. Then, when the robot arrives at the position where the user clapped, the multimedia voice control module 130b can drive the robot to emit preset sounds to interact with the user. At this point, if the user strokes or shakes the robot, the touch sensing module 120a or the shake sensing module 120c transmits its sensing information to the behavior control module 110, which in turn drives the multimedia voice control module 130b to carry out further interaction with the user.
In addition, the behavior database 140 stores the behavior data corresponding to the behavior modes in each temporal module, as well as the behavior data corresponding to the robot's emotional responses, and these behavior data respectively correspond to a plurality of situation points in the virtual space. The behavior data stored in the behavior database 140 comprise image data, voice data, and motor behavior data. In the present embodiment, the image data can be a dot-matrix picture library for LED pixels. The voice data can be audio information for various situations, whose pitch or speaking speed can be adjusted by a multimedia chip to meet the needs of different situations. The motor behavior data can be information about various motor postures, actions, and motion control trajectories.
Specifically, in the present embodiment, the behavior database 140 stores the image data, voice data, and motor behavior data corresponding to the behavior modes in the child temporal module 114. Therefore, when the behavior control module 110 controls the element driver unit 130 to execute the corresponding performances of the various behavior modes, the robot can exhibit the corresponding behavior reactions according to the child temporal module behavior data 140a stored in the behavior database 140. In this way, the emotion engine system 100 can finely express the robot's behavior. Similarly, the behavior database 140 also stores the adult temporal module behavior data 140b corresponding to the behavior modes in the adult temporal module 118, so that the robot's behavior in the adult temporal module 118 is equally fine-grained.
It can thus be seen that the behavior control module 110 of this example can determine the behavior data corresponding to the first behavior pattern according to the time information and the initial sensing information, or determine the behavior data corresponding to the second behavior pattern according to the time information, the triggering sensing information, and the first behavior pattern.
Fig. 3 is a schematic diagram of the robot's emotional response operating in the virtual space. Referring to Fig. 2 and Fig. 3 together, the virtual space of the present embodiment can be produced by the behavior control module 110. It can be a three-dimensional vector space whose coordinate axes respectively represent the robot's pleasantness, arousal, and certainty, so as to express the emotion scalar values of the simulated living individual. However, the invention is not limited thereto; it can also be applied to a two-dimensional virtual space or a virtual space of other dimensions. Moreover, each coordinate axis may instead represent anger, sadness, randomness, intelligence, or other mental-state behaviors.
It should be noted that in the virtual space, the coordinate axes APC (Arousal, Pleasantness, Certainty) are psychological concepts; the different emotion points of the simulated living individual are located at different coordinates in the virtual space, and each emotion point corresponds to a different behavior pattern. Moreover, in the virtual space, the emotion point S represents the robot's current emotional state. For example, when the robot interacts with the user, the emotional response it produces affects the position of the emotion point. In addition, the behaviors set by the internal physiological clock of the emotion engine system 100, and the corresponding behaviors produced by the demands of external situations, all affect the position of the emotion point.
Here, the behavior data stored in the behavior database 140 respectively correspond to a plurality of situation points Q in the virtual space. In the present embodiment, the emotion point S can randomly produce an offset within an interval radius R, forming a random vibration interval that moves with it, so that the robot's emotional response is not too stiff. In addition, as time passes, the emotional simulation unit 113 in the behavior control module 110 can control the emotional response to form a motion trajectory T in the virtual space, so as to imitate the emotional responses of a living individual under different situations. Specifically, the emotional simulation unit 113 performs a quantized computation in response to the stimuli produced by changes in the external environment, and thereby produces an emotion vector V in the virtual space to displace the emotion point S, imitating the emotional changes of a living individual. Here, the stimuli produced by changes in the external environment are obtained by quantizing, for example, the time information, the initial or triggering sensing information, the first behavior pattern, and a random signal, but the invention is not limited thereto.
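The random vibration of the emotion point within the interval radius R could be sampled as a bounded random offset. This is only a sketch: the patent requires the offset to stay within R but does not specify a distribution, so the uniform-in-ball sampling below is an assumption.

```python
import math
import random

def random_offset(radius: float) -> tuple:
    """Sample a random 3-D offset whose length does not exceed `radius`,
    so the emotion point S jitters inside its vibration interval."""
    v = [random.gauss(0.0, 1.0) for _ in range(3)]  # random direction
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    r = radius * random.random() ** (1.0 / 3.0)     # uniform within the ball
    return tuple(r * x / n for x in v)
```

Adding such an offset to S at each step keeps the response lively without leaving the interval radius.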
In another embodiment, the virtual space produced by the behavior control module 110 can also have a convergence point P. The convergence point P can be located at the origin or at any coordinate point of the virtual space, and represents the robot's personality. For example, if the coordinates of the convergence point P have higher pleasantness and arousal, the robot's personality is more optimistic and rational. Conversely, if the coordinates of the convergence point P have lower pleasantness and higher certainty, the robot may be rather stiff and stubborn. Here, the convergence point P can instantly provide a virtual restoring force; in other words, the convergence point P can provide a restoring force according to the time information. When the robot receives no sensing signal, the emotion point S gradually moves toward the convergence point P, simulating the robot's emotional response returning to its original personality when undisturbed. Furthermore, when the robot is disturbed, the restoring force of the convergence point P also helps the behavior control module 110 determine a suitable behavior pattern, so that robots with different simulated personalities produce different emotional responses upon receiving similar events. In other words, the time span information from receiving the initial sensing information to receiving the triggering sensing information also affects the behavior data corresponding to the second behavior pattern, by way of the restoring force of the convergence point P.
In another embodiment, in order to meet the demand for diversified emotional personalities of the simulated life form, the behavior control module 110 can also control the convergence point P to change its coordinate position regularly or irregularly over time, so as to converge toward the characteristics of its personality. For example, according to the information of the time unit 150, the behavior control module 110 moves the convergence point P to a coordinate point with lower pleasantness, arousal, and certainty during a predetermined physiological period. Alternatively, the behavior control module 110 places the convergence point P at different coordinate positions during childhood and adulthood. In this way, the behavior control module 110 allows the robot to exhibit moods that approximate real life.
Furthermore, when the robot receives an external stimulus, the emotional simulation unit 113 produces an emotion vector V acting on the emotion point S according to the stimulus, and the convergence point P provides a restoring force F acting on the emotion point S, thereby moving the emotion point S to another coordinate point. Then, the emotional simulation unit 113 calculates which situation point Q is now closest to the emotion point S, so as to determine the behavior pattern to execute.
In the present embodiment, at each emotion point traversed during the emotion computation, the robot's current emotional expression, including the intensity and depth of the emotion, can be selected according to the minimum distance between the emotion point and the situation points. In the virtual space, a situation point mainly records an expression characteristic of a particular emotion, and this expression characteristic can be the behavior the robot performs when it is happy, excited, afraid, or sad.
For example, when the user claps near the robot, for the behavior control module 110 this is an external stimulus, or can be described as an interaction with the user. At this time, the robot's emotional response starts from the emotion point in the virtual space; through the computation of the emotional simulation unit 113, the emotion point forms a motion trajectory over time. During the emotion computation, the emotion point can select the robot's current emotional expression according to its minimum distance to the situation points. Afterwards, under the pull of the convergence point P and the restoring force F, the emotion recovers.
Of course, during this process the robot may also receive further stimuli and express specific emotional responses. For example, when the robot receives the stimulus of clapping for the first time, it performs a happy behavior, whereas during the emotion computation, when the robot receives the stimulus of clapping a second time, it may perform a helpless or puzzled behavior instead.
In the present embodiment, even if the robot receives no stimulus at all, the emotional simulation unit 113 still performs the emotion computation, so that the robot exhibits various emotional responses and does not appear too stiff.
In addition, in the present embodiment, the situation points in the virtual space can be increased by adding behavior data to the behavior database 140, thereby achieving differences in robot personality. Each behavior datum stored in the behavior database 140 is independent; through the index search unit 140c and the coordinate positions in the virtual space, each situation index relation is marked. Through the operation of the behavior database 140, the behavior data can be classified according to the various behavior performances and emotional responses, and the corresponding behavior data can be switched according to the demands of the temporal control module 111.
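The indexing of behavior data by situation-point coordinates, as performed by a component like the index search unit 140c, might be sketched as below. The class and method names are hypothetical; the patent only states that each datum is independent and indexed by its coordinates in the virtual space.

```python
# Hypothetical sketch of the behavior database: each independent behavior
# datum is indexed by the coordinates of its situation point, so adding a
# datum adds a situation point to the virtual space.

class BehaviorDatabase:
    def __init__(self):
        self._index = {}  # situation-point coordinates -> behavior datum

    def add(self, situation_point, behavior_datum):
        """Register a behavior datum at a situation-point coordinate."""
        self._index[situation_point] = behavior_datum

    def nearest(self, emotion_point):
        """Return the behavior datum whose situation point is closest
        to the given emotion point (squared Euclidean distance)."""
        def dist2(q):
            return sum((a - b) ** 2 for a, b in zip(q, emotion_point))
        return self._index[min(self._index, key=dist2)]
```

Adding entries to such an index is how, under these assumptions, new situation points would diversify the robot's personality.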
According to the content disclosed in the above embodiments, a robot control method is provided below. Fig. 4 is a flowchart of the robot control method according to an embodiment of the invention. In the present embodiment, the control method is suitable for a robot having emotional responses, and comprises the following steps.
First, in step S401, a first behavior pattern is provided by the behavior control module 110. Here, the behavior control module 110 can receive initial sensing information after power-on and provide a preset first behavior pattern. In the present embodiment, the initial sensing information can be a power-on signal, but the invention is not limited thereto.
Then, in step S403, when the sensing unit 120 is enabled, triggering sensing information is produced for the behavior control module 110. Here, the emotion engine system 100 can obtain environment information through the sensing unit 120 to discriminate the type of environment interaction behavior. That is, when the robot receives an external stimulus, the sensing unit 120 can be activated.
Afterwards, in step S405, the time unit 150 produces time information for the behavior control module 110, in order to switch the temporal module of the temporal control module 111 or otherwise provide the information required to determine the second behavior. Here, the information required to determine the second behavior can be the time span information from receiving the initial sensing information to receiving the triggering sensing information. In other embodiments, step S405 can also be performed before step S403 to provide the time information.
Then, in step S407, the behavior data corresponding to a second behavior pattern is determined according to the time information, the triggering sensing information, and the first behavior pattern. Here, this determination can be made through the emotion computation of the emotional simulation unit 113. That is, through the emotion computation of the emotional simulation unit 113, the emotion point determines the robot's current emotional expression.
Finally, in step S409, the element driver unit 130 is driven to execute the second behavior pattern. Afterwards, under the pull of the convergence point P and the restoring force F, the emotion point S moves toward the convergence point P.
In another embodiment, a random signal can further be produced, and the behavior data corresponding to the second behavior pattern is determined according to the time information, the triggering sensing information, the first behavior pattern, and the random signal.
Fig. 5 is a detailed flowchart of step S407 in Fig. 4. Referring to Fig. 5, in step S501, the time information and the triggering sensing information are received by the behavior control module 110. Then, in step S503, through the emotion computation of the emotional simulation unit 113, the corresponding emotion vector V and restoring force F are derived according to the type or strength of the first behavior pattern, the time information, and the triggering sensing information. Afterwards, in step S505, the emotion point S located at its old coordinates is displaced according to the random signal provided by the emotional simulation unit 113, and the new coordinates of the emotion point S are obtained, as shown in step S507. Then, in step S509, the emotional simulation unit 113 finds the situation point with the minimum distance to the new coordinates of the emotion point S and provides it to the temporal control module 111. In step S511, the temporal control module 111 provides the robot's current emotional expression according to the current temporal module and the situation point.
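Steps S501 through S511 can be sketched end to end as below. The vector arithmetic and the restoring-force form k·(P − S) are assumptions made for illustration, since the patent describes the quantities V, F, S, P, and Q but gives no explicit formulas.

```python
# Hypothetical end-to-end sketch of step S407 (Fig. 5): displace the emotion
# point S by the emotion vector V, a restoring force F toward the convergence
# point P, and a random jitter, then pick the nearest situation point Q.

def step_s407(s, v, p, jitter, situation_points, k=0.1):
    # S503: restoring force pulls the emotion point toward P (assumed linear)
    f = tuple(k * (pi - si) for pi, si in zip(p, s))
    # S505/S507: new coordinates of the emotion point
    new_s = tuple(si + vi + fi + ji
                  for si, vi, fi, ji in zip(s, v, f, jitter))
    # S509: situation point with minimum distance to the new coordinates
    nearest = min(situation_points,
                  key=lambda q: sum((qi - ni) ** 2 for qi, ni in zip(q, new_s)))
    # S511: the temporal control module would map `nearest` to an expression
    return new_s, nearest
```

A stronger clap (larger V) would land S nearer a "happy" situation point, while a large k would pull responses quickly back toward the personality encoded by P.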
In addition, it is worth mentioning that the first behavior pattern is not limited to the behavior data preset after power-on; the behavior data corresponding to the first behavior pattern can also be determined according to the time information or the initial sensing information.
In summary, the emotion engine and the emotion engine system provided by the embodiments of the invention can finely describe and express the personality of an electronic device, and have a modular design architecture that is not limited to the appearance or design of a specific electronic device. In addition, the electronic device control method provided by the embodiments of the invention uses instant emotion computation and the concept of the convergence point, and can finely describe and express the robot's personality.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the invention. Although the invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solutions of the invention can still be modified or equivalently substituted, and that such modifications or substitutions do not depart from the spirit and scope of the technical solutions of the invention.

Claims (22)

1. An emotion engine, suitable for an electronic device, characterized in that the emotion engine comprises:
a behavior control module, comprising an emotional simulation unit, configured to receive time information and triggering sensing information and to provide a first behavior pattern and a second behavior pattern, wherein the time information comprises time span information from receiving initial sensing information to receiving the triggering sensing information, the triggering sensing information is produced according to an interaction state between the electronic device and a user, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the triggering sensing information, and the first behavior pattern,
wherein the first behavior pattern and the second behavior pattern respectively correspond to at least one behavior datum among a plurality of behavior data,
wherein the emotional simulation unit produces an emotion point and a plurality of situation points in a virtual space, and each of the situation points has a corresponding coordinate and a corresponding behavior pattern,
wherein the emotional simulation unit obtains a corresponding emotion vector according to the time information, the triggering sensing information, and the first behavior pattern, corrects the coordinate of the emotion point in the virtual space according to the emotion vector, obtains the situation point with the shortest distance to the corrected emotion point, and determines the behavior data corresponding to the second behavior pattern according to the obtained situation point.
2. The emotion engine according to claim 1, characterized in that the emotional simulation unit further produces a random signal, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the triggering sensing information, the first behavior pattern, and the random signal.
3. The emotion engine according to claim 1, characterized in that the behavior control module further determines the behavior data corresponding to the first behavior pattern according to the initial sensing information.
4. The emotion engine according to claim 3, characterized in that the time information comprises system age information of the electronic device.
5. The emotion engine according to claim 3, characterized in that the initial sensing information is a power-on signal, and the behavior control module, after receiving the initial sensing information, provides the first behavior pattern and drives an element driver unit to execute the first behavior pattern.
6. The emotion engine according to claim 1, characterized in that the behavior control module, after determining the behavior data corresponding to the second behavior pattern, further drives an element driver unit to execute the second behavior pattern.
7. An emotion engine system, applicable to an electronic device, characterized in that the emotion engine system comprises:
a behavior control module, comprising an emotional simulation unit, configured to provide a first behavior pattern and a second behavior pattern;
a sensing unit, connected to the behavior control module, wherein when the sensing unit is enabled, it produces triggering sensing information or initial sensing information for the behavior control module, and the triggering sensing information is produced according to an interaction state between the electronic device and a user;
a time unit, connected to the behavior control module and producing time information for the behavior control module, wherein the time information comprises time span information from receiving the initial sensing information to receiving the triggering sensing information; and
a behavior database, connected to the behavior control module and storing a plurality of behavior data,
wherein the first behavior pattern and the second behavior pattern respectively correspond to at least one behavior datum among the plurality of behavior data, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the triggering sensing information, and the first behavior pattern,
wherein the emotional simulation unit produces an emotion point and a plurality of situation points in a virtual space, and each of the situation points has a corresponding coordinate and a corresponding behavior pattern,
wherein the emotional simulation unit obtains a corresponding emotion vector according to the time information, the triggering sensing information, and the first behavior pattern, corrects the coordinate of the emotion point in the virtual space according to the emotion vector, obtains the situation point with the shortest distance to the corrected emotion point, and determines the behavior data corresponding to the second behavior pattern according to the obtained situation point.
8. The emotion engine system according to claim 7, characterized in that the emotional simulation unit further produces a random signal, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the triggering sensing information, the first behavior pattern, and the random signal.
9. The emotion engine system according to claim 7, characterized in that the behavior control module determines the behavior data corresponding to the first behavior pattern according to at least one of the time information and the initial sensing information.
10. The emotion engine system according to claim 7, characterized in that it further comprises an element driver unit, wherein the initial sensing information is a power-on signal, and the behavior control module, after receiving the initial sensing information, provides the first behavior pattern and drives the element driver unit to execute the first behavior pattern.
11. The emotion engine system according to claim 10, characterized in that the behavior control module, after determining the behavior data corresponding to the second behavior pattern, drives the element driver unit to execute the second behavior pattern.
12. A control method of an electronic device, adapted to an electronic device having an emotion engine, characterized in that the control method comprises:
providing a first behavior mode;
generating a triggering sensing information when a sensing unit of the electronic device is enabled, wherein the triggering sensing information is generated according to an interaction state between the electronic device and a user;
generating a time information, wherein the time information comprises the length of time between receiving an initial sensing information and receiving the triggering sensing information; and
determining behavior data corresponding to a second behavior mode according to the time information, the triggering sensing information, and the first behavior mode,
wherein the step of determining the behavior data corresponding to the second behavior mode comprises:
generating an emotion point and a plurality of scenario event points in a virtual space;
obtaining a corresponding emotion vector according to the first behavior mode, the time information, and the triggering sensing information;
correcting the coordinates of the emotion point in the virtual space according to the emotion vector;
obtaining the scenario event point at the shortest distance from the corrected emotion point; and
determining the behavior data corresponding to the second behavior mode according to the obtained scenario event point.
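The decision step recited in claim 12 can be read as a nearest-neighbor lookup in the emotion space: shift the emotion point by the emotion vector, then take the scenario event point closest to the corrected point. A minimal Python sketch under that reading (the coordinates and behavior labels are illustrative assumptions, not from the patent):

```python
import math

def choose_scenario(emotion_point, emotion_vector, scenario_points):
    """Correct the emotion point by the emotion vector, then return
    the corrected point and the behavior of the nearest scenario
    event point (steps of claim 12, illustrative only)."""
    corrected = [p + v for p, v in zip(emotion_point, emotion_vector)]
    # Shortest Euclidean distance selects the scenario event point.
    nearest = min(scenario_points, key=lambda s: math.dist(corrected, s["coord"]))
    return corrected, nearest["behavior"]

scenarios = [
    {"coord": (0.9, 0.8), "behavior": "excited_greeting"},
    {"coord": (-0.7, -0.5), "behavior": "sulk"},
]
corrected, behavior = choose_scenario((0.0, 0.0), (0.8, 0.7), scenarios)
# The corrected point (0.8, 0.7) lies nearest (0.9, 0.8), so the
# "excited_greeting" behavior data would be selected.
```

The scenario event points act as prototypes in the emotion space, so the behavior database only needs one behavior entry per prototype rather than one per possible emotion state.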
13. The control method according to claim 12, characterized in that the method further comprises generating a random signal, and the behavior data corresponding to the second behavior mode is determined according to the time information, the triggering sensing information, the first behavior mode, and the random signal.
14. The control method according to claim 12, characterized in that the method further comprises determining the behavior data corresponding to the first behavior mode according to the initial sensing information.
15. The control method according to claim 14, characterized in that the time information comprises system age information of the electronic device.
16. The control method according to claim 15, characterized in that the method further comprises analyzing the type of the triggering sensing information.
17. The control method according to claim 12, characterized in that providing the first behavior mode comprises:
providing the first behavior mode after receiving a power-on signal; and
executing the first behavior mode.
18. The control method according to claim 12, characterized in that after the behavior data corresponding to the second behavior mode is determined, a component driving unit of the electronic device is driven to execute the second behavior mode.
19. The control method according to claim 12, characterized in that the method further comprises providing a convergence point, and in the step of correcting the coordinates of the emotion point in the virtual space according to the emotion vector, a restoring force is provided so that the emotion point moves toward the convergence point.
20. The control method according to claim 19, characterized in that the coordinates of the convergence point change with the time information.
21. The control method according to claim 12, characterized in that the virtual space is a multi-dimensional space having a plurality of coordinate axes, the control method obtains the scenario event point corresponding to a new coordinate by a vector operation, and the coordinate axes of the virtual space respectively represent different mental-state behaviors of the electronic device.
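Claims 19 through 21 describe a restoring force that pulls the emotion point toward a convergence point whose position varies with time information such as the device's system age, over coordinate axes that each represent a mental state. A minimal sketch of one possible reading (the step factor, the 365-day scale, and the pleasure/arousal axis interpretation are illustrative assumptions, not from the patent):

```python
def step_emotion(emotion, convergence, k=0.1):
    """Pull the emotion point a fraction k toward the convergence
    point each tick, modeling the restoring force of claim 19.
    Each coordinate axis is one mental state, e.g. pleasure/arousal."""
    return tuple(e + k * (c - e) for e, c in zip(emotion, convergence))

def convergence_for_age(system_age_days):
    # Hypothetical claim-20 reading: an "older" device settles
    # toward a calmer baseline on the second (arousal) axis.
    calm = max(0.0, 1.0 - system_age_days / 365.0)
    return (0.0, calm)

# Without new stimuli, a highly excited emotion point decays
# toward the age-dependent resting state.
point = (1.0, 1.0)
for _ in range(3):
    point = step_emotion(point, convergence_for_age(30))
```

This gives the robot a mood that drifts back to a personality-defining baseline between interactions, and shifting the convergence point with system age lets that personality mature over time.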
22. The control method according to claim 12, characterized in that the step of determining the behavior data corresponding to the second behavior mode according to the obtained scenario event point comprises providing the behavior data corresponding to the second behavior mode according to system age information and the obtained scenario event point.
CN200910258052.8A 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device Active CN102103707B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910258052.8A CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910258052.8A CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Publications (2)

Publication Number Publication Date
CN102103707A CN102103707A (en) 2011-06-22
CN102103707B true CN102103707B (en) 2014-06-11

Family

ID=44156456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910258052.8A Active CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Country Status (1)

Country Link
CN (1) CN102103707B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI577194B (en) * 2015-10-22 2017-04-01 山衛科技股份有限公司 Environmental voice source recognition system and environmental voice source recognizing method thereof

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881108B (en) * 2014-02-27 2018-08-31 青岛海尔机器人有限公司 A kind of intelligent human-machine interaction method and device
CN104516873A (en) * 2014-12-12 2015-04-15 北京智谷睿拓技术服务有限公司 Method and device for building emotion model
CN105389735B (en) * 2015-11-18 2021-05-18 重庆理工大学 Multi-motivation emotion generation method based on SPFA algorithm
WO2018000260A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000266A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
WO2018000267A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000261A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
WO2018006372A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for controlling household appliance on basis of intent recognition, and robot
CN106471444A (en) * 2016-07-07 2017-03-01 深圳狗尾草智能科技有限公司 A kind of exchange method of virtual 3D robot, system and robot
CN106462124A (en) * 2016-07-07 2017-02-22 深圳狗尾草智能科技有限公司 Method, system and robot for identifying and controlling household appliances based on intention
WO2018006380A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Human-machine interaction system, device, and method for robot
CN106113062B (en) * 2016-08-23 2019-01-04 深圳慧昱教育科技有限公司 One kind is accompanied and attended to robot
US11501135B2 (en) 2018-05-29 2022-11-15 British Cayman Islands Intelligo Technology Inc. Smart engine with dynamic profiles
CN110871813A (en) * 2018-08-31 2020-03-10 比亚迪股份有限公司 Control method and device of virtual robot, vehicle, equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1553845A (en) * 2001-11-07 2004-12-08 索尼公司 Robot system and robot apparatus control method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3140944B2 (en) * 1995-06-20 2001-03-05 松下電器産業株式会社 Kansei input device and data search device
JPH11212934A (en) * 1998-01-23 1999-08-06 Sony Corp Information processing device and method and information supply medium
CN1161700C (en) * 1999-04-30 2004-08-11 索尼公司 Electronic pet system, network system, robot and storage medium
JP2003089077A (en) * 2001-09-12 2003-03-25 Toshiba Corp Robot

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1553845A (en) * 2001-11-07 2004-12-08 索尼公司 Robot system and robot apparatus control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特开2003-89077A 2003.03.25


Also Published As

Publication number Publication date
CN102103707A (en) 2011-06-22

Similar Documents

Publication Publication Date Title
CN102103707B (en) Emotion engine, emotion engine system and control method of electronic device
US8209179B2 (en) Speech communication system and method, and robot apparatus
US8145492B2 (en) Robot behavior control system and method, and robot apparatus
US11568265B2 (en) Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion
US8306929B2 (en) Emotion engine, emotion engine system and electronic device control method
EP1508409A1 (en) Robot device and robot control method
KR101137205B1 (en) Robot behavior control system, behavior control method, and robot device
CN100423911C (en) Robot device and behavior control method for robot device
US20030130851A1 (en) Legged robot, legged robot behavior control method, and storage medium
CN110139732A (en) Social robot with environmental Kuznets Curves feature
US10576618B2 (en) Robot having communication with human, robot control method, and non-transitory recording medium
EP1569129B1 (en) Dialogue control device and method, and robot device
JP2005193331A (en) Robot device and its emotional expression method
US11074491B2 (en) Emotionally intelligent companion device
CN110196632A (en) Information processing unit, information processing method and program
JP2006110707A (en) Robot device
US20240027977A1 (en) Method and system for processing input values
KR20020067696A (en) Robot apparatus, information display system, and information display method
JP4552465B2 (en) Information processing apparatus, action control method for robot apparatus, robot apparatus, and computer program
JP2001157980A (en) Robot device, and control method thereof
WO2023037608A1 (en) Autonomous mobile body, information processing method, and program
JP2002192485A (en) Robot device, information display system and method, robot system, and recording medium
McNulty et al. Residual memory for background characters in complex environments
Ahmed Investigating human altruism towards robots with a novel and reconfigurable interactive social robot
JP2003305674A (en) Robot device, robot control method, recording medium and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant