WO2006120637A2 - Method of configuring a rendered behavior of an ambient device - Google Patents

Method of configuring a rendered behavior of an ambient device

Info

Publication number
WO2006120637A2
WO2006120637A2 (PCT/IB2006/051443)
Authority
WO
WIPO (PCT)
Prior art keywords
behavior
input
user
robot
ambient
Application number
PCT/IB2006/051443
Other languages
French (fr)
Other versions
WO2006120637A3 (en)
Inventor
Albertus Josephus Nicolaas Breemen
Original Assignee
Koninklijke Philips Electronics N. V.
Application filed by Koninklijke Philips Electronics N. V. filed Critical Koninklijke Philips Electronics N. V.
Publication of WO2006120637A2 publication Critical patent/WO2006120637A2/en
Publication of WO2006120637A3 publication Critical patent/WO2006120637A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/004 Artificial life, i.e. computing arrangements simulating life


Abstract

This invention relates to a method of configuring a rendered behavior of an ambient device. An input is received from a user indicating how the behavior of said ambient device should be. This input is then processed and based thereon behavior factors associated to the input are extracted from a memory. These new behavior factors are then used for rendering the behavior of the device.

Description

METHOD OF CONFIGURING A RENDERED BEHAVIOR OF AN AMBIENT DEVICE
FIELD OF THE INVENTION
The present invention relates to a method of configuring a rendered behavior of an ambient device. The present invention further relates to an ambient device whose rendered behavior can be configured.
BACKGROUND OF THE INVENTION
Ambient devices with a mechanically renderable user interface, such as robots and home dialogue systems, provide the user with a natural and social interface. One important design aspect which influences the effectiveness of the tasks for which the device is intended is the behavior of the device. One can say that a behavior follows from the following factors:
- the physical appearance of the device,
- the decision logic of the device,
- the animated/rendered behavior of the device, including the animated mechanical movement as well as the voice, sound and light characteristics.
EP 1 091 540 discloses a communication terminal having exchangeable parts, where the user is able to replace e.g. one front cover with another front cover having another color or design pattern. In that way, the user is able to personalize his/her phone so as to give it a distinctive outer appearance. This reference therefore only discloses how to change the physical appearance of the phone without changing the functionality or the "behavior" of the phone.
OBJECT AND SUMMARY OF THE INVENTION
It is an object of the present invention to improve the diverseness of such ambient devices by enabling a change of the behavior of such devices.
According to one aspect, the present invention relates to a method of configuring a rendered behavior of an ambient device, the method comprising the steps of:
- receiving input from a user indicating how the behavior of said ambient device should be,
- processing said input and based thereon extracting from a memory new behavior factors associated to said input, and
- rendering the behavior of said device based on said new behavior factors.
In that way, the behavior of the device can be changed and adapted to different users or different situations. In cases where the ambient device is a personal robot, the behavior of the robot, e.g. both the animations and the speech, can be changed. In cases where the ambient device is a MP3 player, the change of behavior of the device can comprise a different music selection of the MP3 player.
In an embodiment, said input from the user comprises attaching physical objects to the ambient device. In that way, the user can e.g. by adding glasses to a robotic face make the robot smart, by adding a baseball cap to the robotic head make the robot a sports lover, by dressing the robot in a blue t-shirt make the robot a quiet person, while a red t-shirt makes the robot temperamental, etc.
In an embodiment, said input from the user comprises receiving said input from the user via a user interface. In that way, the user can via said interface select the behavior or the character of the ambient device in a precise way. The interface may be a remote control or an interface comprised in the ambient device. In an embodiment, said behavior factors comprise personality trait parameters. The personality trait can e.g. comprise openness, conscientiousness, extraversion, agreeableness, neuroticism, smart, friendly, dumb, wild boy, sportive, quiet, loser, etc.
Changing the behavior parameters may also change how the dialogue between said user and said device is rendered. As an example, if a user says "hello" and the behavior parameter comprises "gentleman", the ambient device might be adapted to respond by saying "good morning". On the other hand, where the behavior parameter comprises "wise guy", the ambient device might be adapted to respond by saying "hey dude". In an embodiment, said behavior factors comprise animation parameters for rendering the physical behavior or physical appearance of said device. In an embodiment, said animation parameters are either pre-scripted in a table of set points or generated according to parameterized equations. In that way, the behavior can be rendered further by e.g. including physical motion in said behavior parameter. The cooperation between e.g. the personality parameters and said animation parameters can be such that the physical response differs between personality parameters. For instance, said personality parameter "gentleman" can be adapted to respond differently to a greeting than the personality parameter "wise guy". Also, an example of rendering the physical appearance of said device is where the current animation parameters are associated with the physical appearance of a dog, and the new animation parameters are associated with the physical appearance of a cat. In that way, the robot, which initially is a dog-shaped robot, is transformed into a cat-like robot based on the new animation parameters.
In a further aspect, the present invention relates to a computer readable medium having stored therein instructions for causing a processing unit to execute said method.
According to another aspect, the present invention relates to an ambient device whose rendered behavior can be configured, comprising: at least one sensor for receiving input from a user indicating how the behavior of said ambient device (100) should be, a processor for processing said input and based thereon extracting from memory behavior factors associated to said input, and - at least one actuator for rendering the behavior of said device based on said new behavior factors.
In an embodiment, said at least one sensor comprises at least one sensor from a group consisting of:
- a touch sensor,
- a light sensor,
- a temperature sensor,
- a camera,
- a microphone, and
- a speech recognizer.
In that way, the ambient device can sense various user inputs, such as oral input from the user via the speech recognizer, a visible input from the user via the camera, e.g. showing the ambient device a chess board indicates that the ambient device should be able to play chess, or by detecting via the touch sensor when a physical object is attached to the device.
In an embodiment, said at least one actuator comprises at least one actuator from a group consisting of:
- a display,
- a light emitting diode (LED),
- a motor,
- a servomotor,
- a light source, and
- a loudspeaker.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following, preferred embodiments of the invention will be described referring to the figures, where
Figure 1 shows an ambient device according to the present invention,
Figure 2 is a flow diagram illustrating one embodiment of a method according to the present invention of how to change the behavior of an ambient device, and
Figures 3-5 illustrate examples of how to change the behavior of an ambient device which is a robot.
DESCRIPTION OF PREFERRED EMBODIMENTS
Figure 1 shows an ambient device 100 according to the present invention, which e.g. can be a personal robot or any kind of consumer electronic device such as a smart audio player, comprising a sensor (Se) 101, a processor (Pr) 102, an actuator (Ac) 103, which in this embodiment comprises a dialogue engine (D_E) 105 and/or an animation engine (A_E) 106, and a memory 104.
The sensor (Se) 101 is adapted to receive input 110 from a user 107 indicating how the behavior of said ambient device 100 should be, and based thereon an output signal is created identifying the indicated behavior. Such input can e.g. be received from the user by attaching a physical object to the device, e.g. a baseball cap, glasses, a wig, etc., or it can be received from a user 107 via a user interface, either remotely or via a user interface comprised on/in the ambient device 100. Further, it can be received as oral instructions from the user 107. Examples of sensors (Se) 101 to be used for detecting the user input are a touch sensor, a light sensor, a speech recognizer, a temperature sensor, a camera, a microphone and the like.
The output signal identifying the indicated behavior is then processed by the processor (Pr) 102, which based thereon extracts data stored in said memory 104 containing pre-stored behavior factors. These behavior factors are then used for replacing the current or previous behavior factors, and thereby for changing the behavior of said ambient device 100. Such behavior factors could e.g. comprise personality trait parameters such as openness, conscientiousness, extraversion, agreeableness, neuroticism, smart, friendly, dumb, wild boy, sportive, likes to talk, quiet, heroic, loser, helpful, naïve, romantic, old, wise guy, gentleman, man, woman, Barbie girl, etc. The behavior factors can also comprise animation parameters for rendering the physical behavior or physical appearance of said ambient device 100 towards the user 107.
The behavior of said ambient device is demonstrated towards the user 107 via the actuator (Ac) 103 comprising a dialogue engine (D_E) 105 and an animation engine (A_E) 106. The actuator may e.g. comprise multi-color LEDs, sound, speech, RC servos and motors, wherein the animation engine (A_E) 106 preferably comprises software/hardware components for playing animations. The animations are in one embodiment either pre-scripted in a table of set points or generated according to parameterized equations. As an example, if the actuator is a multi-color LED, the set points can be the values of the individual color channels of the LED. The term set point means in this context a particular value of some variable, e.g. a servo set point, an LED intensity value set point, a temperature set point, a water level set point, a sound intensity level set point, etc. An example of such a parameterized equation, assuming the ambient device is a robot, is where the following equation is used for describing the movement of the robot: head_position(time) = constant * time. Now, if the robot is sad, the constant will be a low value (and thus the robot moves slowly). If the robot is happy, the constant is high (and thus the robot moves faster). Another example could be where the position of a body part is modeled by a generic second-order dynamic system: ddx = constant1 * x + constant2 * dx + constant3. Depending on the personality, the three constants are set. This will result in a different device behavior. The animation engine (A_E) 106 can even be adapted to exchange some mechanical parts of the device 100 for other parts of different colors, shapes or materials. In that way, the user can e.g. transform a dog-like robot into a cat-like robot. The dialogue engine (D_E) 105 preferably comprises software/hardware components for rendering acoustic animations. It follows that through a rendered behavior and/or rendered language use, the behavior of the ambient device 100 towards a user 107 can be changed.
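The parameterized equations above can be sketched as follows. The function names, example constants and the Euler integration step are illustrative assumptions; the patent only gives the equations themselves.

```python
# Minimal sketch of the parameterized animation equations described above.
# SAD_CONSTANT / HAPPY_CONSTANT are invented example values, not from the patent.

def head_position(t: float, constant: float) -> float:
    """head_position(time) = constant * time: a sad robot uses a low
    constant (slow movement), a happy robot a high one (fast movement)."""
    return constant * t

def step_second_order(x: float, dx: float,
                      c1: float, c2: float, c3: float,
                      dt: float = 0.1) -> tuple:
    """One Euler step of the generic second-order model
    ddx = constant1*x + constant2*dx + constant3, where the three
    constants are set per personality."""
    ddx = c1 * x + c2 * dx + c3
    dx = dx + ddx * dt
    x = x + dx * dt
    return x, dx

SAD_CONSTANT, HAPPY_CONSTANT = 0.2, 1.5
print(head_position(2.0, SAD_CONSTANT))    # slow head movement: 0.4
print(head_position(2.0, HAPPY_CONSTANT))  # fast head movement: 3.0
```

Swapping the personality only swaps the constants, so the same animation code yields visibly different movement styles.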
As shown in Fig. 1, the ambient device 100 may also be adapted to access an external memory 108 over a wireless communication channel 109, such as the Internet, for extracting said behavior parameters, decision rules and animations which might not be pre-stored in said memory 104.
The behavior parameters can influence the dialogue processes by modulating the dialogue decision rules. This is illustrated in the following example:
Example:
IF user says "hello" AND personality trait parameter=gentleman
THEN say "good morning"
IF user says "hello" AND personality trait parameter=wise guy
THEN say "hey dude"
The influence can further comprise adjusting the threshold levels for making decisions, so that an ambient device 100 that is normally patient (high threshold, e.g. gentleman) becomes impatient (low threshold, e.g. wise guy).
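The dialogue rules and decision thresholds above can be sketched as a simple lookup. The rule table, threshold values and function names are assumptions for illustration, not the patent's implementation.

```python
# Sketch of personality-modulated dialogue decision rules and thresholds.
# The concrete values are invented; only the rule structure follows the text.

DIALOGUE_RULES = {
    ("hello", "gentleman"): "good morning",
    ("hello", "wise guy"): "hey dude",
}

# High threshold = patient device, low threshold = impatient device.
PATIENCE_THRESHOLD = {"gentleman": 0.9, "wise guy": 0.2}

def respond(utterance: str, trait: str) -> str:
    # fall back to echoing a neutral greeting if no rule matches
    return DIALOGUE_RULES.get((utterance, trait), "hello")

def is_impatient(waiting_time: float, trait: str) -> bool:
    # the device acts impatient once the wait exceeds its trait's threshold
    return waiting_time > PATIENCE_THRESHOLD[trait]

print(respond("hello", "gentleman"))  # good morning
print(is_impatient(0.5, "wise guy"))  # True
```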
Similarly, the personality trait parameter influences the animation engine by replacing a particular animation with others belonging to a given personality trait parameter, i.e. each personality may have different animations for greeting a person. If e.g. the personality trait parameter=wise guy, the greeting can comprise a "give me five" handshake, whereas the greeting for personality trait parameter=gentleman can comprise shaking hands. The personality trait parameter may further influence the procedural animations by setting e.g. the speed of the animations, their smoothness, etc.
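The trait-dependent animation selection just described can be sketched as a mapping from an abstract action to a clip plus style parameters. All names and numbers here are hypothetical illustrations.

```python
# Sketch: each personality trait maps the same abstract action ("greet")
# to a different animation clip and different procedural style parameters.

GREETING_ANIMATIONS = {
    "wise guy": "give_me_five_handshake",
    "gentleman": "formal_handshake",
}

# Trait also tunes procedural animation speed and smoothness (invented values).
ANIMATION_STYLE = {
    "wise guy": {"speed": 1.4, "smoothness": 0.3},
    "gentleman": {"speed": 0.8, "smoothness": 0.9},
}

def select_animation(action: str, trait: str):
    """Return the clip name and style parameters for this trait."""
    clip = GREETING_ANIMATIONS[trait] if action == "greet" else action
    return clip, ANIMATION_STYLE[trait]

clip, style = select_animation("greet", "wise guy")
print(clip)  # give_me_five_handshake
```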
Figure 2 is a flow diagram illustrating one embodiment of a method according to the present invention of how to change the behavior of an ambient device, where the device receives an input (R_I) 201 from a user, indicating how the behavior of said ambient device 100 should be. This input can comprise, assuming the ambient device is a personal robot, putting a baseball cap onto the personal robot's head. Such input is then sensed (S_I) 202 by a sensor, which in this particular case could comprise a touch sensor, resulting in an output signal indicating that a baseball cap has been put on the robot's head. This output signal is received and processed by a processor (P_S) 203, which extracts pre-stored behavior factors associated to such an output signal. These new behavior factors then replace the current or previous behavior factors and are used for operating the robot, i.e. for rendering the behavior of the robot. In this case, the output signal indicates that the user wants the personal robot to be sportive, which renders both the animation (A_M) 204 of the robot and the dialogue (D_L) 205. It should, however, be noted that the animation (A_M) 204 and the dialogue (D_L) 205 can be used separately instead of simultaneously.
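The Fig. 2 flow (sense input, extract behavior factors, drive animation and dialogue) can be sketched as below. The memory contents and the string encoding of sensed objects are assumptions made for illustration.

```python
# Minimal sketch of the Fig. 2 pipeline: S_I (sense) -> P_S (extract
# pre-stored behavior factors from memory) -> A_M / D_L (render).
# BEHAVIOR_MEMORY is an invented stand-in for the device's memory 104.

BEHAVIOR_MEMORY = {
    "baseball_cap": {"trait": "sportive", "animation": "cheer",
                     "dialogue": "baseball"},
    "glasses": {"trait": "smart", "animation": "nod_thoughtfully",
                "dialogue": "study"},
}

def configure_behavior(sensed_object: str) -> dict:
    factors = BEHAVIOR_MEMORY[sensed_object]   # P_S: look up behavior factors
    return {
        "trait": factors["trait"],
        "animation": factors["animation"],     # A_M: animation rendering
        "dialogue": factors["dialogue"],       # D_L: dialogue rendering
    }

print(configure_behavior("baseball_cap")["trait"])  # sportive
```

As in the text, animation and dialogue are returned together here but could just as well be applied separately.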
In an embodiment, the input from the user comprises using a "personality memory card" comprising said behavior factors, which can be put in a hardware slot in the ambient device. These behavior factors could be from famous people/pop stars and contain famous quotes and animated "moves"/"dance moves", etc.
The result of replacing the current or previous behavior factors with said new factors is an adjustment of the behavior of the robot (A_P) 206; the processor now operates based on the new behavior factors.
In an embodiment, the adjustment of the behavior causes an adjustment of the access levels to the memory and thereby an adjustment of the ambient device's knowledge. In that way, a robot whose personality trait parameter=dumb has limited access to information in the memory compared to a robot whose personality trait parameter=clever. In that way, the intelligence of the robot can be adapted to the personality trait parameters. Example:
A user who owns a personal robot wants to watch a baseball game. The user is interested in adjusting the behavior of the robot into a baseball fan and baseball expert. As mentioned previously, this is done by putting a baseball cap onto the robot's head, which results in said adjusted behavior. This might be of particular advantage for the user who is going to watch a baseball game, since not only can the robot in that way be interesting company for the user, but based on the new behavior the robot can also possess a wealth of information relating to baseball or any other kind of sport. In that way, the user can be educated about e.g. each individual player in the baseball game. Example:
Another example of the user's intention to change the robotic behavior is where the user wants to play chess with the robot. This is indicated to the robot by e.g. holding a chess board in front of it. Here, the used sensor would preferably be a camera, which based on the black and white chess board pattern interprets the output data such that the user's intention is to play chess. In that way, the robot e.g. downloads data relating to chess from an external source. Figures 3-5 illustrate the above-mentioned examples, wherein in Fig. 3 the personal robot 203 has a "standard" behavior most suitable for everyday life, operated based on "standard" behavior factors. In Fig. 4 the user wants to change the behavior of said robot 203 into a smart robot by putting glasses 401 on the robot. When the sensor 101 senses e.g. a change of focus, it interprets this as the user wanting to change the behavior of the robot into a smart robot. This could e.g. be the case where the user is going to study and thus needs assistance from the robot. The intelligence level of the robot could even be adapted to the optical strength of the glasses, i.e. the higher the optical strength, the more clever the robot will be. In that way, the robot's intelligence can be adapted to the various users. Figure 5 illustrates the previous example where a baseball cap is put on the robot 203 for changing the behavior of the robot into a baseball fan.
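The trait-dependent memory access levels mentioned earlier (dumb versus clever) could be sketched as a simple gate on the device's knowledge. The level numbers and topic list here are invented for illustration.

```python
# Sketch of trait-dependent memory access: a "dumb" trait grants a lower
# access level than a "clever" trait, limiting the retrievable knowledge.

ACCESS_LEVEL = {"dumb": 1, "clever": 3}  # invented example levels

KNOWLEDGE = [  # (minimum access level, topic) -- hypothetical contents
    (1, "greetings"),
    (2, "sports trivia"),
    (3, "chess openings"),
]

def accessible_topics(trait: str):
    """Return the memory topics this personality trait may access."""
    level = ACCESS_LEVEL[trait]
    return [topic for min_level, topic in KNOWLEDGE if min_level <= level]

print(accessible_topics("dumb"))    # ['greetings']
print(accessible_topics("clever"))  # all three topics
```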
Another example of an ambient device 100 is an MP3 player that changes its music selection based on the color of the surroundings, which is sensed by a light sensor.

It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word 'comprising' does not exclude the presence of other elements or steps than those listed in a claim. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
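The MP3-player embodiment can likewise be sketched as a small decision rule from a sensed color to a music selection. The color thresholds and playlist names are invented for illustration; the description only states that the selection depends on the sensed surroundings.

```python
# A minimal sketch of the MP3-player example: the dominant color of the
# surroundings, read from a light sensor as an (r, g, b) triple, selects
# a playlist. The color-to-mood mapping is an assumption for this sketch.

def pick_playlist(rgb):
    """Choose a playlist name from the dominant color channel."""
    r, g, b = rgb
    if r > g and r > b:
        return "energetic"   # warm, reddish ambient light
    if b > r and b > g:
        return "calm"        # cool, bluish ambient light
    return "neutral"         # no clearly dominant warm or cool tone
```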

Claims

CLAIMS:
1. A method of configuring a rendered behavior of an ambient device (100, 203), the method comprising the steps of:
- receiving input (110) from a user (107) indicating how the behavior of said ambient device (100) should be,
- processing said input (110) and based thereon extracting from a memory (104, 108) behavior factors associated with said input (110), and
- rendering the behavior of said device (100, 203) based on said behavior factors.
2. A method according to claim 1, wherein said input (110) from the user (107) comprises attaching physical objects to the ambient device (100, 203).
3. A method according to claim 1, wherein said input (110) is received from the user (107) via a user interface.
4. A method according to claim 1, wherein said behavior factors comprise personality trait parameters.
5. A method according to claim 1 or 4, wherein said behavior factors comprise animation parameters for rendering the physical behavior or physical appearance of said device (100, 203).
6. A method according to claim 5, wherein said animation parameters are either pre-scripted in a table with set points or defined according to parameterized equations.
7. A computer readable medium having stored therein instructions for causing a processing unit to execute a method according to any one of claims 1-6.
8. An ambient device (100, 203) whose rendered behavior can be configured, comprising:
- at least one sensor (101) for receiving input (110) from a user (107) indicating how the behavior of said ambient device (100) should be,
- a processor (102) for processing said input (110) and based thereon extracting from a memory (104, 108) behavior factors associated with said input (110), and
- at least one actuator (103, 105, 106) for rendering the behavior of said device (100, 203) based on said behavior factors.
9. A device according to claim 8, wherein said at least one sensor (101) comprises at least one sensor from a group consisting of:
- a touch sensor,
- a light sensor,
- a temperature sensor,
- a camera, and
- a microphone.
10. A device according to claim 8, wherein said at least one actuator (103, 105, 106) comprises at least one actuator from a group consisting of:
- a display,
- a light emitting diode (LED),
- a motor,
- a servomotor,
- a light source, and
- a loudspeaker.
PCT/IB2006/051443 2005-05-10 2006-05-09 Method of configuring a rendered behavior of an ambient device WO2006120637A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP05103852.9 2005-05-10
EP05103852 2005-05-10

Publications (2)

Publication Number Publication Date
WO2006120637A2 true WO2006120637A2 (en) 2006-11-16
WO2006120637A3 WO2006120637A3 (en) 2007-02-08

Family

ID=36928306

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2006/051443 WO2006120637A2 (en) 2005-05-10 2006-05-09 Method of configuring a rendered behavior of an ambient device

Country Status (2)

Country Link
TW (1) TW200709014A (en)
WO (1) WO2006120637A2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2182634A (en) * 1985-11-05 1987-05-20 Sirius Spa Programmable robot
US20030179887A1 (en) * 2002-03-19 2003-09-25 Thomas Cronin Automatic adjustments of audio alert characteristics of an alert device using ambient noise levels

Also Published As

Publication number Publication date
WO2006120637A3 (en) 2007-02-08
TW200709014A (en) 2007-03-01

Similar Documents

Publication Publication Date Title
US11618170B2 (en) Control of social robot based on prior character portrayal
EP3381175B1 (en) Apparatus and method for operating personal agent
Camurri et al. An architecture for emotional agents
US8135128B2 (en) Animatronic creatures that act as intermediaries between human users and a telephone system
CN108363492A (en) A kind of man-machine interaction method and interactive robot
US20100115449A1 (en) System and method for controlling animation by tagging objects within a game environment
JPH10289006A (en) Method for controlling object to be controlled using artificial emotion
AU2016250773A1 (en) Context-aware digital play
JP2008279165A (en) Toy system and computer program
JP2013099823A (en) Robot device, robot control method, robot control program and robot system
US20020019678A1 (en) Pseudo-emotion sound expression system
WO2020129959A1 (en) Computer program, server device, terminal device, and display method
WO2023287735A1 (en) Digital character with dynamic interactive behavior
WO1999032203A1 (en) A standalone interactive toy
KR20090058760A (en) Avatar presenting method and computer readable medium processing the method
WO2006120637A2 (en) Method of configuring a rendered behavior of an ambient device
KR20200117712A (en) Artificial intelligence smart speaker capable of sharing feeling and emotion between speaker and user
US11550528B2 (en) Electronic device and method for controlling operation of accessory-mountable robot
Choi et al. Intelligent Wearable Assistance System for Communicating with Interactive Electronic Media
KR100673613B1 (en) Digital cellular phone
CN111773692A (en) MicroPython-based hardware driving method, equipment and storage medium
JP4316819B2 (en) Robot control apparatus, robot control method, program, and storage medium
KR102661381B1 (en) Apparatus and method for controlling operation of robot capable of mounting accessory
US20230359320A1 (en) Techniques For Adjusting A Detachable Display Capsule Of A Wrist-Wearable Device To Operationally Complement A Wearable-Structure Attachment, And Wearable Devices And Systems For Performing Those Techniques
Junius et al. Playing with the strings: designing puppitor as an acting interface for digital games

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in: Ref country code: DE
WWW Wipo information: withdrawn in national office; Country of ref document: DE
NENP Non-entry into the national phase in: Ref country code: RU
WWW Wipo information: withdrawn in national office; Country of ref document: RU
122 Ep: pct application non-entry in european phase; Ref document number: 06744882; Country of ref document: EP; Kind code of ref document: A2