WO2000032361A1 - Robot, procede de commande de robot et support d'enregistrement de programme - Google Patents
- Publication number
- WO2000032361A1 (PCT/JP1999/006713)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- instinct
- emotion
- output
- module
- robot device
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H11/00—Self-movable toy figures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
Definitions
- the present invention relates to a robot device, a control method for a robot device, and a program recording medium.
- the present invention relates to a robot device that operates naturally like a living body, a control method of the robot device, and a program recording medium.
- robotic devices that shape multi-legged living objects such as dogs and cats have been developed.
- Such conventionally proposed robotic devices have been able only to perform pre-programmed, predetermined tasks, or only a simple sequence of actions.
- the present invention has been proposed in view of the above-described circumstances, and aims to provide a robot device, a control method of the robot device, and a program recording medium that can act with realism and a sense of life in the real world.
- the robot apparatus of the present invention includes an emotion module in which a plurality of emotion units exhibiting various emotions output emotions by affecting each other, and an operation means that operates based on the emotion output by the emotion module.
- This robot device behaves like a living body having reality and a sense of life based on the output of an emotion module composed of a plurality of emotion units.
- the control method of the robot apparatus includes an emotion output step in which a plurality of emotion units showing various emotions output emotions by interacting with each other, and an operation control step of controlling the operation of the robot device based on the emotion output in the emotion output step.
- in this control method, the robot device is controlled based on the outputs of the plurality of emotion units in the emotion output step, so that it performs natural actions like a living body having reality and a sense of life.
- the program recording medium of the present invention has recorded therein a program for performing an emotion output step of outputting emotions by a plurality of emotion units showing various emotions interacting with each other, and an operation control step of controlling the operation of the robot device based on the emotion output in the emotion output step.
- with this program recording medium, the robot device is controlled, based on the outputs of the plurality of emotion units in the emotion output step, to perform natural actions like a living body having realism and a sense of life.
- the robot apparatus also includes an instinct module in which a plurality of instinct units exhibiting various instincts output respective instincts, and an operation means that operates based on the instinct output by the instinct module.
- This robot device behaves like a living body having reality and a sense of life, based on the output of an instinct module consisting of a plurality of instinct units.
- the control method of the robot apparatus includes an instinct output step in which a plurality of instinct units exhibiting various instincts output respective instincts, and an operation control step of controlling the operation of the robot device based on the instinct output in the instinct output step.
- the program recording medium has recorded therein a program for performing an instinct output step in which a plurality of instinct units exhibiting various instincts output respective instincts, and an operation control step of controlling the operation of the robot device based on the instinct output in the instinct output step.
- with this medium, the robot device is controlled so as to perform natural actions like a living body having reality and a sense of life, based on the outputs of the plurality of instinct units in the instinct output step.
- the robot apparatus further includes an emotion module in which a plurality of emotion units showing emotions output respective emotions, an instinct module in which a plurality of instinct units indicating instincts output respective instincts, and an operation unit that operates based on the emotion output by the emotion module and the instinct output by the instinct module. Based on the outputs of both modules, the device behaves as naturally as a living body with reality and a sense of life.
- the corresponding control method of the robot device includes an emotion output step in which a plurality of emotion units showing emotions output respective emotions, an instinct output step in which a plurality of instinct units indicating instincts output respective instincts, and an operation control step of controlling the operation of the robot device based on the emotion output in the emotion output step and the instinct output in the instinct output step.
- in this control method, the robot device performs natural actions like a living body having a sense of reality and a sense of life, based on the outputs of the plurality of emotion units in the emotion output step and of the plurality of instinct units in the instinct output step.
- the corresponding program recording medium has recorded therein a program for performing an emotion output step in which a plurality of emotion units showing emotions respectively output emotions, an instinct output step in which a plurality of instinct units indicating instincts respectively output instincts, and an operation control step of controlling the operation of the robot device based on the emotion output in the emotion output step and the instinct output in the instinct output step.
- FIG. 1 is a block diagram showing a configuration of a robot device to which the present invention is applied.
- FIG. 2 is a diagram showing a configuration of a program for controlling the robot device.
- FIG. 3 is a diagram showing the relationship between the emotion module and other objects.
- FIG. 4 is a flowchart for explaining the operation when information is input to the emotion module from outside.
- FIG. 5 is a flowchart for explaining the operation when the emotion module changes over time.
- FIG. 6 is a diagram showing the relationship between the instinct module and other objects.
- FIG. 9 is a diagram for explaining a state when the above-mentioned robot device and another robot device are communicating with each other.
- FIG. 10 is a diagram for explaining a state in which the emotions and actions of the above-mentioned robot device are controlled from a personal computer.
- the present invention is applied to, for example, a robot apparatus 1 having the configuration shown in FIG. 1. The robot apparatus 1 includes a central processing unit (Central Processing Unit; hereinafter referred to as "CPU") 11 that controls the entire system, a video camera 12 having a CCD (Charge Coupled Device) image sensor, and a storage unit 13 that stores video data from the video camera 12 and other data.
- CPU Central Processing Unit
- the LSI 14 has, for example, a communication unit 14a including an interface for serial communication, parallel communication, USB communication, and the like.
- An external personal computer is connected via the communication unit 14a.
- the personal computer 100 can, for example, change the program for operating the CPU 11 via the LSI 14, or operate the robot apparatus.
- the LSI 14 is connected, via a PC card interface 15, to various devices conforming to the PC card standard that are inserted into a PC card slot, such as a storage device 200 (for example, an ATA (Advanced Technology Attachment) flash memory card) and a communication device 300 such as a wireless communication card.
- the storage device 200 stores various parameters for controlling the emotion levels of the emotion units and the instinct levels of the instinct units. Specifically, it stores parameters used to control each emotion unit by changing its emotion level, such as emotion parameters, input action parameters, attenuation parameters, and interaction parameters, as well as parameters used to control each instinct unit by changing its instinct level, such as instinct parameters, input action parameters, and increment parameters. At the time of execution, these parameters are read from the storage device 200 and used.
- the robot device 1 includes first to fourth CPC (Configurable Physical Component) devices 20, 30, 40, 50, which constitute the limbs, ears, mouth, and the like.
- each device is connected to the serial bus hub (SBH) 14b in the LSI 14.
- SBH serial bus hub
- the first CPC device 20 includes a hub 21 for controlling each circuit in the device in accordance with control commands from the LSI 14, a memory 22 for temporarily storing control signals, detection signals, and the like, an acceleration sensor 23 that detects acceleration, a potentiometer 24, and an actuator 25 that plays the role of a joint.
- the acceleration sensor 23 detects acceleration in three axial directions in units of tens of milliseconds, and supplies the detection results to the CPU 11 via the hub 21 and the serial bus hub 14b.
- the second CPC device 30 includes a hub 31, a memory 32, a rotational angular velocity sensor 33 for detecting rotational angular velocity, a potentiometer 34, and an actuator 35. The rotational angular velocity sensor 33 detects rotational angular velocities in three axial directions in units of tens of milliseconds, and supplies the detection results to the LSI 14 via the hub 31 and the serial bus hub 14b.
- the third CPC device 40 includes a hub 41, a memory 42, a light emitting diode (LED) 43 that emits light, for example, to indicate external stimulation, and a touch sensor 44 that detects whether the device has been touched.
- LED light emitting diode
- the fourth CPC device 50 includes a hub 51, a memory 52, a speaker 53 serving as a mouth for outputting sound to the outside, and a microphone serving as an ear for detecting external sound.
- the robot device 1 is thus a multi-legged walking robot: a multi-joint robot having the shape of an animal with four legs. The robot device is not limited to this, however, and may be, for example, a two-legged walking multi-joint robot.
- the above-described acceleration sensor 23 detects acceleration in the X-axis, Y-axis, and Z-axis directions.
- the rotational angular velocity sensor 33 detects the rotational angular velocity for the R, P, and Y angles when rotating about the X, Y, and Z axes as rotation axes.
- the program that controls the robot device 1 is designed in a hierarchical manner as shown in FIG. 2. Specifically, the program is built on an embedded real-time OS (Operating System) running on the hardware having the above-described configuration, with three layers formed on top of it: system software, middleware, and application software.
- OS Operating System
- the system software layer is composed of, for example, a device driver that directly controls devices and a server object that provides services to objects in upper layers.
- the middleware layer is composed of, for example, recognition objects that process sensor information such as images, sounds, and contacts, motion control objects that control robot motion such as walking and posture, and motion generation objects that move the limbs, head, and tail to express motions.
- the emotion-instinct model object has an emotion module and an instinct module. The emotion module handles the emotion units as data.
- each emotion unit is composed of, for example, the current level of the emotion (hereinafter referred to as the "emotion level"), a minimum emotion level, a maximum emotion level, and a threshold that serves as a criterion for reporting the emotion.
- an emotion unit is prepared for each type of emotion to be handled, for example, joy, sadness, anger, fear, surprise, and disgust.
- Each of these emotion levels is first initialized by the value of the corresponding emotion parameter, and then changes over time and in response to external information from, for example, recognition objects.
- The emotion units have the property of influencing each other by raising or lowering one another's emotion levels.
- for example, when the emotion level of the sadness emotion unit is high, the emotion level of the anger emotion unit becomes high.
- likewise, when the emotion level of the joy emotion unit is high, the emotion levels of the anger and disgust emotion units become low.
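The structure of an emotion unit and the mutual influence described above can be sketched as follows. This is an illustrative model, not the patent's implementation; the class, field names, and numeric values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class EmotionUnit:
    """One emotion unit as described above: current level, minimum and
    maximum levels, and a threshold for reporting the emotion.
    All names and values here are illustrative assumptions."""
    name: str
    level: float        # current emotion level
    min_level: float    # minimum emotion level
    max_level: float    # maximum emotion level
    threshold: float    # criterion for reporting the emotion

    def adjust(self, delta: float) -> None:
        # Clamp the level to [min_level, max_level] when influenced.
        self.level = max(self.min_level, min(self.max_level, self.level + delta))

# Mutual influence: a high sadness level raises anger (as in the text).
sadness = EmotionUnit("sadness", level=80, min_level=0, max_level=100, threshold=60)
anger = EmotionUnit("anger", level=20, min_level=0, max_level=100, threshold=60)
if sadness.level > sadness.threshold:
    anger.adjust(+10)   # sadness stimulates anger
```

The clamping in `adjust` reflects the minimum and maximum emotion levels kept in each unit; the +10 gain would in practice come from the interaction parameters held in the storage unit.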
- the instinct module handles the instinct unit as data, similarly to the emotion module.
- an instinct unit is prepared for each type of instinct to be handled; the units consist, for example, of instinctive needs such as appetite, the desire for exercise, the desire for rest, love, the desire for knowledge, and sexual desire. Each of these instinct levels is first initialized by the value of the corresponding instinct parameter, and then changes over time and in response to external information from, for example, recognition objects. Unlike the emotion units, the instinct units do not raise one another's instinct levels. However, the instinct module and the emotion module may influence each other: for example, when the instinct says "I am hungry", it is easy to become angry.
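The cross-module influence mentioned above ("hungry" makes anger easier) could be sketched like this. The 0.3 gain, the 70.0 hunger threshold, and the function name are illustrative assumptions, not values from the patent.

```python
def couple_instinct_to_emotion(instinct_levels: dict, emotion_levels: dict) -> dict:
    """Sketch of instinct-to-emotion coupling: a high appetite level
    ("I am hungry") raises the anger emotion level. Gain and threshold
    are illustrative assumptions."""
    appetite = instinct_levels.get("appetite", 0.0)
    if appetite > 70.0:  # hungry enough to affect mood
        emotion_levels["anger"] = min(
            100.0, emotion_levels.get("anger", 0.0) + 0.3 * appetite
        )
    return emotion_levels

emotions = couple_instinct_to_emotion({"appetite": 80.0}, {"anger": 20.0})
```

Note that, per the text, this coupling runs only between the two modules; instinct units themselves do not raise one another's levels.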
- Each of the above-mentioned objects is composed of an object-oriented design, regardless of whether it is in the upper layer or the lower layer, receives information from other objects and changes its state accordingly. It outputs information according to its own status to other objects.
- there are two cases in which the emotion module changes the emotion level of each emotion unit: when external information is input, and when the emotion level changes over time.
- the recognition objects described above handle, as sensor information from the first to fourth CPC devices 20, 30, 40, 50 shown in FIG. 1, input information such as color information of an image from a color sensor, sound information from a sound sensor, and contact information from a contact sensor.
- the information of the recognition result is sent to the emotion module of the emotion-instinct model object as shown in FIG. 3.
- the emotion module determines the type of the input information as shown in FIG. 4 (step ST1), and changes the emotion level of each emotion unit using parameters corresponding to the input information (step ST2). Then, the emotion module selects, from among the emotion units whose emotion level exceeds the threshold, the emotion unit whose emotion level is the maximum. The selected emotion unit notifies its information to the object requesting the output, for example, an action generation object. Note that the object requesting output must register itself as an observer with the emotion module, in accordance with the object-oriented observer pattern. The emotion module may also accept input from objects that do not directly handle sensor information, such as a message from the instinct module that frustration has been eliminated.
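The observer registration described above can be sketched minimally as follows. Class and method names (`register_observer`, `on_emotion`) are assumptions for illustration, not the patent's API.

```python
class EmotionModule:
    """Minimal observer-pattern sketch of the notification flow described
    above; names are illustrative assumptions."""
    def __init__(self):
        self._observers = []

    def register_observer(self, observer) -> None:
        # The object requesting output (e.g. the action generation
        # object) must register itself here first.
        self._observers.append(observer)

    def notify(self, emotion_name: str, level: float) -> None:
        # Called when an emotion unit exceeds its threshold with the
        # maximum level; every registered observer receives the info.
        for obs in self._observers:
            obs.on_emotion(emotion_name, level)

class ActionGenerator:
    """Stand-in for the action generation object."""
    def __init__(self):
        self.last = None
    def on_emotion(self, name, level):
        self.last = (name, level)   # would drive limbs, sound, LEDs

module = EmotionModule()
action = ActionGenerator()
module.register_observer(action)
module.notify("anger", 75.0)
```

The same registration scheme applies to the instinct module later in the text.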
- the action generation object controls the hardware via the motion generation object and the like. In other words, by controlling the first to fourth CPC devices 20, 30, 40, 50 shown in FIG. 1, the robot expresses emotions through, for example, operations using the limbs, head, and tail, sound generation, and LED blinking intervals.
- in step ST11, the emotion module initializes the emotion level parameters, and proceeds to step ST12.
- in step ST12, the emotion module uses the timer of the LSI 14 to determine whether a certain time has elapsed. If it has not, the emotion module waits at step ST12; if it has, the emotion module proceeds to step ST13.
- in step ST13, the emotion module decreases the emotion level of each emotion unit, and proceeds to step ST14.
- the magnitude of the attenuation is determined by the attenuation parameters stored in the storage unit 13. In step ST14, the emotion module changes the emotion levels through the mutual suppression and mutual stimulation of the emotions, and proceeds to step ST15. For example, when fear is great, joy decreases, and when disgust is great, anger increases. The relationships of these interactions and their magnitudes are determined by the interaction parameters stored in the storage unit 13.
- in step ST15, the emotion module determines whether there is an emotion unit exceeding the threshold. If there is no such emotion unit, the process returns to step ST12; if there is, the emotion module proceeds to step ST16.
- in step ST16, the emotion module selects, from the emotion units exceeding the threshold, the emotion unit whose emotion level is the highest, and proceeds to step ST17. In step ST17, the emotion module notifies the information of the selected emotion unit to the object requesting output, for example, the action generation object.
- the emotion module may in some cases also receive input from an object that does not directly handle sensor information, such as a message that the instinct module has resolved frustration.
- the action generation object controls hardware via a motion generation object or the like. In other words, by controlling the first to fourth CPC devices 20, 30, 40, 50 shown in FIG. 1, the robot expresses emotions through, for example, operations using the limbs, head, and tail, sound generation, and LED blinking intervals. Then, the process returns to step ST12.
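The timed processing in steps ST13 to ST17 can be sketched as one "tick" of a loop, with the ST12 timer wait omitted. The data layout, decay value, and interaction weight below are illustrative assumptions standing in for the attenuation and interaction parameters of the storage unit.

```python
def emotion_tick(units: dict, decay: float, interactions: dict):
    """One pass of steps ST13-ST17. `units` maps an emotion name to a
    [level, threshold] pair; `interactions` maps (source, target) pairs
    to signed weights. All names and values are illustrative."""
    # Step ST13: attenuate every emotion level over time.
    for name in units:
        units[name][0] = max(0.0, units[name][0] - decay)
    # Step ST14: mutual suppression / stimulation between emotions.
    for (src, dst), weight in interactions.items():
        units[dst][0] = max(0.0, units[dst][0] + weight * units[src][0])
    # Step ST15: any unit over its threshold?
    over = {n: lv for n, (lv, th) in units.items() if lv > th}
    if not over:
        return None                  # back to ST12: keep waiting
    # Steps ST16-ST17: pick the highest unit and report it.
    return max(over, key=over.get)

units = {"joy": [20.0, 50.0], "fear": [80.0, 50.0], "anger": [40.0, 50.0]}
interactions = {("fear", "joy"): -0.2}   # great fear decreases joy
selected = emotion_tick(units, decay=5.0, interactions=interactions)
```

Here only fear remains above its threshold after decay and interaction, so "fear" is what would be notified to the action generation object.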
- by notifying other objects of the information of the emotion unit whose emotion level is highest among the units whose levels have changed due to external information or internal change, the emotion module can inform the action generation object of the state when various emotions are intertwined.
- the action generation object controls the first to fourth CPC devices 20, 30, 40, 50, which are hardware, through the system software and OS, based on the information from the emotion module.
- in this way, the emotion module notifies the action generation object of the information on the emotion unit having the highest emotion level when the emotions are complexly and organically related. As a result, the optimal emotional expression according to the situation at that time can be realized.
- the robot device 1 has not only the emotion module, which responds to input from the outside world, but also the instinct module, in which desires gradually grow from inside. A case where an action is performed based on the output of the instinct module will now be described.
- the instinct module changes the instinct level of each instinct unit either when external information is input or with the passage of time.
- the recognition objects described above handle, as sensor information from the first to fourth CPC devices 20, 30, 40, 50 shown in FIG. 1, input information such as color information of an image from a color sensor, sound information from a sound sensor, and contact information from a contact sensor. When a recognition object recognizes information to be notified, it notifies the instinct module of the emotion-instinct model object of the recognition result as shown in FIG. 6.
- the instinct module determines the type of the input information as shown in FIG. 7 (step ST21), and changes the instinct level of each instinct unit using parameters corresponding to the input information (step ST22). For example, the appetite instinct unit raises its instinct level when the battery runs low, expressing a desire for eating and drinking, in this case a charge request.
- the instinct module may also accept input from objects that do not handle information from the various sensors, such as information output by the action generation object after completing a desired action. For example, when intense exercise is reported, the instinct level of the desire for exercise declines significantly.
- the instinct module then selects, from the instinct units whose instinct level exceeds the threshold, the instinct unit with the highest instinct level.
- the selected instinct unit notifies its information to the object requesting the output, for example, the action generation object. In accordance with the object-oriented observer pattern, the object requesting output needs to register itself as an observer with the instinct module.
- the action generation object moves the limbs, head, tail, and so on to express the instinct: the robot moves vigorously when the desire for exercise increases and, conversely, takes a rest when the desire for rest increases.
- the instinct module executes the processing of step ST31 and the subsequent steps shown in FIG.
- in step ST31, the instinct module initializes the instinct level parameters, and proceeds to step ST32.
- in step ST32, the instinct module uses the timer of the LSI 14 to determine whether a fixed time has elapsed. If it has not, the module waits at step ST32; when the fixed time has elapsed, the process proceeds to step ST33.
- in step ST33, the instinct module increases the instinct level of each instinct unit, and proceeds to step ST34. The magnitude of the increase is determined by the increment parameters stored in the storage unit 13.
- in step ST34, the instinct module determines whether there is an instinct unit exceeding the threshold. If there is no such instinct unit, the process returns to step ST32; if there is, the module proceeds to step ST35.
- in step ST35, the instinct module selects, from the instinct units exceeding the threshold, the instinct unit with the highest instinct level, and proceeds to step ST36. In step ST36, the instinct module notifies the information of the selected instinct unit to the object requesting output, such as the action generation object.
- the action generation object controls the first to fourth CPC devices 20, 30, 40, 50, which are hardware, through the system software and OS, based on the information from the instinct module, so that the optimal instinct expression according to the situation at that time can be realized.
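Steps ST33 to ST36 mirror the emotion loop, except that instinct levels rise with time (by the increment parameters) and there is no mutual interaction between instinct units. A sketch under those assumptions, with illustrative names and values:

```python
def instinct_tick(units: dict, increments: dict):
    """One pass of steps ST33-ST36. `units` maps an instinct name to a
    (level, threshold) pair; `increments` holds the per-tick increase
    (standing in for the increment parameters). Illustrative only."""
    # Step ST33: each instinct level grows with the passage of time.
    for name in units:
        level, threshold = units[name]
        units[name] = (min(100.0, level + increments[name]), threshold)
    # Step ST34: any unit over its threshold?
    over = {n: lv for n, (lv, th) in units.items() if lv > th}
    # Steps ST35-ST36: report the highest such unit, if any.
    return max(over, key=over.get) if over else None

units = {"appetite": (55.0, 60.0), "exercise": (30.0, 60.0)}
increments = {"appetite": 10.0, "exercise": 5.0}  # e.g. battery draining
selected = instinct_tick(units, increments)
```

After one tick only appetite exceeds its threshold, so "appetite" would be notified to the action generation object (here, prompting a charge request).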
- both the emotion module and the instinct module operate based on information from various objects, but each is controlled independently, in parallel.
- as a result, the robot device 1 can naturally express a complicated mental state in which emotion and instinct are mixed.
- the robot device 1 can communicate with another robot device 1A (not shown) via the communication device 300.
- the emotion module of the robot device 1 notifies the communication device 300 (for example, a wireless communication card) of the information of the emotion unit having the highest emotion level.
- the communication device 300 wirelessly transmits the information of the emotion unit to another robot device 1A specified in advance.
- the other robot device 1A can thus read the emotion of the robot device 1, so that the robot device 1 and the other robot device 1A can communicate with each other through emotion.
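The exchange described above (transmit the highest-level emotion unit; the receiver raises a corresponding emotion of its own) could be sketched as follows. The JSON wire format, the 0.5 gain, and the reaction mapping are assumptions for illustration; the text specifies only that the emotion level is transmitted and that the receiver raises, for example, its own fear in response to anger or its own joy in response to joy.

```python
import json

def encode_emotion(name: str, level: float) -> bytes:
    """Serialize the highest-level emotion unit's information for the
    communication device (e.g. a wireless card). JSON is an assumed
    wire format; the patent does not specify one."""
    return json.dumps({"emotion": name, "level": level}).encode("utf-8")

def decode_and_react(payload: bytes, own_levels: dict) -> str:
    """The receiving robot raises a corresponding emotion: received
    anger raises its own fear, received joy raises its own joy.
    Mapping and gain are illustrative."""
    msg = json.loads(payload.decode("utf-8"))
    reaction = {"anger": "fear", "joy": "joy"}.get(msg["emotion"], "surprise")
    own_levels[reaction] = own_levels.get(reaction, 0.0) + 0.5 * msg["level"]
    return reaction

own = {"fear": 10.0}
reaction = decode_and_react(encode_emotion("anger", 80.0), own)
```

With its fear level raised this way, the receiving robot's own emotion loop would then select fear and produce the escape action described below.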
- for example, when the robot apparatus 1 determines that its territory has been invaded, it takes an action corresponding to anger, such as barking, as shown in FIG. 9.
- that is, the emotion level of the anger emotion unit of the robot device 1 becomes high, and at this time the communication device 300 of the robot device 1 sends the emotion level of the anger emotion unit to the other robot device 1A.
- the other robot device 1A that has received the anger of the robot device 1 takes an action in response to that anger, for example running away as shown in the figure. This escape action is performed by the other robot device 1A raising the level of its own fear or surprise in response to the anger sent from the robot device 1.
- when the robot device 1 and the above-mentioned other robot device 1A communicate in this way, they can communicate through emotion and take corresponding actions; the actions are not limited to those described above.
- for example, when the robot device 1 sends a feeling of joy, the other robot device 1A may take a joyful action accordingly: on receiving the joyful feeling of the robot device 1, the other robot device 1A raises the level of its own joy and performs a joyful action together with the robot device 1.
- instinct information can also be transmitted from the robot device 1 to the other robot device 1A in the same manner. In this way, the robot devices can communicate the information of their instinct units and act accordingly.
- the communication is not limited to wireless communication, and may also be performed by wire.
- alternatively, information such as the emotion units in the robot device 1 may be recorded on a recording medium such as a memory card, and this medium may then be mounted on the other robot device 1A.
- the above robot device 1 can also communicate with the electronic pet in the virtual pet device described in Japanese Patent Application No. 10-030973.
- a recording medium such as a memory card can be mounted on the robot device 1 and a control program recorded on the recording medium installed. The control program recorded on the recording medium is a control program consisting of an OS, system software, middleware, and an application, as shown in FIG. 2.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Toys (AREA)
Description
Claims
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020007013955A KR20010052699A (ko) | 1998-11-30 | 1999-11-30 | 로봇 장치, 로봇 장치의 제어방법 및 프로그램 기록 매체 |
EP99972962A EP1136194A4 (en) | 1998-11-30 | 1999-11-30 | ROBOT, CONTROL PROCEDURE OF THE ROBOT AND MEDIUM FOR PROGRAM RECORDING |
US09/701,254 US7076331B1 (en) | 1998-11-30 | 1999-11-30 | Robot, method of robot control, and program recording medium |
HK02101082.5A HK1040664B (zh) | 1998-11-30 | 2002-02-11 | 自動裝置、自動裝置的控制方法及程序記錄媒體 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10/340716 | 1998-11-30 | ||
JP34071698 | 1998-11-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000032361A1 true WO2000032361A1 (fr) | 2000-06-08 |
Family
ID=18339637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1999/006713 WO2000032361A1 (fr) | 1998-11-30 | 1999-11-30 | Robot, procede de commande de robot et support d'enregistrement de programme |
Country Status (6)
Country | Link |
---|---|
US (1) | US7076331B1 (ja) |
EP (1) | EP1136194A4 (ja) |
KR (1) | KR20010052699A (ja) |
CN (1) | CN1146493C (ja) |
HK (1) | HK1040664B (ja) |
WO (1) | WO2000032361A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002120181A (ja) * | 2000-10-11 | 2002-04-23 | Sony Corp | ロボット装置及びその制御方法 |
WO2002045916A1 (fr) * | 2000-12-06 | 2002-06-13 | Sony Corporation | Robot, procede de commande du mouvement d'un robot et systeme de commande du mouvement d'un robot |
JP2002233979A (ja) * | 2001-01-19 | 2002-08-20 | Chinsei Ri | 知能型愛玩ロボット |
JP2002239256A (ja) * | 2001-02-14 | 2002-08-27 | Sanyo Electric Co Ltd | 自動応答玩具における感情決定装置および自動応答玩具 |
US7778730B2 (en) | 2005-12-09 | 2010-08-17 | Electronics And Telecommunications Research Institute | Robot for generating multiple emotions and method of generating multiple emotions in robot |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100624403B1 (ko) * | 2001-10-06 | 2006-09-15 | 삼성전자주식회사 | 인체의 신경계 기반 정서 합성 장치 및 방법 |
KR100858079B1 (ko) * | 2002-01-03 | 2008-09-10 | 삼성전자주식회사 | 에이전트 감정 생성 방법 및 장치 |
WO2005069890A2 (en) * | 2004-01-15 | 2005-08-04 | Mega Robot, Inc. | System and method for reconfiguring an autonomous robot |
KR100819248B1 (ko) * | 2006-09-05 | 2008-04-02 | 삼성전자주식회사 | 소봇의 감정변환 방법 |
CN101224343B (zh) * | 2007-01-19 | 2011-08-24 | 鸿富锦精密工业(深圳)有限公司 | 类生物及其部件控制模块 |
DE102007048085A1 (de) | 2007-10-05 | 2009-04-16 | Navalis Nutraceuticals Gmbh | Arzneimittel enthaltend Frauenmantel, Mönchspfeffer zur Behandlung von Endometritis, Vagintis |
DE502008002255D1 (de) | 2007-03-23 | 2011-02-24 | Navalis Nutraceuticals Gmbh | Arzneimittel enthaltend Fraumenmantel zur Behandlung von Endometritis, Vaginitis |
DE102007014595A1 (de) | 2007-03-23 | 2008-09-25 | Navalis Nutraceuticals Gmbh | Arzneimittel enthaltend Frauenmantel (Alchemilla vulgaris) zur Behandlung von Endometritis |
CN101411946B (zh) * | 2007-10-19 | 2012-03-28 | 鸿富锦精密工业(深圳)有限公司 | 玩具恐龙 |
US9155961B2 (en) | 2009-05-28 | 2015-10-13 | Anki, Inc. | Mobile agents for manipulating, moving, and/or reorienting components |
US8939840B2 (en) | 2009-07-29 | 2015-01-27 | Disney Enterprises, Inc. | System and method for playsets using tracked objects and corresponding virtual worlds |
FR2989209B1 (fr) * | 2012-04-04 | 2015-01-23 | Aldebaran Robotics | Robot apte a integrer des dialogues naturels avec un utilisateur dans ses comportements, procedes de programmation et d'utilisation dudit robot |
US9387588B1 (en) | 2014-08-25 | 2016-07-12 | Google Inc. | Handling gait disturbances with asynchronous timing |
US9618937B1 (en) | 2014-08-25 | 2017-04-11 | Google Inc. | Slip detection using robotic limbs |
US10081098B1 (en) | 2014-08-25 | 2018-09-25 | Boston Dynamics, Inc. | Generalized coordinate surrogates for integrated estimation and control |
US9446518B1 (en) * | 2014-11-11 | 2016-09-20 | Google Inc. | Leg collision avoidance in a robotic device |
US9499218B1 (en) | 2014-12-30 | 2016-11-22 | Google Inc. | Mechanically-timed footsteps for a robotic device |
US9996369B2 (en) | 2015-01-05 | 2018-06-12 | Anki, Inc. | Adaptive data analytics service |
US9594377B1 (en) * | 2015-05-12 | 2017-03-14 | Google Inc. | Auto-height swing adjustment |
US9586316B1 (en) | 2015-09-15 | 2017-03-07 | Google Inc. | Determination of robotic step path |
US9789919B1 (en) | 2016-03-22 | 2017-10-17 | Google Inc. | Mitigating sensor noise in legged robots |
CN106504614B (zh) * | 2016-12-01 | 2022-07-26 | 华南理工大学 | 一种积木式编程的教育机器人 |
JP6605442B2 (ja) | 2016-12-27 | 2019-11-13 | 本田技研工業株式会社 | 情報提供装置および情報提供方法 |
WO2019068634A1 (en) * | 2017-10-02 | 2019-04-11 | Starship Technologies Oü | DEVICE AND METHOD FOR DISTRIBUTING ARTICLES CONSUMABLE BY A MOBILE ROBOT |
KR20210020312A (ko) * | 2019-08-14 | 2021-02-24 | 엘지전자 주식회사 | 로봇 및 그의 제어 방법 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6224988A (ja) * | 1985-07-23 | 1987-02-02 | 志井田 孝 | 感情をもつロボツト |
JPH0612401A (ja) * | 1992-06-26 | 1994-01-21 | Fuji Xerox Co Ltd | 感情模擬装置 |
JPH10235019A (ja) * | 1997-02-27 | 1998-09-08 | Sony Corp | 携帯型ライフゲーム装置及びそのデータ管理装置 |
JPH10289006A (ja) * | 1997-04-11 | 1998-10-27 | Yamaha Motor Co Ltd | 疑似感情を用いた制御対象の制御方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4657104A (en) * | 1983-07-23 | 1987-04-14 | Cybermation, Inc. | Concentric shaft mobile base for robots and the like |
US5742738A (en) * | 1988-05-20 | 1998-04-21 | John R. Koza | Simultaneous evolution of the architecture of a multi-part program to solve a problem using architecture altering operations |
US5983161A (en) * | 1993-08-11 | 1999-11-09 | Lemelson; Jerome H. | GPS vehicle collision avoidance warning and control system and method |
US5963712A (en) * | 1996-07-08 | 1999-10-05 | Sony Corporation | Selectively configurable robot apparatus |
US5832189A (en) * | 1996-09-26 | 1998-11-03 | Interval Research Corporation | Affect-based robot communication methods and systems |
JP3765356B2 (ja) * | 1997-12-22 | 2006-04-12 | ソニー株式会社 | Robot device |
US6249780B1 (en) * | 1998-08-06 | 2001-06-19 | Yamaha Hatsudoki Kabushiki Kaisha | Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object |
1999
- 1999-11-30 EP EP99972962A patent/EP1136194A4/en not_active Withdrawn
- 1999-11-30 CN CNB998094846A patent/CN1146493C/zh not_active Expired - Fee Related
- 1999-11-30 WO PCT/JP1999/006713 patent/WO2000032361A1/ja not_active Application Discontinuation
- 1999-11-30 KR KR1020007013955A patent/KR20010052699A/ko not_active Application Discontinuation
- 1999-11-30 US US09/701,254 patent/US7076331B1/en not_active Expired - Fee Related

2002
- 2002-02-11 HK HK02101082.5A patent/HK1040664B/zh not_active IP Right Cessation
Non-Patent Citations (7)
Title |
---|
HIROHIDE USHIDA ET AL: "Emotional Model Application to Pet Robot (in Japanese)", PROCEEDINGS DISTRIBUTED AT LECTURE MEETING ON ROBOTICS AND MECHATRONICS PREPARED BY JAPAN MACHINERY SOCIETY, vol. 1998, no. PT1, 26 June 1998 (1998-06-26), pages 2CII4.5 (1) - 2CII4.5 (2), XP002929938 * |
MASAHIRO FUJITA ET AL: "Reconfigurable Physical Agents", PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS, 9 May 1998 (1998-05-09), pages 54 - 61, XP002926100 * |
MASAHIRO FUJITA ET AL: "Robot Entertainment (in Japanese)", PROCEEDINGS OF THE 6TH SONY RESEARCH FORUM, 27 November 1996 (1996-11-27), pages 234 - 239, XP002929935 * |
MASAHIRO FUJITA: "Robot Entertainment: Small Four-Legged Automatic Robot", TRANSACTIONS OF JAPAN ROBOT SOCIETY, vol. 16, no. 3, 15 April 1998 (1998-04-15), pages 31 - 32, XP002929934 * |
See also references of EP1136194A4 * |
SHUSUKE MOGI ET AL: "Basic Research on Artificial Psychology Model (in Japanese)", PRINTINGS AT 15TH STUDY MEETING BY HUMAN INTERFACE AND COGNITIVE MODEL RESEARCH GROUP, ARTIFICIAL INTELLIGENCE SOCIETY, 24 January 1992 (1992-01-24), pages 1 - 8, XP002929937 * |
TETSUYA OGATA ET AL: "Emotional Model and Internal Symbol Acquisition Model Based on Actions of the Robot (in Japanese)", PROCEEDINGS DISTRIBUTED AT LECTURE MEETING ON ROBOTICS AND MECHATRONICS PREPARED BY JAPAN MACHINERY SOCIETY, vol. 1998, no. PT1, 26 June 1998 (1998-06-26), pages P2CII4.3 (1) - 2CII4.3 (2), XP002929936 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002120181A (ja) * | 2000-10-11 | 2002-04-23 | Sony Corp | Robot device and control method therefor |
WO2002045916A1 (fr) * | 2000-12-06 | 2002-06-13 | Sony Corporation | Robot, method for controlling the movement of a robot, and system for controlling the movement of a robot |
US6889117B2 (en) | 2000-12-06 | 2005-05-03 | Sony Corporation | Robot apparatus and method and system for controlling the action of the robot apparatus |
US7076334B2 (en) | 2000-12-06 | 2006-07-11 | Sony Corporation | Robot apparatus and method and system for controlling the action of the robot apparatus |
CN1309535C (zh) * | 2000-12-06 | 2007-04-11 | 索尼公司 | Robot device, method for controlling the movement of a robot device, and system for controlling the movement of a robot device |
KR100843822B1 (ko) * | 2000-12-06 | 2008-07-04 | 소니 가부시끼 가이샤 | Robot apparatus, operation control method for robot apparatus, and operation control system for robot apparatus |
JP2002233979A (ja) * | 2001-01-19 | 2002-08-20 | Chinsei Ri | Intelligent pet robot |
JP2002239256A (ja) * | 2001-02-14 | 2002-08-27 | Sanyo Electric Co Ltd | Emotion determining device for an automatic response toy, and automatic response toy |
US7778730B2 (en) | 2005-12-09 | 2010-08-17 | Electronics And Telecommunications Research Institute | Robot for generating multiple emotions and method of generating multiple emotions in robot |
Also Published As
Publication number | Publication date |
---|---|
KR20010052699A (ko) | 2001-06-25 |
EP1136194A1 (en) | 2001-09-26 |
US7076331B1 (en) | 2006-07-11 |
HK1040664B (zh) | 2005-03-18 |
CN1312750A (zh) | 2001-09-12 |
HK1040664A1 (en) | 2002-06-21 |
CN1146493C (zh) | 2004-04-21 |
EP1136194A4 (en) | 2001-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000032361A1 (fr) | Robot, robot control method, and program recording medium | |
US7515992B2 (en) | Robot apparatus and emotion representing method therefor | |
US8145492B2 (en) | Robot behavior control system and method, and robot apparatus | |
US6493606B2 (en) | Articulated robot and method of controlling the motion of the same | |
WO2004080665A1 (ja) | Robot device, behavior control method therefor, and program | |
JP4609584B2 (ja) | Robot device, face recognition method, and face recognition device | |
JP3714268B2 (ja) | Robot device | |
US20050197739A1 (en) | Behavior controlling system and behavior controlling method for robot | |
US20040210345A1 (en) | Buffer mechanism and recording and/or reproducing apparatus | |
KR20020067699A (ko) | Robot apparatus and behavior control method for robot apparatus | |
JP2006110707A (ja) | Robot device | |
JP2007125631A (ja) | Robot device and behavior control method therefor | |
JP3925140B2 (ja) | Information providing method, information providing device, and computer program | |
JP4296736B2 (ja) | Robot device | |
JP2001236585A (ja) | Mobile robot and theft prevention method for mobile robot | |
JP2007125629A (ja) | Robot device and behavior control method therefor | |
US11833441B2 (en) | Robot | |
WO2002030629A1 (fr) | Robot apparatus, information display system, and information display method | |
JP2002059384A (ja) | Learning system and learning method for robot | |
JP2004114285A (ja) | Robot device and behavior control method therefor | |
JP3501123B2 (ja) | Robot device and behavior control method for robot device | |
JP2002086380A (ja) | Legged robot and control method therefor | |
JP2002205289A (ja) | Operation control method for robot device, program, recording medium, and robot device | |
JP4552465B2 (ja) | Information processing device, behavior control method for robot device, robot device, and computer program | |
JP2005193330A (ja) | Robot device and emotion expression method therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 99809484.6; Country of ref document: CN |
AK | Designated states | Kind code of ref document: A1; Designated state(s): CN JP KR US |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWE | Wipo information: entry into national phase | Ref document number: 09701254; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 1020007013955; Country of ref document: KR |
WWE | Wipo information: entry into national phase | Ref document number: 1999972962; Country of ref document: EP |
WWP | Wipo information: published in national office | Ref document number: 1020007013955; Country of ref document: KR |
WWP | Wipo information: published in national office | Ref document number: 1999972962; Country of ref document: EP |
WWW | Wipo information: withdrawn in national office | Ref document number: 1999972962; Country of ref document: EP |
WWR | Wipo information: refused in national office | Ref document number: 1020007013955; Country of ref document: KR |