AU5649899A - Robot apparatus, method of controlling robot apparatus, method of display, and medium - Google Patents


Info

Publication number
AU5649899A
AU5649899A AU56498/99A
Authority
AU
Australia
Prior art keywords
data
feeling
storage means
model
robot apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU56498/99A
Other versions
AU768353B2 (en)
Inventor
Masahiro Fujita
Kotaro Sabe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Application filed by Sony Corp filed Critical Sony Corp
Publication of AU5649899A publication Critical patent/AU5649899A/en
Application granted granted Critical
Publication of AU768353B2 publication Critical patent/AU768353B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/008Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00Self-movable toy figures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means

Description

DESCRIPTION

Robot Apparatus, Control Method for Robot Apparatus, Displaying Method and Furnishing Medium

Technical Field

This invention relates to a robot apparatus having a feeling or behavior model changed by extraneous or innate factors, a control method for the robot apparatus, a furnishing medium, a display method for data written in storage means, and a furnishing medium.

Background Art

As an automatic mobile robot aimed at collecting marine data, an underwater search robot, for example, has been developed. For an underwater search robot, it is desirable to collect and record as much hard-to-collect data as possible. Such data include, for example, seawater temperature data, sea current data, depth data, terrain data and picture data. The operations of extracting and analysing effective data are usually carried out after the underwater search robot has returned to the water surface.

Recently, an automatic mobile robot aimed at entertainment has been furnished. It is presumed that the entertainment performance of the automatic mobile robot will become higher by accumulating data responsive to conditions changing with the lapse of time. However, in the automatic mobile robot aimed at entertainment, protracted storage of meaningless data in the robot leads to increased memory costs. On the other hand, the operation of extracting effective data from the data stored in a robot is time- and labor-consuming. Moreover, it may be presumed that, if the data extracted from the automatic mobile robot can be browsed on the assumption that the data has been collected by the automatic mobile robot, or can be recognized as such, the amusement performance of the robot will be higher.
Disclosure of the Invention

In view of the above-depicted status of the prior art, it is an object of the present invention to provide a robot apparatus having a feeling model or a behavioral model changed by an extraneous or innate factor that is able to extract only effective data, a control method for the robot apparatus, a furnishing medium, a display method for displaying data written by the robot apparatus in memory means on a display unit, and a furnishing medium.

That is, a robot apparatus according to the present invention has a behavioral model or a feeling model changed at least based on extraneous factors, and includes detection means for detecting extraneous states, storage means for storing data and write control means for writing pre-set data in the storage means based on a detection signal detected by the detection means. In this robot apparatus, the pre-set data is written by the write control means in the storage means based on a detection signal detected by the detection means adapted for detecting the extraneous state.

A method for controlling a robot apparatus according to the present invention has a behavioral model or a feeling model changed at least based on extraneous factors, and includes a detecting step of detecting extraneous states by detection means and a write control step of writing pre-set data in the storage means based on a detection signal detected by the detection means. In this robot apparatus control method, having these steps, pre-set data is written in the storage means based on the detection signal detected by the detection means.

A furnishing medium furnishes a program to a robot apparatus having a behavioral model or a feeling model changed at least based on extraneous factors, the program being configured to execute processing.
The program includes a detecting step of detecting extraneous states by detection means, and a write control step of writing pre-set data in the storage means based on a detection signal detected by the detection means. By this furnishing medium, the robot apparatus writes pre-set data in the storage means based on the detection signal detected by the detection means.
A robot apparatus according to the present invention has a behavioral model for outputting a pre-set behavior command or a feeling model for outputting the feeling information, and includes detection means for detecting extraneous states, storage means for storing data and write control means for writing pre-set data in the storage means based on the pre-set behavior command or the feeling information. This robot apparatus, having the above structure, writes pre-set data by the write control means in the storage means based on the pre-set behavior command or the feeling information.

A method for controlling a robot apparatus according to the present invention is adapted to control a robot apparatus having a behavioral model outputting a pre-set behavior command or a feeling model outputting the feeling information. The method includes a step of outputting the pre-set behavior command or the feeling information, based on the behavioral model or the feeling model, in response to the input information, and a write control step of writing pre-set data based on the pre-set behavior command or the feeling information. In this robot apparatus control method, having the above steps, pre-set data is written in storage means based on the pre-set behavior command or the feeling information.

A furnishing medium according to the present invention furnishes a program to a robot apparatus having a behavioral model outputting a pre-set behavior command or a feeling model outputting the feeling information, the program being adapted to execute processing including a step of outputting the pre-set behavior command or the feeling information, based on the behavioral model or the feeling model, in response to the input information, and a write control step of writing pre-set data based on the pre-set behavior command or the feeling information. By this furnishing medium, the robot apparatus is able to write pre-set data based on the pre-set behavior command or the feeling information.
A robot apparatus according to the present invention has an instinct model for outputting the instinct information, and includes detection means for detecting extraneous states, storage means for storing data and write control means for writing pre-set data in the storage means. The write control means writes the pre-set data in the storage means based on the instinct information. In this robot apparatus, pre-set data is written in the storage means based on the instinct information.

A method for controlling a robot apparatus having an instinct model outputting the instinct information, according to the present invention, includes an outputting step of outputting the instinct information by the instinct model based on the input information, and a write control step of writing pre-set data based on the instinct information. In this robot apparatus control method, pre-set data is written in the storage means based on the instinct information.

A furnishing medium according to the present invention furnishes a program to a robot apparatus having an instinct model adapted to output the instinct information, the program being adapted to execute processing including an outputting step of outputting the instinct information by the instinct model based on the input information and a write control step of writing pre-set data in storage means based on the instinct information. By this furnishing medium, the robot apparatus writes pre-set data in the storage means based on the instinct information.

A robot apparatus according to the present invention has a behavioral model, a feeling model or an instinct model changed based at least on inner factors, the behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on the inner factor.
The robot apparatus includes monitoring means for monitoring the inner state as the inner factor, storage means for memorizing data and write control means for writing the pre-set data in the storage means. The write control means writes the pre-set data in the storage means based on the monitored results by the monitoring means. In this robot apparatus, pre-set data is written in storage means based on the inner state.

A method for controlling a robot apparatus having a behavioral model, a feeling model or an instinct model changed based at least on inner factors, the behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on the inner factor, according to the present invention, includes a write control step of monitoring the inner state as the inner factor and writing the pre-set data in storage means based on the monitored results. In this robot apparatus control method, having these steps, pre-set data is written in storage means based on the inner state.

A furnishing medium according to the present invention furnishes a program to a robot apparatus having a behavioral model, a feeling model or an instinct model changed based at least on inner factors, the behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on the inner factor, the program causing execution of processing including a write control step of monitoring the inner state as the inner factor to write pre-set data in storage means based on the monitored results. By this furnishing medium, the robot apparatus writes pre-set data in the storage means based on the inner state.
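The write-control scheme recurring through these aspects, in which pre-set data is committed to storage means only when a trigger (a detection signal, feeling value, instinct value or inner state) calls for it, can be sketched as follows. This is purely illustrative; the class and field names are invented here, and the patent prescribes no particular implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class RobotRecorder:
    """Sketch of the write control means: pre-set data is written to the
    storage means only when a trigger condition on the current signals
    (detection signal, feeling, instinct or inner state) is satisfied."""
    trigger: Callable[[dict], bool]          # condition over current signals
    storage: List[dict] = field(default_factory=list)

    def on_signals(self, signals: dict, preset_data: dict) -> bool:
        # write control step: write pre-set data only when triggered
        if self.trigger(signals):
            self.storage.append({"signals": signals, **preset_data})
            return True
        return False

# e.g. write a picture whenever the "anger" feeling exceeds a threshold
recorder = RobotRecorder(trigger=lambda s: s.get("anger", 0.0) > 0.8)
recorder.on_signals({"anger": 0.9}, {"picture": "frame_0001"})  # written
recorder.on_signals({"anger": 0.1}, {"picture": "frame_0002"})  # skipped
```

The same object serves all the claimed variants by swapping the trigger: a detection-signal test, a behavior-command test, or an inner-state monitor.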
A display method according to the present invention includes a read-out step of reading out the pre-set data memorized in the storage means by a robot apparatus having a behavioral model, a feeling model and/or an instinct model changed based at least on extraneous factors and/or inner factors, the robot apparatus writing pre-set data in storage means depending on conditions, and a display step of displaying the pre-set data read out by the read-out step on a display. In this display method, having the above steps, the pre-set data stored by the robot apparatus in the storage means is displayed.
A furnishing medium according to the present invention furnishes a program to a picture display apparatus adapted to demonstrate a picture on a display. The program is adapted to execute processing including a read-out step of reading out pre-set data stored in the storage means by a robot apparatus having a behavioral model and/or a feeling model and/or an instinct model changed depending on an extraneous factor or an inner factor, the robot apparatus writing pre-set data depending on conditions, and a displaying step of displaying on the display the pre-set data read out by the read-out step. By this furnishing medium, the picture display apparatus demonstrates on the display the pre-set data stored by the robot apparatus in the storage means.

Brief Description of the Drawings

Fig.1 is a perspective view of a pet type robot embodying the present invention.
Fig.2 is a block diagram showing the internal electrical structure of the pet type robot.
Fig.3 is a block diagram showing a detailed structure of a signal processing unit of the pet type robot.
Fig.4 is a block diagram for illustrating the feeling model of the pet type robot.
Fig.5 illustrates the relationship between the sensor input, feeling model, instinct model and the behavioral model in the pet type robot.
Fig.6 shows a table of plural status transitions for determining a behavioral output, as subsystems of the behavioral model.
Fig.7 illustrates the principle of a probability automaton prescribing the status transitions.
Fig.8 illustrates a neural network applicable to the behavioral model.
Fig.9 illustrates a software layer and a hardware layer of the pet type robot.
Fig.10 is a flowchart for illustrating the processing of memorizing effective data in memory means based on feeling information changed with detection signals.
Fig.11 is a flowchart for illustrating the processing of memorizing effective picture data on a memory card based on an output value of the feeling model.
Fig.12 illustrates the storage structure of a memory card used for memorizing a picture based on an output value of the feeling model.
Fig.13 is a flowchart for illustrating the processing of memorizing effective picture data in storage means based on a behavior command of a behavioral model changed with detection signals.
Fig.14 is a flowchart for illustrating the processing of memorizing effective picture data in storage means based on a behavior command of a behavioral model changed with detection signals.
Fig.15 illustrates the pet type robot causing status transitions and memorizing a picture in storage means during those status transitions.
Fig.16 is a flowchart for illustrating the processing of detecting a specified detection signal and memorizing effective data in association with the detection signal in the storage means.
Fig.17 is a flowchart for illustrating the processing of memorizing effective data in the storage means based on the value of the detection signal.
Fig.18 illustrates the storage structure of a memory card in which a picture is stored based on the value of the detection signal.
Fig.19 is a flowchart for illustrating the processing of memorizing effective data in the storage means based on information input from outside.
Fig.20 is a flowchart for illustrating the processing of memorizing effective data in the storage means based on the innate information.
Fig.21 is a flowchart for illustrating the operation of reading out picture data stored in a memory card.
Fig.22 illustrates taking out data stored in a memory card by the pet type robot, from the memory card, on a personal computer.
Fig.23 is a front view showing a monitor displaying, on a personal computer, a picture stored in the memory card by a browser, which is browsing software.
Fig.24 illustrates a picture captured on the memory card when the pet type robot feels fear at an obstacle lying directly before it, with the corresponding output value of the feeling model exceeding a threshold value.
Fig.25 illustrates the function of the browser, showing that the pictures stored on the memory card can be displayed as a picture album.
Fig.26 illustrates that the sentences displayed in combination with the pictures in the picture album are formulated into a database.
Fig.27 illustrates the function of the browser, showing that diurnal changes of the feeling output of the feeling model of the pet type robot can be demonstrated.

Best Mode for Carrying out the Invention

Referring to the drawings, a best mode for carrying out the invention will be explained in detail. The illustrated embodiment of the present invention is an application of the invention to a pet type robot.

The pet type robot embodying the present invention is configured as shown for example in Fig.1. This pet type robot 1 is made up of legs 2a, 2b, 2c and 2d, driven for movement, a head 3 housing a CCD (charge coupled device) video camera 11 (Fig.2), and a trunk 4. The pet type robot 1 behaves and changes its feeling in accordance with a program determining its own behavior based on extraneous and innate factors. It is noted that the program which determines its own behavior is constructed by a behavioral model or a feeling model. Specifically, the pet type robot 1 is configured to walk autonomously in association with the inputs from various sensors, such as a touch sensor 20 of Fig.2 as later explained, based on the program determining its own behavior. The pet type robot 1 is provided with a PC card slot 14 for loading a memory card 13.
This pet type robot 1 is able to write pre-set data on the memory card 13 loaded in the PC card slot 14 depending on conditions. It is noted that the conditions for writing data on the memory card 13 may be based on a behavior command issued by the behavioral model, on the feeling information issued by the feeling model, on the dialog with a user (keeper), on the results of detection of the external states, or on the results of detection of the internal state caused by innate factors.

The pet type robot 1 is constructed by having its various components electrically connected to one another as shown for example in Fig.2. The picture data picked up by a CCD video camera 11 is sent to a signal processing unit 12. This signal processing unit 12 processes the picture data routed from the CCD video camera 11 and memorizes the picture data, over an internal bus 18, in the memory card 13 or a DRAM (dynamic random access memory) 16 as memorizing means. A CPU (central processing unit) 15 reads out the operating program stored in a flash ROM (read-only memory) 17 over the internal bus 18 to control the entire system. The operating program of the CPU 15, stored in the flash ROM 17, can be formulated or modified by an external personal computer (PC) 31 connected to the signal processing unit 12. This CPU 15 has, as its functions, a write control function of writing data in the memory card 13 or the DRAM 16 as memory means, an erasure control function of erasing data written in the memory card 13 and in the memory means, and a re-arraying function of re-arraying the data written in the memory means based on the data annexed to that data.
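The three CPU functions just named (write control, erasure control and re-arraying by annexed data) could be modelled along these lines. This is a hypothetical sketch; the `MemoryCard` class and its method names are invented for illustration and do not appear in the patent.

```python
class MemoryCard:
    """Illustrative model of the CPU's three storage functions:
    write control, erasure control, and re-arraying by annexed data."""
    def __init__(self):
        self.records = []   # each record: (annexed_data, payload)

    def write(self, annexed, payload):            # write control function
        self.records.append((annexed, payload))

    def erase(self, predicate):                   # erasure control function
        self.records = [r for r in self.records if not predicate(r[0])]

    def rearrange(self, key):                     # re-array by annexed data
        self.records.sort(key=lambda r: key(r[0]))

card = MemoryCard()
card.write({"time": 3, "feeling": "anger"}, "img_a")
card.write({"time": 1, "feeling": "pleasure"}, "img_b")
card.rearrange(key=lambda a: a["time"])           # oldest first
card.erase(lambda a: a["feeling"] == "anger")     # drop the anger pictures
```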
The signals detected by potentiometers 19a to 19d, a touch sensor 20 and a microphone 21, making up detection means for detecting the external state, are routed through branching portions 24a to 24e to the signal processing unit 12, which routes the signals sent from the branching portions 24a to 24e over the internal bus 18 to the CPU 15. The CPU 15 controls the operation of the actuators 22a to 22d, and hence the legs 2a to 2d and head 3 driven thereby, based on the supplied signals. The CPU 15 also controls the speech outputted from the speaker 23. It is noted that the potentiometers 19a to 19d, touch sensor 20, microphone 21, actuators 22a to 22d and the speaker 23 constitute the legs, ears and mouth of the pet type robot 1 and are collectively termed a CPC (configurable physical component) device.

Fig.3 shows an illustrative structure of the signal processing unit 12. A DRAM interface 41, a host interface 42 and a ROM interface 43 are connected to the DRAM 16, the CPU 15 and the flash ROM 17, respectively, while being connected to an external bus 44. A bus controller 45 controls the external bus 44, whilst a bus arbiter 46 arbitrates between the external bus 44 and an internal bus 47. To a parallel port 48 and a serial port 50 is connected a personal computer (PC) 31 as an external development environment. A battery manager 49 manages the residual capacity of a battery, not shown. The parallel port 48, battery manager 49 and the serial port 50 are connected over a peripheral interface 53 to the internal bus 47. The CCD video camera 11 furnishes the picked-up picture data to a filter bank (FBK) 56, which then thins out the supplied picture data to formulate picture data of variable resolutions. These picture data are routed over the internal bus 47 to a direct memory access (DMA) controller 51. The DMA controller 51 transfers the furnished picture data to the DRAM 16 for storage therein.
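The thinning performed by the filter bank, producing picture data of several resolutions from one captured frame, amounts to simple decimation. Below is a minimal sketch, with invented function names and plain lists of pixel values standing in for real picture data; the actual FBK 56 is hardware and its exact filtering is not described here.

```python
def thin_out(picture, step):
    """Keep every `step`-th row and column (plain decimation), the way
    the filter bank is described as thinning out the picture data."""
    return [row[::step] for row in picture[::step]]

def filter_bank(picture, steps=(1, 2, 4)):
    # formulate picture data of several resolutions from one frame
    return [thin_out(picture, s) for s in steps]

full = [[x + 10 * y for x in range(8)] for y in range(8)]  # dummy 8x8 frame
levels = filter_bank(full)
# levels[0] is 8x8 (full), levels[1] is 4x4, levels[2] is 2x2
```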
The DMA controller 51 causes the picture data stored in the DRAM 16 to be read out and routed to an IPE (inner product engine) 55. The IPE 55 executes pre-set calculations using the furnished picture data. The calculated results are transferred to the DRAM 16, in accordance with commands from the DMA controller 51, for storage therein. To a USB (universal serial bus) host controller 57 is connected a CPC device 25, which is made up of, for example, the potentiometers 19a to 19d, touch sensor 20, microphone 21, actuators 22a to 22d and the speaker 23. The speech data furnished from the CPC device 25 are furnished via the USB host controller 57 to a DSP (digital signal processor) 52, which then executes pre-set processing on the furnished speech data. To the USB interface 58 is connected the personal computer (PC) 32 as an external development environment. A timer 54 routes time information to the respective components over the internal bus 47.

The above is the structure of the pet type robot 1. The behavioral model or the feeling model of the pet type robot 1 is changed based on the extraneous or innate factors, and the pet type robot behaves responsive to an output of the behavioral model or the feeling model.
A feeling model 64 of the pet type robot 1 is constructed as shown for example in Fig.4. The first to third sensors 61 to 63 detect stimuli applied from outside, such as from the environment, and convert the stimuli into electrical signals, which are outputted. These electrical signals are sent to first and second input evaluation units 71, 72. It is noted that the first to third sensors 61 to 63 are comprised not only of the potentiometers 19a to 19d, touch sensor 20 and microphone 21, but also of a speech recognition sensor, a picture color recognition sensor and so on, and convert the actions of the user in taking care of the robot 1, or the speech he or she enunciates, into electrical signals, which are outputted. Outputs of the first to third sensors 61 to 63 are routed to the first and second input evaluation units 71, 72.

The first input evaluation unit 71 evaluates the electrical signals furnished from the first to third sensors 61 to 63 to detect a pre-set feeling. This pre-set feeling may, for example, be the feeling of pleasure. The first input evaluation unit 71 sends an evaluation value of the detected feeling to a first feeling module 73. To the first feeling module 73 is allocated a pre-set feeling such that the feeling parameter is increased or decreased based on the evaluated feeling value furnished by the first input evaluation unit 71. If, for example, "pleasure" is allocated to the first feeling module 73, the parameter of the feeling "pleasure" is increased or decreased based on the evaluated value of the feeling supplied from the first input evaluation unit 71. The first feeling module 73 sends the feeling parameter to an output selection unit 75.
Similarly, a second input evaluation unit 72 evaluates the electrical signals furnished from the first to third sensors 61 to 63 to detect a pre-set feeling. The pre-set feeling here is, for example, the feeling of anger. The second input evaluation unit 72 sends the detected evaluation value of the feeling to a second feeling module 74. To the second feeling module 74 is allocated a pre-set feeling such that the feeling parameter is increased or decreased based on the evaluated feeling value furnished by the second input evaluation unit 72. If, for example, "anger" is allocated to the second feeling module 74, the parameter of the feeling "anger" is increased or decreased based on the evaluated value of the feeling supplied from the second input evaluation unit 72. The second feeling module 74 sends the feeling parameter to the output selection unit 75.

The output selection unit 75 checks whether or not the feeling parameters supplied from the first and second feeling modules 73, 74 exceed a pre-set threshold value, and outputs any feeling parameter exceeding the threshold value. If both feeling parameters from the first and second feeling modules 73, 74 exceed the threshold value, the output selection unit 75 selects the larger one and outputs the selected parameter.

A behavior generator 65 converts the feeling supplied from the output selection unit 75 into a command instructing a specified behavior, routes the command to an output unit 66, and feeds the command back to an output evaluation unit 76. The output evaluation unit 76 evaluates the behavior supplied from the behavior generator 65 and, once the behavior is performed, performs control to decrease the feeling parameter corresponding to the behavior. The output unit 66 makes an output consistent with a behavior command from the behavior generator 65.
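The pipeline just described (input evaluation raising a feeling parameter, output selection passing only parameters above a threshold, the largest if several qualify, and output evaluation lowering the parameter once the behavior is performed) might be sketched as follows. All class names, method names and numeric values here are illustrative assumptions, not part of the patent.

```python
class FeelingModule:
    """One feeling (e.g. "pleasure" or "anger") whose scalar parameter is
    raised by input evaluation and lowered after the behavior is done."""
    def __init__(self, name):
        self.name = name
        self.parameter = 0.0

    def evaluate_input(self, value):
        self.parameter += value          # input evaluation unit 71/72

    def relieve(self, amount):           # output evaluation unit 76
        self.parameter = max(0.0, self.parameter - amount)

def select_output(modules, threshold):
    """Output selection unit 75: of the parameters over the threshold,
    pass the largest one; pass nothing if none qualifies."""
    over = [m for m in modules if m.parameter > threshold]
    return max(over, key=lambda m: m.parameter) if over else None

pleasure, anger = FeelingModule("pleasure"), FeelingModule("anger")
anger.evaluate_input(0.9)                # e.g. "red", "fa", "hitting"
pleasure.evaluate_input(0.3)
winner = select_output([pleasure, anger], threshold=0.5)
if winner:                               # behavior generator -> output unit
    winner.relieve(0.6)                  # barking releases the anger
```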
The output unit 66 issues an output of the pet type robot 1, which then behaves in accordance with a behavior command from the behavior generator 65. That is, the output unit 66 is made up of the actuators 22a to 22d and the speaker 23, driving the components such as the legs 2a to 2d, head 3 or the trunk 4, and drives pre-set actuators to turn the head 3 or issue a whining or meowing sound.

The pet type robot 1 performs the behavior in this manner based on the feeling parameter of the feeling model. In addition, the pet type robot 1 is able to write pre-set data in the storage means based on the feeling parameters. When the pet type robot 1 has performed such behavior expressing the feeling, it writes the surrounding picture and sound, as external states, in the storage means. It is noted that the picture is captured by the CCD video camera 11 as external inputting means, forming a part of the detection means detecting the external state, whilst the sound is captured by the microphone 21 as external inputting means.

In the following explanation, it is assumed that "pleasure" and "anger" are allocated to the first and second feeling modules 73, 74, respectively. It is also assumed that the first sensor 61, second sensor 62 and the third sensor 63 are a picture color recognizing sensor, a sound recognizing sensor and the touch recognizing sensor 20, respectively.
When fed from the picture color recognizing sensor (first sensor) 61, the sound recognizing sensor (second sensor) 62 and the touch recognizing sensor (third sensor) 20 with electrical signals associated with "yellow", electrical signals corresponding to a pre-set frequency, such as "re", and electrical signals corresponding to the "caressing" state, respectively, the first input evaluation unit 71 evaluates the respective signals to determine the evaluation value for "pleasure". The first input evaluation unit 71 routes the evaluation value for "pleasure" to the first feeling module 73. The first feeling module 73 increases the feeling parameter based on the evaluation value for "pleasure". The feeling parameter is routed to the output selection unit 75.

When fed from the picture color recognizing sensor (first sensor) 61, the sound recognizing sensor (second sensor) 62 and the touch recognizing sensor (third sensor) 20 with electrical signals associated with "red", electrical signals corresponding to a pre-set frequency, such as "fa", and electrical signals corresponding to the "hitting" state, respectively, the second input evaluation unit 72 evaluates the respective signals to determine the evaluation value for "anger". The second input evaluation unit 72 routes the evaluation value for "anger" to the second feeling module 74. The second feeling module 74 increases the feeling parameter based on the evaluation value for "anger". The feeling parameter is routed to the output selection unit 75.

The output selection unit 75 checks whether or not the feeling parameter supplied from the first or second feeling module 73, 74 exceeds a pre-set threshold value. It is assumed here that the feeling "anger" exceeds the threshold value.
The behavior generator 65 converts the feeling parameter supplied from the output selection unit 75 into a command instructing a specified behavior (barking) and routes the command to the output unit 66, while causing the command to be fed back to the output evaluation unit 76. The output unit 66 issues an output in accordance with the behavior command (barking) from the behavior generator 65; that is, the output unit 66 outputs the corresponding sound. The "anger" is released by the pet type robot 1 barking, so that its feeling of "anger" is suppressed. In consideration of this, the output evaluation unit 76 decreases the feeling parameter of the second feeling module 74.

Meanwhile, the above-mentioned output of the feeling model 64 is the feeling parameter differentiated with respect to time. That is, the larger the variation in the feeling parameter, the larger becomes the output of the feeling model 64. For example, even if the feeling parameter "anger" of the pet type robot 1 is of a larger magnitude, the feeling parameter "pleasure" is rapidly changed (increased) when the robot views the yellow ball it likes. In this case, the picture data captured from the CCD video camera 11 is verified by the pet type robot 1 as being valid picture data, and is stored in memory means such as the memory card 13.

The above is the explanation of the feeling model of the pet type robot 1. The behavioral model, for determining the behavior of the pet type robot 1 based on the various information, is hereinafter explained with reference to Fig.5. The behavioral model determines the behavioral output causing the operation of the pet type robot 1 from a sensor input, as shown in Fig.5. The sensor input is an input from the sensors for acquiring external information, such as the potentiometers 19a to 19d of the CPC device 25. This behavioral model M3 has a table of plural transition states having different objectives for behavior, as subsystems.
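Since the feeling model's output is described as the feeling parameter differentiated with respect to time, the decision to store a picture can be sketched as a rate-of-change test. The function name, sample trace and threshold below are invented for illustration; only the trigger principle comes from the text.

```python
def should_store(prev, curr, dt, rate_threshold):
    """Valid-picture test: store the frame when the feeling parameter
    is changing quickly enough (large time derivative)."""
    return abs(curr - prev) / dt > rate_threshold

captured = []
# "pleasure" parameter sampled once per step; it jumps when the
# robot sees the yellow ball it likes
trace = [0.10, 0.12, 0.15, 0.80]
for i in range(1, len(trace)):
    if should_store(trace[i - 1], trace[i], dt=1.0, rate_threshold=0.3):
        captured.append(f"frame_{i}")   # write the CCD frame to the card
```

Only the frame coinciding with the rapid jump is kept, which is exactly how the text argues meaningless data is kept out of the memory card.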
Specifically, referring to Fig.6, the subsystem includes a system management F 1, having system management as the objective for behavior, a posture management F 2, having posture management as the objective for behavior, and an obstruction evasion F 3, having obstruction evasion as the objective for behavior. The behavioral model M 3 also includes a reflection F 4, having reflective movement as the objective for behavior, a feeling expression F 5, having feeling expression as the objective for behavior, and an autonomous behavior F 6 in general, having autonomous behavior in general as the objective for behavior. The behavioral model M 3 also includes a game F 7, having game playing as the objective for behavior, a performance F 8, having performance as the objective for behavior, a soccer F 9, having the soccer operation as the objective for behavior, and a recording F 10, having data saving as the objective for behavior. The behavioral model M 3 determines a behavioral output transferring from the current state to the targeted state based on the above-described status transition table. The status transition table attaches priority to the respective states, which are related with one another, so that the behavior will be executed in the order of the priority sequence. In the present instance, the priority increases in the sequence of the recording F 10, soccer F 9, performance F 8, game F 7, autonomous behavior F 6, feeling expression F 5, reflection F 4, obstruction evasion F 3, posture management F 2 and system management F 1. Thus, the system management F 1, posture management F 2, obstruction evasion F 3, reflection F 4, feeling expression F 5, autonomous behavior F 6 in general, game F 7, performance F 8, soccer F 9 and recording F 10 are executed in this sequence of priority responsive to the sensor input from the CPC device 25. Also, in making the behavioral output, this behavioral model M 3 refers to the feeling value (feeling parameter) as an output signal of the feeling model and to an instinct value (instinct parameter) as an output signal of the instinct model, as shown in Fig.5. In the feeling model M 1, the feeling parameter is increased and decreased responsive to the input evaluation value based on the sensor input from the CPC device 25, while also being increased and decreased responsive to the output evaluation value obtained on executing the behavior. That is, the feeling parameter of the feeling model M 1 is updated based on the input evaluation and on the output evaluation. Meanwhile, the feeling model M 1 includes both the feeling due to reaction to an input from the extraneous field or due to the innate status, and the feeling changed with lapse of time. Specifically, it includes grief, fear, surprise and hate, in addition to the aforementioned pleasure and anger. In the instinct model M 2, the instinct parameter is increased and decreased responsive to the input evaluation value based on the sensor input from the CPC device 25, while also being increased and decreased responsive to the output evaluation value obtained on executing the behavior. That is, the instinct parameter of the instinct model M 2 is updated based on the input evaluation and on the output evaluation. Meanwhile, the instinct model M 2 is mainly derived from the innate state and changes gradually. It is a model based mainly on desires, such as appetite and the desires for exercise, rest, love, knowledge and sex. For example, the instinct model such as appetite can be obtained by having reference to the residual battery capacity.
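The priority scheme described above, in which the subsystems F 1 to F 10 are tried in a fixed order, can be sketched as follows. The subsystem names follow the embodiment; the activation conditions and sensor inputs are hypothetical.

```python
# Hypothetical sketch of the priority scheme: the subsystems are tried
# in a fixed order, from system management (highest priority) down to
# recording (lowest), and the first one whose activation condition
# holds produces the behavioral output.

SUBSYSTEMS = [  # highest priority first
    "system_management", "posture_management", "obstruction_evasion",
    "reflection", "feeling_expression", "autonomous_behavior",
    "game", "performance", "soccer", "recording",
]

def select_subsystem(sensor_input, conditions):
    """Return the highest-priority subsystem whose condition holds."""
    for name in SUBSYSTEMS:
        if conditions.get(name, lambda s: False)(sensor_input):
            return name
    return None

# Example: an obstacle is detected, so obstruction evasion wins even
# though the soccer condition also holds.
conditions = {
    "obstruction_evasion": lambda s: s.get("obstacle", False),
    "soccer": lambda s: s.get("ball_visible", False),
}
print(select_subsystem({"obstacle": True, "ball_visible": True}, conditions))
# -> obstruction_evasion
```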
The ultimate behavioral output is made by the behavioral model M 3 with reference to the feeling value, representing the feeling parameter changed with the input evaluation value and the output evaluation value, or to the instinct value, representing the instinct parameter. A behavior selection module 81 controls the CPC device 25 so that the operation will be consistent with the objective of the behavior given by the behavioral output of the behavioral model M 3, causing movements of the limbs, head and tail to complete the targeted action. This action yields the aforementioned output evaluation value, which is fed back to the feeling model M 1 and to the instinct model M 2.
As for the status transition table, the principle of the algorithm termed the probability automaton, which determines the state of probabilistic transition based on the transition probability, is used. Referring to Fig.7, the principle of the algorithm of the probability automaton is explained. In the algorithm termed the probability automaton, n states, where n is an integer, are represented by nodes NODE 0 to NODE n, and whether transition occurs from a node NODE 0 to one of the other nodes NODE 1 to NODE n is probabilistically determined based on the transition probabilities P 1 to P n set respectively for the arcs ARC 1 to ARC n interconnecting NODE 0 to NODE n. An arc previously defines the states realized in the device (pet type robot 1) and indicates the operation of the device during transitions between the respective states, in order to cause the transition of the operations of the device between the defined states. By applying the algorithm of the probability automaton to the status transition table, the following node may be determined, if the current state is the first node NODE 0, based on the current status and on the information for status transition, such as the sensor input of the CPC device 25. Meanwhile, the behavioral model is not limited to producing a behavioral output based on the status transition table; other measures may be taken. For example, a behavioral model can be constructed using a neural network, by having reference to the information processing mechanism of a neural network. The neural network is constructed by an input layer 91, an intermediate layer 92 and an output layer 93.
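The probability automaton described above can be sketched as follows. The node names follow the embodiment; the transition probabilities are assumed values chosen only so that each node's outgoing arcs sum to one.

```python
import random

# Hypothetical sketch of the probability automaton: from the current
# node, the next node is drawn according to the transition
# probabilities attached to the outgoing arcs.

TRANSITIONS = {  # node -> list of (next_node, probability); assumed values
    "NODE0": [("NODE1", 0.7), ("NODE2", 0.3)],
    "NODE1": [("NODE0", 1.0)],
    "NODE2": [("NODE0", 0.5), ("NODE2", 0.5)],
}

def next_node(current, rng=random):
    """Probabilistically choose the next node from the current one."""
    arcs = TRANSITIONS[current]
    r, cumulative = rng.random(), 0.0
    for node, p in arcs:
        cumulative += p
        if r < cumulative:
            return node
    return arcs[-1][0]  # guard against floating-point round-off

random.seed(0)
print(next_node("NODE0"))
```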
For example, if such a neural network is applied to the behavioral model of the pet type robot 1, the behaviors A 1, A 2, ..., A k, as outputs of the output layer 93, where k is an integer, are determined by the sensor input of the CPC device 25, as the information of the inner state or the information of the outer state, through the input layer 91 and the intermediate layer 92. Also, in the neural network, weighted learning is executed so that the expected results of behavior will be obtained from the expected input (information of the inner state and the sensor input). In this manner, the pet type robot 1 is operated for expressing the feeling, or takes a behavioral action, by the feeling model and the behavioral model. Meanwhile, the feeling model of the pet type robot 1 has been explained as determining the behavior responsive to the feeling parameter. However, as for the operation based on the feeling model of the pet type robot 1, status transition may be caused to occur by having reference to the status transition table in the behavioral model, responsive to the feeling parameter and the prevailing status. Specifically, the pet type robot 1 is made up of a software layer and a hardware layer. Fig.9 shows the software layer and the hardware layer making up the pet type robot 1. The software layer is constituted by a behavior generating module set 101, a recognition module set 102, a behavioral module set 103, a virtual robot 104 and a file system 105. The hardware layer is constituted by a robot hardware 106, constituting the main body portion of the pet type robot 1, and a memory card 13 as storage means that can be mounted on or dismounted from the pet type robot 1. The recognition module set 102 is fed with picture data, sound information or contact information as the sensor information of the CPC device 25.
On recognizing the information to be notified from the sensor information, the recognition module set 102 outputs the information on the results of recognition to the behavior generating module set 101. That is, the recognition module set 102 recognizes which information the sensor information is associated with, and outputs the results of recognition to the behavior generating module set 101. The behavior generating module set 101 is a module set for generating the behavior of the pet type robot 1, and initiates the targeted behavior of the pet type robot 1 based on the results of recognition from the recognition module set 102. It is through the behavioral module set 103 that the behavior generating module set 101 controls the CPC device 25 to initiate the targeted behavior, such as an action employing the limbs, head or tail, sound outputting or data storage in memory means. The robot hardware 106 is constituted by, e.g., this CPC device 25. Moreover, control of the robot hardware 106 by the behavioral module set 103 is through the virtual robot 104. The virtual robot 104 is an imaginary robot which is the substitution of the real pet type robot 1 in the software. That is, the real pet type robot 1 is monitored in the software by the virtual robot 104, and the operation of the real pet type robot 1 is controlled based on the virtual robot 104. That is, the limbs, head or tail of the virtual robot 104 are operated, or the sound is radiated, by an output of the behavioral module set 103, to perform the corresponding control of the robot hardware 106 of the real pet type robot 1. The file system 105 writes or reads out data to or from the memory card 13. Specifically, the file system 105 writes or reads out the data to or from the memory card 13 under write or readout control by the behavioral module set 103. The above is the constitution of the portions of the pet type robot 1 responsible for its feeling and behavior.
By the behavioral model or the feeling model, constructed as explained above, the pet type robot 1 operates responsive to changes in the extraneous factor ascribable to the extraneous state or to changes in the innate factor ascribable to the innate state. The pet type robot 1 is constructed to store picture or sound data as pre-set data in memory means, such as the memory card 13 or the DRAM 16, responsive to the operation by the behavioral model or the feeling model or to other conditions. The processing for storing data in the memory means in the pet type robot 1 is hereinafter explained. Specifically, the processing is explained for the cases in which data is to be stored based on outputs of the behavioral model or the feeling model, in which data is to be stored based on the results of direct detection of the external state, in which data is to be stored based on the inputting of pre-set information from outside, and in which data is to be stored based on the internal state. It is first assumed that data is to be stored in the memory means based on the output of the feeling model.
Referring to Fig. 10, the CPU 15 verifies whether or not a detection signal as a sensor output of the CPC device 25 has been detected. The CPU 15 at step S1 executes the decision processing as to whether or not a detection signal as a sensor input to the CPC device 25 has been detected. If, at step S1, the detection signal has been found to be detected, the CPU 15 advances to step S2. At step S2, the feeling information (feeling parameter) of the pre-set feeling model corresponding to the detection signal is generated responsive to the value of the detection signal. The processing at this step S2 corresponds to the outputting of the feeling model explained in connection with Fig.4. At step S3, the CPU 15 checks whether or not the feeling parameter is a specified feeling parameter (feeling information). For example, it is determined whether or not the feeling parameter reaches a pre-set value. If the CPU 15 has found that the feeling parameter is not the specified feeling parameter, the CPU 15 again performs the processing from step S1. If the CPU 15 has found that the feeling parameter is the specified feeling parameter, the CPU 15 advances to step S4. At step S4, the CPU 15 performs the operation corresponding to the feeling information and causes data to be stored in the memory means. The pet type robot 1 is responsive to a detection signal indicating the external state, as explained above, to output the feeling information from the feeling model and to cause data to be stored in the memory means. The specific processing downstream of the outputting of the feeling model responsive to the detection signal, to which conditions for verification are annexed, is now explained by referring to the flowchart of Fig.11. First, at step S11, the CPU 15 checks whether or not an output value of the feeling model 64 (feeling parameter) has reached a pre-set threshold. Specifically, the CPU 15 checks whether or not the output value is larger than a pre-set threshold value.
If it is decided at step S11 that the output value of the feeling model 64 has not exceeded the pre-set threshold value, the CPU 15 reverts to step S11. If, at step S11, the output value of the feeling model 64 is found to exceed the pre-set threshold, the CPU 15 advances to step S12. At step S12, the CPU 15 checks whether or not there is any vacant area in the memory card 13. If, at step S12, it is found that there is a vacant memory area, the CPU 15 advances to step S13 to cause the picture data captured from the CCD video camera 11 to be stored in the vacant area of the memory card 13. The CPU 15 then causes the time and date data and the feeling parameter to be stored, in association with the picture data, as the characteristic information of the picture data. At step S14, the CPU 15 re-arrays the picture data in the order of decreasing magnitudes of the output of the feeling model 64. The CPU 15 then reverts to step S11. That is, the memory area of the memory card 13 is made up of a header 111, memorizing the time and date data and the feeling parameter as the characteristic information, and a picture data portion 112, memorizing the picture data, as shown in Fig. 12. The CPU 15 sorts the picture data in the order of the decreasing magnitude of the feeling output. The re-arraying of the picture data at step S14 occurs by the re-arraying function of the CPU 15, which re-arrays the pre-set data written in the memory means in accordance with the information corresponding to the pre-set data. If it is found at step S12 that there is no vacant memory area, the CPU 15 advances to step S15, where the CPU 15 checks whether or not the current output value of the feeling model 64 is larger than the smallest value of the feeling output accompanying the picture data memorized in the memory card 13. That is, the CPU 15 checks whether or not the current output value is larger than the value of the feeling output arrayed at the lowermost row in Fig. 12.
If it is found at step S15 that the current output value is not larger than the smallest value of the memorized feeling output, the CPU 15 reverts to step S11. If it is found at step S15 that the current output value is larger than the smallest value of the memorized feeling output, the CPU 15 advances to step S16, where the CPU 15 erases the picture data corresponding to the smallest value of the feeling output. The picture data erasure is performed by the erasure control function of the CPU 15, which erases pre-set data having the characteristic information appended thereto from the memory means. The CPU 15 then advances to step S13 to cause storage of the then prevailing feeling output. This causes the feeling output to be stored sequentially in the order of decreasing magnitude of the feeling output. By the above processing, the pet type robot 1 is able to refer to the feeling information of the feeling model to cause the data to be stored in the memory means. The pet type robot 1 may also be responsive to the behavior information of the behavioral model to cause data to be stored in the memory means. In this case, the CPU 15 at step S21 checks whether or not the detection signal corresponding to the sensor input of the CPC device 25 is being detected, as shown in Fig. 13. The CPU 15 performs the decision processing at step S21 until detection of the detection signal. If it is found at step S21 that the detection signal has been detected, the CPU 15 advances to step S22. At step S22, the behavior command of the behavioral model is generated in association with the detection signal. The processing at this step S22 corresponds to the behavior output consistent with the status transition table explained in connection with Fig.5. At step S23, the CPU 15 checks whether or not the behavior command is a particular behavior command. If it is found that the behavior command is not a particular behavior command, the processing is performed again from step S21.
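The storage procedure of Figs. 10 to 12 can be sketched as follows: records are kept sorted by decreasing feeling value, and when the card is full, the record with the smallest feeling value is evicted if the new value is larger. The capacity, threshold and data values are assumptions for illustration.

```python
# Hypothetical sketch of the storage procedure: when the feeling output
# exceeds a threshold, the picture is stored with its date and feeling
# value as characteristic information (the header of Fig. 12).

CAPACITY = 3  # assumed number of picture slots on the memory card

def store_picture(card, picture, date, feeling, threshold=0.5):
    """Store picture data with its characteristic information."""
    if feeling <= threshold:
        return False                       # step S11: below threshold
    if len(card) >= CAPACITY:              # step S12: no vacant area
        if feeling <= card[-1]["feeling"]:
            return False                   # step S15: not above the minimum
        card.pop()                         # step S16: erase the smallest
    card.append({"picture": picture, "date": date, "feeling": feeling})
    card.sort(key=lambda r: r["feeling"], reverse=True)  # step S14
    return True

card = []
for i, f in enumerate([0.9, 0.6, 0.7, 0.4, 0.8]):
    store_picture(card, f"img{i}", f"2024-01-0{i + 1}", f)
print([r["picture"] for r in card])  # -> ['img0', 'img4', 'img2']
```

The picture with feeling value 0.4 never clears the threshold, and when 0.8 arrives on a full card it evicts the weakest stored record (0.6).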
If it is found that the behavior command is a particular behavior command, the CPU 15 advances to step S24. At step S24, the CPU 15 performs the operation consistent with the behavior command and causes the data to be stored in the storage means. The pet type robot 1 is responsive to a detection signal specifying the external state, as explained above, to output a pre-set behavior command from the behavioral model and to perform the operation consistent with the behavior command, causing data to be stored in the memory means. The pet type robot 1 may also be responsive to the instinct information of the instinct model to cause data to be stored in memory means. In this case, the CPU 15 at step S81 checks whether or not a detection signal corresponding to the sensor input of the CPC device 25 is being detected. The CPU 15 performs the decision processing at step S81 until detection of the detection signal. If it is found at step S81 that the detection signal has been detected, the CPU 15 advances to step S82. At this step S82, the CPU 15 generates the instinct information of the instinct model responsive to the detection signal. The processing at this step S82 corresponds to the behavior output consistent with the status transition table explained in connection with Fig.5. That is, the behavior output is determined by having reference to the instinct information, with the pet type robot 1 taking a behavioral action consistent with the instinct through the intermediary of the behavior output. At the next step S83, the CPU 15 verifies whether or not the instinct information is particular instinct information. If the CPU 15 finds that the instinct information is not the specified instinct information, it performs the processing from step S81 again. If the CPU 15 finds that the instinct information is the specified instinct information, it advances to step S84.
At this step S84, the CPU 15 performs the operation consistent with the instinct information, whilst causing the data to be stored in the memory means. That is, data erasure or re-arraying can be performed, as explained with reference to the flowchart of Fig. 11 with respect to the above-described feeling model. The pet type robot 1 outputs the information from the behavioral model and the instinct model, responsive to the detection signal indicating the extraneous state, and performs the operation consistent with the information to cause data to be stored in the memory means. By having the output of, e.g., the behavioral model as the data acquisition condition, the pet type robot 1 is able to cause data to be stored in the memory means. The pet type robot 1 is able to write data in the memory means responsive to the operation of status transition by the status transition table. For example, in case the status (node) is able to transfer between a sleeping state st1, a walking state st2, a sitting state st3 and a barking state st4, as shown in Fig. 15, the pet type robot 1 can transfer from a given state to another state responsive to a behavior command, while causing data to be stored in the memory means. For example, data may be stored when the status transfers from the walking state st2 to the sleeping state st1. This allows picture data to be written in the memory card 13 as data directly previous to sleeping. By making the value of the anger feeling output at that time a transition condition of the status transition table, and by outputting the behavior of inputting a picture photographed by the CCD video camera 11 and inputting the speech through a microphone 21, the data writing operation to the memory means can be allocated to within the anger feeling operation. Thus, the pet type robot 1, which has had its head struck violently and felt angry, can record a picture of the person who struck it, and that person's abusive speech, on the recording means.
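The allocation of a data-writing operation to a particular status transition, such as from the walking state st2 to the sleeping state st1, can be sketched as follows. The state names follow Fig. 15; the action table and the storage stand-in are hypothetical.

```python
# Hypothetical sketch: a data-writing action is attached to the
# transition from the walking state to the sleeping state, so a picture
# is stored directly previous to sleeping.

store_log = []

def store_picture(tag):
    store_log.append(tag)  # stand-in for writing to the memory card 13

# (from_state, to_state) -> action allocated to that transition
TRANSITION_ACTIONS = {
    ("walking", "sleeping"): lambda: store_picture("before_sleep"),
}

def transfer(current, target):
    """Transfer states, running any action allocated to the transition."""
    action = TRANSITION_ACTIONS.get((current, target))
    if action:
        action()
    return target

state = "walking"
state = transfer(state, "sleeping")
print(state, store_log)  # -> sleeping ['before_sleep']
```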
Also, if the picture is captured with the obstruction detection by the sensor and the pet type robot 1 feeling fear as the transition condition, a picture of what the pet type robot 1 feels fear of, such as the step or height difference directly before it, is obtained. Since the picture is stored with the line of sight of the pet type robot 1 as a reference, a user who reproduces the picture is able to see the picture as if it were a steep cliff, from the line of sight of the pet type robot 1. By providing a number of status transitions for data stored in the storage means based on the outputs of the behavioral model or the feeling model, a variety of data can be captured in the storage means. In the embodiment explained using Fig. 11, reference is had to the feeling parameter of the characteristic information as a condition for erasing data at steps S15 and S16. However, the present invention is not limited to this configuration. For example, it is possible to have reference to the date and time data of the characteristic information to determine the data erasure based on the decision as to whether or not a pre-set time has elapsed. In this case, data for which the pre-set time has elapsed can be erased based on the date and time data. The case in which data is stored in the storage means based on the sensor input (detection signal) of the CPC device 25 is now explained. That is, although a behavioral model or a feeling model changed by a detection signal has been checked to store data in the storage means, the pet type robot 1 is also able to directly check the extraneous state to store data in the storage means. This is now explained with reference to the flowchart of Fig. 16. First, the CPU 15 at step S31 verifies whether or not the detection signal is a particular detection signal. For example, it is checked whether or not the value of the detection signal has reached a pre-set value.
The CPU 15 performs the decision processing at step S31 until detection of the particular detection signal. If it is found at step S31 that the particular detection signal has been detected, the CPU 15 advances to step S32, where the CPU 15 stores data corresponding to the detection signal in the storage means. The pet type robot 1 directly verifies the detection signal, as explained above, to store data in the storage means responsive to the verified results. Further details are explained with reference to the flowchart of Fig. 17. First, at step S41, the CPU 15 verifies whether or not the value of the detection signal, as detected responsive to the extraneous state by a sensor of the CPC device 25, is larger than a pre-set threshold value. If, for example, a sound is entered to the microphone 21, it is checked whether or not the value of the corresponding detection signal is larger than the pre-set threshold value. If, at step S41, the value of the detection signal is verified not to exceed the pre-set threshold value, the CPU 15 reverts to step S41. If it is found at step S41 that the value of the detection signal exceeds the pre-set threshold value, the CPU 15 advances to step S42. The case in which the value of the detection signal is found to exceed the pre-set threshold value, for example, in which the sound has been detected by the microphone 21, means that the sound is a loud sound. At step S42, the CPU 15 verifies whether or not there is any vacant area in the storage area of the memory card 13. If it has been found at step S42 that there is vacant space in the storage area, the CPU 15 advances to step S43 to store the picture data captured from the CCD video camera 11 in the vacant area of the memory card 13. At this time, the CPU 15 causes the date and time data and the value of the detection signal to be stored as characteristic information in association with the picture data.
At step S44, the CPU 15 re-arrays the picture data in the order of decreasing values of the detection signals. The CPU 15 then reverts to step S41. That is, the storage area of the memory card 13 includes a header 111, storing the date and time data and the values of the detection signals, and a picture data portion 112, storing the picture data, as shown in Fig. 18. The CPU 15 sorts the picture data in the order of decreasing magnitude of the detection signal values. If, at step S42, it has been found that there is no vacant storage area, the CPU 15 advances to step S45 to check whether or not the current value of the detection signal exceeds the minimum value of the detection signals ancillary to the picture data stored in the memory card 13. That is, the CPU 15 checks whether or not the current detection signal is larger than the value of the detection signal arranged in the lowermost position in Fig. 18. If the current value of the detection signal is verified at step S45 to be not larger than the smallest value of the stored detection signals, the CPU 15 reverts to step S41. If, at step S45, the current detection signal is verified to be larger than the smallest value of the stored detection signals, the CPU 15 advances to step S46 to erase the picture data corresponding to the smallest value of the detection signal. The CPU 15 then advances to step S43 to store the value of the detection signal. This causes the detection signals to be stored sequentially in the order of decreasing values of the detection signals in the memory card 13. By the above processing, the pet type robot 1 is able to store the data in the memory means by directly referring to the values of the detection signals. For example, if reference is had to the detection signal from the sensor of the CPC device 25 as the data storage condition, the pet type robot 1 is able to store the picture or the speech at such a time in memory means, such as the memory card 13.
Thus, if a cup has dropped and broken in the vicinity of the pet type robot 1, the resulting catastrophic state can be stored as picture and speech in the storage means, responsive to the magnitude of the sound. A more realistic picture can be acquired by causing the pet type robot 1 to swing its neck in the direction of the origin of the sound. The direction from which the sound has entered may be identified by the phase difference of the sound entering the sensor. Specifically, the behavior of the pet type robot 1 turning in the direction of the sound source is outputted, with the inputting of a loud sound to the sensor as a condition for transition in the status transition table. To the state in the destination of transition, reached when the pet type robot has swung its head under this transition condition, the behavior of storing the picture data at such a time in the memory card 13 is allocated. In this manner, if a cup has been dropped in the vicinity of the pet type robot 1, the pet type robot 1 can turn its head to the sound source responsive thereto and write the catastrophic state as a picture in the memory card 13. In the embodiment shown in Fig. 17, the value of the detection signal of the characteristic information is referred to as a condition for erasing the data at steps S45 and S46. This, however, is merely illustrative, because reference may instead be had to the date and time data of the characteristic information to determine the data erasure based on the decision as to whether or not a pre-set time has elapsed. Next, a case in which data is to be stored in memory means responsive to the inputting of pre-set information from outside is explained. In the foregoing description, the pet type robot 1 voluntarily records the information. A case in which data is recorded on the recording means by interaction (dialog) with the user (keeper) is now explained.
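The localization of a sound source by the phase (arrival-time) difference mentioned above can be sketched as follows. The embodiment does not give a formula, so this uses the standard two-microphone time-delay relation dt = d·sin(θ)/c; the microphone spacing is an assumed value.

```python
import math

# Hypothetical sketch of sound-source localization by arrival-time
# difference: with two microphones a distance d apart, the delay dt
# between them gives the bearing of the sound source.

SPEED_OF_SOUND = 343.0  # m/s, in air at room temperature
MIC_SPACING = 0.10      # m, assumed distance between the two microphones

def bearing_from_delay(dt):
    """Return the bearing (radians) of the sound source from the
    inter-microphone arrival-time difference dt (positive = right)."""
    s = dt * SPEED_OF_SOUND / MIC_SPACING
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# A sound arriving 0.1 ms earlier at the right microphone:
theta = bearing_from_delay(1e-4)
print(f"turn head by {math.degrees(theta):.1f} degrees")
```

The robot would then turn its head by this bearing before capturing the picture.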
In this case, the pet type robot 1 evaluates a detection signal entered from the sensor of the CPC device 25, and writes data in the storage means responsive to the input detection signal (command) based on the results of evaluation. Reference is had to the flowchart of Fig.19. First, the CPU 15 at step S51 verifies whether or not a detection signal has been detected. The check operation at step S51 is performed until detection of the detection signal. If it has been found at step S51 that the detection signal has been detected, the CPU 15 advances to step S52. At step S52, the CPU 15 verifies whether or not the detection signal is a pre-set command (dialog) from the keeper. The decision here is made by, for example, the aforementioned input evaluation portion. If the detection signal is verified not to be a pre-set command from the keeper, the CPU 15 again performs the processing at step S51. If the detection signal is verified to be a pre-set command from the keeper, the CPU 15 advances to step S53. At step S53, the CPU 15 causes data to be stored in the storage means in keeping with the user's command. By this dialog with the user, the pet type robot 1 is able to store data in the memory means. By this processing, data can be stored in the memory means in keeping with the status transition, with the transition condition being, for example, that the sitting pet type robot 1 has its head struck lightly twice. Specifically, data is stored in the memory means by the following processing: as an extraneous state, the touch sensor 20 as pressure measurement means is struck, and the detection signal (pressure information) outputted from the touch sensor 20 on being struck is evaluated by the above-described input evaluation portion. If the result of evaluation is that being struck twice is a pre-set command from the user, the pet type robot 1 stores the picture data or the speech data in the storage means.
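The recognition of two light strikes as a pre-set command can be sketched as follows. The embodiment does not specify how the two strikes are distinguished from unrelated ones, so the time-window criterion and its value are assumptions.

```python
# Hypothetical sketch of the dialog-triggered storage: the input
# evaluation portion watches the touch sensor, and two strikes within a
# short window are treated as the pre-set command from the keeper,
# which triggers data storage (step S53).

DOUBLE_TAP_WINDOW = 1.0  # seconds between strikes (assumed)

class TouchCommandEvaluator:
    def __init__(self):
        self.last_strike = None

    def on_strike(self, timestamp):
        """Return True when two strikes arrive within the window."""
        if (self.last_strike is not None
                and timestamp - self.last_strike <= DOUBLE_TAP_WINDOW):
            self.last_strike = None
            return True  # recognized as the "store data" command
        self.last_strike = timestamp
        return False

stored = []
evaluator = TouchCommandEvaluator()
for t in [0.0, 5.0, 5.4]:  # a lone strike, then two quick strikes
    if evaluator.on_strike(t):
        stored.append(f"picture@{t}")  # step S53: store data
print(stored)  # -> ['picture@5.4']
```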
Meanwhile, data acquisition by the pet type robot 1 through dialog is not limited to being struck, as explained above. For example, the pet type robot 1 is able to identify a command given in a pre-set language to record data. In this manner, data can be intentionally stored in the pet type robot 1 by the keeper touching the pet type robot 1 in a pre-set manner or speaking to the pet type robot 1 in a pre-set language. It is also possible to use a device for interaction with the pet type robot 1, such as a sound commander, to command the pet type robot 1 to cause data to be stored in the storage means. In this case, the pet type robot 1 can be provided with a module recognizing the sound, to induce status transition in keeping with the corresponding command by handling the result of recognition of the recognition module as a sensor input of the behavioral model, causing the data to be stored in the storage means. A case in which reference is had to the inner state of the pet type robot 1 for storage in the storage means is now explained. In the above embodiments, the pet type robot 1 writes data in the storage means based on the behavioral parameter or the feeling parameter, writes data in the storage means based on the detection signal as the result of detection of the extraneous state, or writes data in the storage means based on the inputting of pre-set information from outside. That is, in the above-described embodiments, the pet type robot 1 writes data in the memory means by extraneous factors. The pet type robot 1 is able not only to write data in the storage means based on the extraneous factors, but also to write data based on inner factors. The pet type robot 1 is able to increase its appetite by chronological changes in behavior, that is, to consume the battery capacity.
Thus, data can be stored in the storage means based on the decrease in the battery capacity, the battery capacity decrease then being the change in the inner state as the inner factor. This will now be explained with reference to the flowchart of Fig.20. At step S61, the CPU 15 verifies whether or not the pre-set inner factor (inner state) has changed by a specified amount. The CPU 15 performs the discriminating processing of step S61 until a change of the pre-set amount in the inner factor is detected. If it is found at step S61 that the inner factor has changed by the specified amount, the CPU 15 advances to step S62, where the CPU 15 causes data to be stored in the storage means. The pet type robot 1 causes data to be stored in the memory means based on these changes in the inner factor. Since the pet type robot 1 is also able to store data in the memory means when the decrease in the battery capacity has reached a pre-set value, the pet type robot 1 can cause picture data to be stored in the storage means when it is hungry. In this processing, the CPU 15 has a function of monitoring the amount of change in the inner factor and causes data to be written in the storage means based on the monitoring result by this monitor control function.
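The monitor-and-write loop of steps S61 and S62 can be sketched as follows; the battery readings, the threshold, and the capture callback are illustrative assumptions, not values from the patent.

```python
def monitor_inner_factor(battery_levels, threshold_drop, capture):
    """Sketch of steps S61-S62: write data whenever the battery
    capacity has decreased by a pre-set amount since the last write.
    battery_levels is a chronological series of readings."""
    stored = []
    last_written = battery_levels[0]
    for level in battery_levels[1:]:
        # S61: has the inner factor changed by the specified amount?
        if last_written - level >= threshold_drop:
            # S62: cause data to be stored in the storage means.
            stored.append(capture(level))
            last_written = level
    return stored

pictures = monitor_inner_factor(
    [100, 95, 88, 79, 70],
    threshold_drop=20,
    capture=lambda lv: f"picture_at_{lv}")
assert pictures == ["picture_at_79"]
```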
The processing in case data stored in the storage means by the pet type robot 1 is read out by the personal computer 31 is hereinafter explained. Specifically, the operation of processing for reading out picture data stored in the memory card 13 is explained with reference to the flowchart shown in Fig.21. First, the user extracts the memory card 13 from the PC card slot 14 and loads the memory card 13 in a card slot, not shown, in the personal computer 31, as shown in Fig.22. When the memory card 13 is loaded in the card slot, the CPU, not shown, enclosed in the personal computer 31 reads out at step S71 the picture data stored in the memory card 13, as shown in Fig.21. If the picture data is stored in the memory card 13 in association with the feeling output, the CPU reads out the picture data in the order of decreasing magnitudes of the feeling output. On the other hand, if the picture data is stored in the memory card 13 in association with the magnitude of the detection signal, the CPU reads out the picture data in the order of decreasing magnitudes of the detection signal. At step S72, the CPU re-arrays the read-out picture data in the chronological order of the date and time data and proceeds to step S73. At step S73, the CPU stores the re-arrayed picture data in a memory, not shown, to terminate the processing. This allows the user to read out picture data at any time on the personal computer 31. Therefore, the user can read out the picture data and enjoy it as an album recording the life of the pet type robot 1.
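The read-out and re-arraying of steps S71 to S73 can be sketched as below; the record fields (picture name, feeling magnitude, date-and-time stamp) are hypothetical, chosen only to make the two sort orders concrete.

```python
# Sketch of steps S71-S73: each stored picture is assumed to carry a
# feeling-output magnitude and a date-and-time stamp.
records = [
    {"picture": "P2", "feeling": 0.9, "time": "1999-09-02T10:00"},
    {"picture": "P1", "feeling": 0.4, "time": "1999-09-01T09:00"},
    {"picture": "P3", "feeling": 0.7, "time": "1999-09-03T12:00"},
]

# S71: read out in the order of decreasing feeling output.
read_out = sorted(records, key=lambda r: r["feeling"], reverse=True)
assert [r["picture"] for r in read_out] == ["P2", "P3", "P1"]

# S72-S73: re-array chronologically by the date and time data and
# keep the result as the album to be displayed.
album = sorted(read_out, key=lambda r: r["time"])
assert [r["picture"] for r in album] == ["P1", "P2", "P3"]
```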
For example, the personal computer 31 is able to read out picture data stored in the memory card 13 by a so-called browser, which is browser software stored on the furnishing medium, to demonstrate the read-out picture data on a display such as a monitor. For example, the picture data stored in the memory card 13 can be browsed by the browser as follows: The user can view the picture data written by the pet type robot 1 on the memory card 13 by executing the browser on the personal computer 31. Moreover, the browser is able to refer to the date and time data to array and display the pictures chronologically. Specifically, the user can view first to sixth pictures P1 to P6 stored by the pet type robot 1 on the memory card 13 chronologically on the personal computer 31, as shown in Fig.23. For example, the first picture P1 is a shoe placed on the porch, the second picture P2 is a kept cat, the third picture P3 is a table leg, the fourth picture P4 is someone's leg, the fifth picture P5 is the keeper's face and the sixth picture P6 is a kept dog. These first to sixth pictures P1 to P6 may be those taken when the output magnitude of the feeling model was large or when the magnitude of the detection signal was large. If one day is taken as the reference for display, diurnal events may be browsed. If every other day is taken as the reference, data can be stored for a prolonged time, permitting the pictures to be browsed over a longer span. By re-arraying the events chronologically based on the time information accompanying them, the user is able to view the events as a sort of album recording the growth or life of the pet type robot 1. Moreover, the browser is able to display the pictures recorded by the pet type robot 1 on the memory card 13 like a picture diary. For example, the pet type robot 1 memorizes a picture when an output value of the feeling model or the value of the detection signal has exceeded a certain threshold value.
For example, if the pet type robot 1 feels fear of an obstacle lying before it, and if the output value of the feeling model at that time exceeds a threshold value, it writes the picture at that time on the memory card 13. Thus, the pet type robot 1 writes a picture P10 when it has felt fear of the obstacle, as shown for example in Fig.24. Based on the decision condition accompanying the picture P10 written in this manner on the memory card 13, and on the output value of the feeling model, the browser outputs an associated sentence W, reading, for example, "Today, I was scared because there were many obstacles", to a monitor 31a along with the picture P10 for display like a picture diary. The sentence W associated with the picture P10 is selected from a database made up of plural sentences W1 to Wm, where m is an integer. An audio output may also be issued in step with the output of the picture. The browser is also able to graphically display only the changes in the output value of the feeling model. For example, the browser is able to display the changes in the output value of the "fear" or "pleasure" of the diurnal feeling model with respect to time plotted on the abscissa, as in the graph of Fig.27. This permits the user to see the joy, anger, grief or pleasure of the pet type robot 1 for a day.
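The threshold-triggered write and the diary-sentence selection described above can be sketched as follows; the threshold value, sentence database, and record fields are hypothetical illustrations, not values from the patent.

```python
# Hypothetical sketch: a picture is written to the card only when the
# feeling model's output exceeds a threshold; the browser later picks
# a diary sentence matching the feeling that triggered the write.
THRESHOLD = 0.8
SENTENCES = {  # stand-in for the database of sentences W1..Wm
    "fear": "Today, I was scared because there were many obstacles.",
    "pleasure": "Today was a fun day.",
}

def maybe_write(feeling_name, output_value, picture, card):
    # Write only when the feeling output exceeds the threshold,
    # recording the feeling as the decision condition.
    if output_value > THRESHOLD:
        card.append({"picture": picture, "feeling": feeling_name,
                     "value": output_value})

card = []
maybe_write("fear", 0.95, "P10", card)   # stored: 0.95 > 0.8
maybe_write("fear", 0.30, "P11", card)   # not stored
assert len(card) == 1 and card[0]["picture"] == "P10"
# The browser selects the sentence W from the decision condition.
assert SENTENCES[card[0]["feeling"]].startswith("Today, I was scared")
```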
In the above-described embodiment, data is mainly stored on the memory card 13. This, however, is not limitative, since data can also be memorized in the DRAM 16. In inputting a picture to the personal computer 31, data can be sent from the pet type robot 1 to the personal computer 31 using radio communication means, such as a PC card RangeLAN, or cable communication means such as USB. By using the radio or wired communication means, it is possible to view the picture data etc. captured by the pet type robot 1 in real time on the personal computer 31. It is also possible to install a computer program recorded on the recording medium (furnishing medium) to cause the pet type robot 1 to execute the aforementioned processing. The furnishing medium for supplying the computer program executing the above processing to the user may be a transmission medium on a network, such as the Internet or a digital satellite, in addition to an information recording medium, such as a magnetic disc or a CD-ROM. Industrial Applicability According to the present invention, it is possible to cause a robot apparatus to collect information autonomously. This permits the user to check the information collected by the robot apparatus with a feeling of expectation while remaining unaware of what information will be acquired. Since the information is collected under a certain condition, efficient information collection is rendered possible, while it is unnecessary to increase the recording capacity of the recording means which memorizes the information. According to the present invention, it is also possible to visualize the information as viewed by the robot apparatus to increase the friendly feeling entertained for the robot apparatus.

Claims (47)

1. A robot apparatus having a behavioral model or a feeling model changed at least based on extraneous factors, comprising: detection means for detecting extraneous states; storage means for storing data; and write control means for writing pre-set data in said storage means based on a detection signal detected by said detection means.
2. The robot apparatus according to claim 1 further comprising: means for evaluating said detection signal; said write control means writing said pre-set data in said storage means based on the evaluated results by said evaluation means.
3. The robot apparatus according to claim 2 wherein said detection means includes pressure measurement means for measuring the pressure as the extraneous state; said evaluating means evaluating the pressure information as the detection signal from said pressure measurement means.
4. The robot apparatus according to claim 1 wherein said detection means includes extraneous data inputting means at which extraneous data is inputted; said write control means writing extraneous input data to said extraneous data inputting means in said storage means.
5. The robot apparatus according to claim 1 further comprising: erasure control means for erasing pre-set data stored in said storage means; said write control means adding characteristics information of said detection signal to said pre-set data to write the resulting signal in said storage means; said erasure control means erasing said pre-set data added to with said characteristics information from said storage means when a pre-set condition consistent with the characteristics information holds.
6. The robot apparatus according to claim 5 wherein said pre-set condition for said erasure control means is whether or not a pre-set value has been reached.
7. The robot apparatus according to claim 5 wherein said pre-set condition for said erasure control means is whether or not a pre-set time has elapsed.
8. The robot apparatus according to claim 1 further comprising: re-arraying means for re-arraying said pre-set data written in said storage means depending on a value of the detection signal associated with said pre-set data.
9. A method for controlling a robot apparatus having a behavioral model or a feeling model changed at least based on extraneous factors, comprising: a detecting step of detecting extraneous states by detection means; and a write control step of writing pre-set data in said storage means based on a detection signal detected by said detection means.
10. The method according to claim 9 further comprising: an evaluating step of evaluating said detection signal detected by said detection means in said detection step; said write control step writing said pre-set data in storage means based on a detection signal detected by said detection means.
11. A furnishing medium for furnishing a program to a robot apparatus having a behavioral model or a feeling model changed at least based on extraneous factors, said program being configured to execute processing comprising: a detecting step of detecting extraneous states by detection means; and a write control step of writing pre-set data in said storage means based on a detection signal detected by said detection means.
12. The furnishing medium according to claim 11 further comprising: an evaluating step of evaluating said detection signal detected by said detection means in said detection step; said write control step writing said pre-set data in storage means based on a detection signal detected by said detection step.
13. A robot apparatus having a behavioral model for outputting a pre-set behavior command or a feeling model for outputting the feeling information, said robot apparatus comprising: detection means for detecting extraneous states; storage means for storing data; and write control means for writing pre-set data in said storage means based on said pre-set behavior command or said feeling information.
14. The robot apparatus according to claim 13 wherein said detection means includes extraneous data inputting means at which extraneous data is inputted; said write control means writing extraneous input data to said extraneous data inputting means in said storage means.
15. The robot apparatus according to claim 13 wherein said behavioral model or said feeling model is a status transition model; said write control means writing said pre-set data in said storage means based on a pre-set transition state of said status transition model.
16. The robot apparatus according to claim 13 further comprising: erasure control means for erasing pre-set data stored in said storage means; said erasure control means erasing said pre-set data from said storage means based on said pre-set behavior command or said feeling information.
17. The robot apparatus according to claim 13 further comprising: erasure control means for erasing pre-set data stored in said storage means; said write control means writing said pre-set data added to with the pre-set information derived from said pre-set behavior command or the feeling information in said storage means; said erasure control means erasing said pre-set information from said storage means when the pre-set condition associated with the pre-set behavior command holds.
18. The robot apparatus according to claim 17 wherein said pre-set condition for said erasure control means is whether or not a pre-set value has been reached.
19. The robot apparatus according to claim 17 wherein said pre-set condition for said erasure control means is whether or not a pre-set time has elapsed.
20. The robot apparatus according to claim 15 further comprising: erasure control means for erasing pre-set data stored in said storage means; said erasure control means erasing said pre-set data from said storage means based on a pre-set transition state of said status transition model.
21. The robot apparatus according to claim 13 further comprising: re-arraying means for re-arraying said pre-set data written in said storage means depending on said pre-set behavior command or feeling information associated with said pre-set data.
22. A method for controlling a robot apparatus having a behavioral model outputting a pre-set behavior command or the feeling information, said method comprising: a step of outputting the pre-set behavior command or the feeling information based on said behavioral model or the feeling information based on the input information; and a write control step of writing pre-set data based on said pre-set behavior command or the feeling information.
23. A furnishing medium for furnishing a program to a robot apparatus having a behavioral model outputting a pre-set behavior command or the feeling information, said program being adapted to execute processing including a step of outputting the pre-set behavior command or the feeling information based on said behavioral model or the feeling information based on the input information and a write control step of writing pre-set data based on said pre-set behavior command or the feeling information.
24. A robot apparatus having an instinct model for outputting the instinct information, said robot apparatus comprising: detection means for detecting extraneous states; storage means for storing data; and write control means for writing pre-set data in said storage means; said write control means writing said pre-set data in said storage means based on said instinct information.
25. The robot apparatus according to claim 24 wherein said detection means includes an extraneous data inputting means in which extraneous data is inputted; said write control means writing the extraneous input data inputted as said pre-set data to said extraneous data input means in said storage means.
26. The robot apparatus according to claim 24 wherein said behavioral model or said feeling model is a status transition model; said write control means writing said pre-set data in said storage means based on a pre-set transition state of said status transition model.
27. The robot apparatus according to claim 24 further comprising: erasure control means for erasing pre-set data stored in said storage means; said erasure control means erasing said pre-set data from said storage means based on said instinct information.
28. The robot apparatus according to claim 24 further comprising: erasure control means for erasing pre-set data stored in said storage means; said write control means writing said pre-set data added to with the pre-set information derived from said instinct information in said storage means; said erasure control means erasing said pre-set information from said storage means when the pre-set condition associated with the pre-set behavior command holds.
29. The robot apparatus according to claim 28 wherein said pre-set condition for said erasure control means is whether or not a pre-set value has been reached.
30. The robot apparatus according to claim 28 wherein said pre-set condition for said erasure control means is whether or not a pre-set time has elapsed.
31. The robot apparatus according to claim 26 further comprising: erasure control means for erasing the pre-set data stored in said storage means; said erasure control means erasing said pre-set data from said storage means based on a pre-set transition state of said status transition model.
32. The robot apparatus according to claim 24 further comprising: re-arraying means for re-arraying said pre-set data written in said storage means depending on said instinct information associated with said pre-set data.
33. A method for controlling a robot apparatus having an instinct model outputting the instinct information, comprising: an outputting step of outputting the instinct information by said instinct model based on the input information; and a write control step of writing pre-set data based on said instinct information.
34. A furnishing medium for furnishing a program to a robot apparatus having an instinct model adapted to output the instinct information, said program being adapted to execute the processing including an outputting step of outputting the instinct information by said instinct model based on the input information; and a write control step of writing pre-set data in storage means based on said instinct information.
35. A robot apparatus having a behavioral model, a feeling model or an instinct model changed based at least on inner factors, said behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on said inner factor, said robot apparatus comprising: monitoring means for monitoring the inner state as said inner factor; storage means for memorizing data; and write control means for writing the pre-set data in said storage means; said write control means writing said pre-set data in said storage means based on the monitored results by said monitoring means.
36. The robot apparatus according to claim 35 further comprising: erasure control means for erasing pre-set data stored in said storage means; said write control means writing said pre-set data added to with characteristics information on the inner state in said storage means; said erasure control means erasing said pre-set data added to with characteristics information from said storage means when the pre-set condition associated with the characteristics information holds.
37. The robot apparatus according to claim 36 wherein said pre-set condition for said erasure control means is whether or not a pre-set value has been reached.
38. The robot apparatus according to claim 35 further comprising: re-arraying means for re-arraying said pre-set data written in said storage means depending on said inner state associated with said pre-set data.
39. A method for controlling a robot apparatus having a behavioral model, a feeling model or an instinct model changed based at least on inner factors, said behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on said inner factor, said method comprising: a write control step of monitoring the inner state as said inner factor and writing said pre-set data in storage means based on the monitored results.
40. A furnishing medium for furnishing a program to a robot apparatus having a behavioral model, a feeling model or an instinct model changed based at least on inner factors, said behavioral model, feeling model or the instinct model outputting a pre-set behavior command, feeling information or the instinct information based on said inner factor, said program causing execution of the processing including a write control step of monitoring the inner state as said inner factor to write pre-set data in storage means based on the monitored results.
41. A display method comprising: a read-out step of reading out said pre-set data memorized in said storage means by a robot apparatus having a behavioral model, a feeling model and/or an instinct model changed based at least on extraneous factors and/or inner factors, said robot apparatus writing pre-set data in storage means depending on conditions; and a display step of displaying said pre-set data read out by said read-out step on a display.
42. The display method according to claim 41 wherein said read-out step reads out from said storage means the extraneous input data as said pre-set data captured by said robot apparatus responsive to said condition.
43. The display method according to claim 41 wherein said read-out step reads out a plurality of said pre-set data from said storage means and wherein said read-out step re-arrays said plural pre-set data to display the re-arrayed data.
44. The display method according to claim 43 wherein said display step chronologically re-arrays the plural pre-set data depending on the time information added to said pre-set data to display the re-arrayed data in said display.
45. The display method according to claim 41 further comprising: an information outputting step of outputting the information associated with said pre-set data displayed in said display step.
46. The display method according to claim 45 wherein the information outputting step outputs the letter information corresponding to said pre-set data.
47. A furnishing medium for furnishing a program to a picture display apparatus adapted to demonstrate a picture on a display, said program being adapted to execute the processing including a read-out step of reading out pre-set data stored in said storage means by a robot apparatus having a behavioral model and/or a feeling model and/or an instinct model changed depending on an extraneous factor or an inner factor, said robot apparatus writing pre-set data depending on conditions, and a displaying step of displaying in said display said pre-set data read out by said read-out step.
AU56498/99A 1998-09-10 1999-09-10 Robot apparatus, method of controlling robot apparatus, method of display, and medium Ceased AU768353B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP25646598 1998-09-10
JP10/256465 1998-09-10
PCT/JP1999/004957 WO2000015396A1 (en) 1998-09-10 1999-09-10 Robot apparatus, method of controlling robot apparatus, method of display, and medium

Publications (2)

Publication Number Publication Date
AU5649899A true AU5649899A (en) 2000-04-03
AU768353B2 AU768353B2 (en) 2003-12-11

Family

ID=17293025

Family Applications (1)

Application Number Title Priority Date Filing Date
AU56498/99A Ceased AU768353B2 (en) 1998-09-10 1999-09-10 Robot apparatus, method of controlling robot apparatus, method of display, and medium

Country Status (8)

Country Link
US (15) US6934604B1 (en)
EP (1) EP1059147A4 (en)
JP (1) JP4576715B2 (en)
KR (1) KR100721075B1 (en)
CN (1) CN100387407C (en)
AU (1) AU768353B2 (en)
CA (1) CA2309671C (en)
WO (1) WO2000015396A1 (en)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1059147A4 (en) * 1998-09-10 2008-01-16 Sony Corp Robot apparatus, method of controlling robot apparatus, method of display, and medium
KR20020008848A (en) * 2000-03-31 2002-01-31 이데이 노부유끼 Robot device, robot device action control method, external force detecting device and external force detecting method
CN100423911C (en) 2000-10-13 2008-10-08 索尼公司 Robot device and behavior control method for robot device
JP3645848B2 (en) * 2000-11-17 2005-05-11 株式会社ソニー・コンピュータエンタテインメント Information processing program, recording medium on which information processing program is recorded, information processing apparatus and method
US6967455B2 (en) 2001-03-09 2005-11-22 Japan Science And Technology Agency Robot audiovisual system
JP2002283261A (en) 2001-03-27 2002-10-03 Sony Corp Robot device and its control method and storage medium
ITVI20020053A1 (en) * 2002-03-22 2003-09-22 Qem Srl INTEGRATED SYSTEM FOR THE CONTROL OF THE AXES OF INDUSTRIAL MACHINERY
US20080150684A1 (en) * 2004-11-04 2008-06-26 Lock Technology B.V. Portable entry system and method
KR100825719B1 (en) * 2005-12-09 2008-04-29 한국전자통신연구원 Method for generating emotions and emotions generating robot
KR20070075957A (en) * 2006-01-17 2007-07-24 주식회사 로보스타 Robot control system for multi tasking based task
KR100752098B1 (en) * 2006-03-07 2007-08-24 한국과학기술연구원 Robot system based on neural network
JP2007275234A (en) * 2006-04-05 2007-10-25 Tomy Co Ltd Game apparatus
TW200743862A (en) * 2006-05-23 2007-12-01 Sunonwealth Electr Mach Ind Co A heat-dissipating module for a back light set of a liquid crystal display
US8027888B2 (en) * 2006-08-31 2011-09-27 Experian Interactive Innovation Center, Llc Online credit card prescreen systems and methods
KR100866212B1 (en) * 2007-02-08 2008-10-30 삼성전자주식회사 Genetic robot platform and genetic robot behavior manifestation method
JP4633775B2 (en) * 2007-10-29 2011-02-16 富士フイルム株式会社 Robot and control method thereof
JP4839487B2 (en) * 2007-12-04 2011-12-21 本田技研工業株式会社 Robot and task execution system
US8648863B1 (en) * 2008-05-20 2014-02-11 Pixar Methods and apparatus for performance style extraction for quality control of animation
KR20100053797A (en) * 2008-11-13 2010-05-24 주식회사 로보로보 Educational robot apparatus for chidren and operating method of the educational robot apparatus for chidren
US8939840B2 (en) 2009-07-29 2015-01-27 Disney Enterprises, Inc. System and method for playsets using tracked objects and corresponding virtual worlds
TWI447660B (en) * 2009-12-16 2014-08-01 Univ Nat Chiao Tung Robot autonomous emotion expression device and the method of expressing the robot's own emotion
KR100980722B1 (en) 2010-07-23 2010-09-07 전성택 Emotion processing method of artificiality emotion-model having virtual emotion
RU2450336C1 (en) * 2011-01-11 2012-05-10 Государственное образовательное учреждение высшего профессионального образования "Кубанский государственный технологический университет" (ГОУ ВПО "КубГТУ") Modified intelligent controller with adaptive critic
JP6250566B2 (en) 2012-02-15 2017-12-20 インテュイティブ サージカル オペレーションズ, インコーポレイテッド User selection of robot system operation mode using operation to distinguish modes
JP5670416B2 (en) * 2012-12-28 2015-02-18 ファナック株式会社 Robot system display device
CN109153127B (en) * 2016-03-28 2022-05-31 Groove X 株式会社 Behavior autonomous robot for executing head-on behavior
WO2017212723A1 (en) 2016-06-06 2017-12-14 ソニー株式会社 Virtual lifeform control system and virtual lifeform control method
WO2017217192A1 (en) * 2016-06-14 2017-12-21 Groove X株式会社 Coolness-seeking autonomous operation robot
JP6467674B2 (en) 2016-07-20 2019-02-13 Groove X株式会社 Autonomous robot that understands skinship
KR102577571B1 (en) * 2016-08-03 2023-09-14 삼성전자주식회사 Robot apparatus amd method of corntrolling emotion expression funtion of the same
US10272349B2 (en) * 2016-09-07 2019-04-30 Isaac Davenport Dialog simulation
CN106354085B (en) * 2016-10-14 2019-01-11 广州励丰文化科技股份有限公司 A kind of mechanical arm console that pressure sensitive is combined with remote media and method
JP6761598B2 (en) * 2016-10-24 2020-09-30 富士ゼロックス株式会社 Emotion estimation system, emotion estimation model generation system
DE102017125524B4 (en) 2017-10-31 2020-11-05 Ktm Ag Motorcycle seating arrangement
WO2020056375A1 (en) 2018-09-13 2020-03-19 The Charles Stark Draper Laboratory, Inc. Voice modification to robot motion plans
JP7302989B2 (en) 2019-03-08 2023-07-04 株式会社コロプラ Game program, method, and information processing device

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6224988A (en) * 1985-07-23 1987-02-02 志井田 孝 Robot having feeling
JPH03270888A (en) * 1990-03-20 1991-12-03 Tokico Ltd Control device for industrial robot
JPH0440506A (en) * 1990-06-06 1992-02-10 Murata Mach Ltd Teaching device for robot
JP2921936B2 (en) * 1990-07-13 1999-07-19 株式会社東芝 Image monitoring device
JPH07248823A (en) * 1994-03-11 1995-09-26 Hitachi Ltd Personal robot device
JPH098A (en) * 1995-06-17 1997-01-07 Fuji Toreela- Seisakusho:Kk Field working machine
JP3413694B2 (en) * 1995-10-17 2003-06-03 ソニー株式会社 Robot control method and robot
KR0168189B1 (en) * 1995-12-01 1999-02-01 김광호 Control method and apparatus for recognition of robot environment
JPH09244730A (en) * 1996-03-11 1997-09-19 Komatsu Ltd Robot system and controller for robot
JPH1011107A (en) * 1996-06-21 1998-01-16 Nippon Telegr & Teleph Corp <Ntt> Real time action determining system for intelligent robot and its method
US6021369A (en) * 1996-06-27 2000-02-01 Yamaha Hatsudoki Kabushiki Kaisha Integrated controlling system
US6032139A (en) * 1996-09-27 2000-02-29 Yamaha Hatsudoki Kabushiki Kaisha Electronic controller using genetic evolution techniques suitable for controlling a motor
US6319201B1 (en) * 1997-10-15 2001-11-20 Peter J. Wilk Imaging device and associated method
US6175857B1 (en) * 1997-04-30 2001-01-16 Sony Corporation Method and apparatus for processing attached e-mail data and storage medium for processing program for attached data
JPH11126017A (en) * 1997-08-22 1999-05-11 Sony Corp Storage medium, robot, information processing device and electronic pet system
US6249780B1 (en) * 1998-08-06 2001-06-19 Yamaha Hatsudoki Kabushiki Kaisha Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object
EP1059147A4 (en) * 1998-09-10 2008-01-16 Sony Corp Robot apparatus, method of controlling robot apparatus, method of display, and medium
JP4661074B2 (en) * 2004-04-07 2011-03-30 ソニー株式会社 Information processing system, information processing method, and robot apparatus

Also Published As

Publication number Publication date
US20050197741A1 (en) 2005-09-08
US7155312B2 (en) 2006-12-26
US7146251B2 (en) 2006-12-05
US20050187659A1 (en) 2005-08-25
US7155311B2 (en) 2006-12-26
CN1287522A (en) 2001-03-14
CN100387407C (en) 2008-05-14
US7146249B2 (en) 2006-12-05
US7155310B2 (en) 2006-12-26
US6934604B1 (en) 2005-08-23
CA2309671A1 (en) 2000-03-23
US20050187661A1 (en) 2005-08-25
US20050197740A1 (en) 2005-09-08
US20050187660A1 (en) 2005-08-25
US7142946B2 (en) 2006-11-28
JP4576715B2 (en) 2010-11-10
US20050171643A1 (en) 2005-08-04
EP1059147A1 (en) 2000-12-13
US7146252B2 (en) 2006-12-05
US20050187658A1 (en) 2005-08-25
US20050187662A1 (en) 2005-08-25
US7146250B2 (en) 2006-12-05
US7155314B2 (en) 2006-12-26
US7149603B2 (en) 2006-12-12
KR100721075B1 (en) 2007-05-23
US7155313B2 (en) 2006-12-26
US20050171641A1 (en) 2005-08-04
US20050171642A1 (en) 2005-08-04
US20050192706A1 (en) 2005-09-01
KR20010031909A (en) 2001-04-16
US20050171640A1 (en) 2005-08-04
CA2309671C (en) 2010-12-21
US20050177278A1 (en) 2005-08-11
WO2000015396A1 (en) 2000-03-23
US7151984B2 (en) 2006-12-19
AU768353B2 (en) 2003-12-11
US20050182514A1 (en) 2005-08-18
US7151983B2 (en) 2006-12-19
EP1059147A4 (en) 2008-01-16

Similar Documents

Publication Publication Date Title
US6934604B1 (en) Robot apparatus, method of controlling robot apparatus, method of display, and medium
US7117190B2 (en) Robot apparatus, control method thereof, and method for judging character of robot apparatus
TW581959B (en) Robotic (animal) device and motion control method for robotic (animal) device
US6754560B2 (en) Robot device, robot device action control method, external force detecting device and external force detecting method
JP2006110707A (en) Robot device
JP2001157980A (en) Robot device, and control method thereof
JP2003136451A (en) Robot device and control method thereof

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)