WO2000067960A1 - Toy device and control method therefor - Google Patents
Toy device and control method therefor
- Publication number
- WO2000067960A1 (PCT/JP2000/002988)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- robot
- robot device
- battery
- predetermined
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 28
- 230000009471 action Effects 0.000 claims abstract description 68
- 230000033001 locomotion Effects 0.000 claims abstract description 19
- 230000008451 emotion Effects 0.000 claims description 115
- 230000007704 transition Effects 0.000 claims description 55
- 238000001514 detection method Methods 0.000 claims description 26
- 230000002996 emotional effect Effects 0.000 claims description 22
- 230000005540 biological transmission Effects 0.000 claims description 9
- 238000009423 ventilation Methods 0.000 claims description 8
- 238000004891 communication Methods 0.000 claims description 6
- 230000001747 exhibiting effect Effects 0.000 claims description 2
- 238000005259 measurement Methods 0.000 claims description 2
- 238000009529 body temperature measurement Methods 0.000 claims 1
- 230000003028 elevating effect Effects 0.000 claims 1
- 230000036544 posture Effects 0.000 description 86
- 230000007246 mechanism Effects 0.000 description 75
- 230000006399 behavior Effects 0.000 description 33
- 238000012545 processing Methods 0.000 description 27
- 210000003128 head Anatomy 0.000 description 21
- 230000007423 decrease Effects 0.000 description 14
- 241001465754 Metazoa Species 0.000 description 13
- 230000036528 appetite Effects 0.000 description 12
- 235000019789 appetite Nutrition 0.000 description 12
- 230000014509 gene expression Effects 0.000 description 11
- 238000010586 diagram Methods 0.000 description 10
- 230000006870 function Effects 0.000 description 10
- 238000005452 bending Methods 0.000 description 7
- 230000005236 sound signal Effects 0.000 description 7
- 230000003247 decreasing effect Effects 0.000 description 6
- 230000008569 process Effects 0.000 description 6
- 230000004044 response Effects 0.000 description 6
- 230000008859 change Effects 0.000 description 5
- 238000012546 transfer Methods 0.000 description 4
- 238000001816 cooling Methods 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 210000003414 extremity Anatomy 0.000 description 3
- 230000001737 promoting effect Effects 0.000 description 3
- 241000282472 Canis lupus familiaris Species 0.000 description 2
- 238000001228 spectrum Methods 0.000 description 2
- 230000004936 stimulating effect Effects 0.000 description 2
- 229920003002 synthetic resin Polymers 0.000 description 2
- 239000000057 synthetic resin Substances 0.000 description 2
- 230000002123 temporal effect Effects 0.000 description 2
- 241000282326 Felis catus Species 0.000 description 1
- 241000238631 Hexapoda Species 0.000 description 1
- 230000005856 abnormality Effects 0.000 description 1
- 239000000956 alloy Substances 0.000 description 1
- 229910045601 alloy Inorganic materials 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000015556 catabolic process Effects 0.000 description 1
- 239000003795 chemical substances by application Substances 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 210000005069 ears Anatomy 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000000704 physical effect Effects 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 230000001718 repressive effect Effects 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 238000010079 rubber tapping Methods 0.000 description 1
- 230000015541 sensory perception of touch Effects 0.000 description 1
- 230000037321 sleepiness Effects 0.000 description 1
- 210000001364 upper extremity Anatomy 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/008—Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/005—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators using batteries, e.g. as a back-up power source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0054—Cooling means
Definitions
- the present invention relates to a robot device and a control method therefor, and is suitably applied to, for example, a pet robot.
- for the tail of this type of robot, not only a string-shaped tail that simply hangs from the rear end of the body, but also a tail that swings in the vertical or left-right direction according to the drive of an actuator built into the rear end of the body, has been proposed.
- this type of robot uses a battery (a secondary cell) provided inside the body as its main power source, and drives its various circuit units with the power supplied from the battery.
- if, when the tail is swung in response to the driving of the actuator, the tail could be bent and swung in a direction according to the emotion at that time, as a real dog or cat does, it would give the user a greater sense of intimacy and satisfaction and further improve the amusement value of the pet robot.
- unlike a limb, the tail need not move only when required and can be moved freely at any time, so it would be even more desirable if a failure or trouble occurring inside the pet robot could be notified to the user by tail movement.
- a driving system including a battery is provided in the body of a pet robot.
- a ventilation port is formed at a predetermined position of the body, and air is exchanged with the outside through the ventilation port, preventing the internal temperature of the body from rising excessively.
- the present invention has been made in view of the above points, and proposes a robot device capable of significantly improving amusement value, expressing the need for charging through its posture, and ensuring safety, together with a control method therefor.
- in a robot apparatus having a movable part one end of which is rotatably connected about at least one axis, there are provided motion control means for operating the movable part, an emotion/instinct model associated with the motion, and emotion/instinct model changing means that determines the operation of the movable part by changing the emotion/instinct model based on input information.
- in a robot device having a movable portion one end of which is rotatably connected about at least one axis, there are provided detecting means for detecting an internal state of the robot device, and operation control means for operating the movable portion according to the detection result.
- there are provided battery remaining amount detecting means for detecting the remaining amount of the battery, and operation control means for causing the robot device to transition to a predetermined posture and/or exhibit a predetermined operation when the remaining amount detected by the battery remaining amount detecting means falls to or below a predetermined level.
- a robot device having a body portion having a built-in power source and having a ventilation port formed at a predetermined position of the body portion
- temperature detection means for detecting an internal temperature of the body portion
- operation control means is provided for causing the robot apparatus to transition to a predetermined posture and/or perform a predetermined operation when the detected internal temperature becomes equal to or higher than a predetermined temperature.
- there are provided a first step of detecting the remaining amount of the battery, and a second step of causing the robot device to transition to a predetermined posture and/or exhibit a predetermined operation when the detected remaining amount of the battery falls to or below a predetermined level.
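The two-step control method just described can be sketched as a simple decision function. This is a minimal illustrative sketch, not the patented implementation: the 20 [%] threshold, the function name, and the posture names are all assumptions made for the example.

```python
# Hypothetical sketch of the two-step charge-request method: step 1 detects
# the remaining battery level; step 2 causes the robot to transition to a
# predetermined posture once that level falls to or below a threshold.
CHARGE_THRESHOLD = 20  # percent; an assumed "predetermined level"

def charge_request_step(remaining_percent: float) -> str:
    """Return the action the motion controller should take this cycle."""
    if remaining_percent <= CHARGE_THRESHOLD:
        # Transition to a posture that expresses the need for charging
        # (e.g. the sitting "station shift" posture described later).
        return "assume_station_posture"
    return "continue_autonomous_behavior"
```

In a real controller this check would run periodically inside the behavior loop, with the posture transition handed off to the posture transition mechanism.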
- the robot device can thus be controlled so as to give the user a sense of closeness and satisfaction, as if it were a real animal expressing its appetite, thereby significantly improving its amusement value.
- a control method for a robot device can be realized.
- in a robot device having a body portion with a built-in power source and a ventilation port formed at a predetermined position of the body portion, there are provided a first step of detecting an internal temperature of the body portion, and a second step of causing the robot device to transition to a predetermined posture and/or perform a predetermined operation when the detected internal temperature becomes equal to or higher than a predetermined temperature.
- FIG. 1 is a perspective view showing a configuration of a pet robot system to which the present invention is applied.
- FIG. 2 is a perspective view showing a configuration of a pet robot.
- FIG. 3 is a perspective view showing the configuration of the pet robot of FIG. 2.
- FIG. 4 is a perspective view showing the configuration of the station in FIG. 1.
- FIG. 5 is a block diagram showing the configuration of the pet robot of FIG. 2.
- FIG. 6 is a schematic diagram showing the tail part of the tail unit.
- FIG. 7 is a schematic perspective view showing the internal configuration of the base portion of the tail unit.
- FIG. 8 is a partial cross-sectional view showing the internal configuration of the gearbox in the base portion of FIG.
- FIG. 9 is a partial cross-sectional view for explaining an operation state of the differential gear mechanism in the gear box of FIG.
- FIG. 10 is a schematic perspective view showing the entire configuration of the tail unit.
- FIG. 11 is a schematic diagram for explaining data processing in the controller.
- FIG. 12 is a schematic diagram for explaining the data processing by the emotional instinct model unit.
- FIG. 13 is a schematic diagram for explaining data processing by the emotion/instinct model unit.
- FIG. 14 is a schematic diagram for explaining data processing by the emotion/instinct model unit.
- FIG. 15 is a schematic diagram for explaining data processing by the emotion/instinct model unit.
- FIG. 16 is a schematic diagram for explaining data processing by the emotion/instinct model unit.
- FIG. 17 is a state transition diagram of the finite state automaton in the action determination mechanism.
- FIG. 18 is a posture transition diagram in the posture transition mechanism section.
- FIG. 19 is a flowchart for explaining a charge request processing procedure.
- FIG. 20 is a flowchart for explaining the internal temperature adjustment processing procedure.
- in the pet robot system according to the present embodiment, the pet robot 2 is placed in a predetermined posture at a predetermined position on a dedicated battery charger (hereinafter referred to as a station) 3, whereby the battery (not shown) built into the pet robot 2 is charged.
- in practice, the pet robot 2 has a body unit 4; the head unit 6 and the tail unit 7 are connected to the front and rear ends of the body unit 4, respectively, while the leg units 5A to 5D are connected to its front, rear, left and right.
- in the tail unit 7, the tail 7T is drawn out from a base portion 7B provided on the upper surface of the body unit 4 so that it can bend and swing with two degrees of freedom.
- a cooling fan (not shown) is provided inside the body unit 4, and an exhaust port 4AX and an intake port 4BX communicating with the cooling fan are formed on the upper surface 4A and the lower surface 4B, respectively. Accordingly, in the pet robot 2, the air drawn in from the intake port 4BX passes through the inside of the body unit 4 and is discharged to the outside from the exhaust port 4AX as the cooling fan is driven, so that the internal temperature of the body unit 4 is reduced.
- a first connector half 4C is formed on the body unit 4 by exposing a plurality of electrodes (pads) (not shown).
- in the station 3, the wiring drawn from the built-in charger is connected to a household power supply via an AC adapter; as shown in FIG. 4, a concave space 3AH corresponding to the body unit 4 of the pet robot 2 is formed at the center of the top of the main unit 3A, and flat surfaces 3AR and 3AL are formed along the longitudinal direction on both sides of the concave space 3AH.
- a second connector half 3C, having electrode terminals (not shown) formed protrudingly to correspond to the respective electrodes of the first connector half 4C of the pet robot 2, is provided.
- a plurality of LED (Light Emitting Diode) lamps 3L are provided on the front of the main unit 3A of the station 3, and are turned on or off in predetermined emission colors to notify whether the plug of an AC adapter (not shown) electrically connected to the station 3 is connected to the power supply, whether the battery (not shown) provided inside the pet robot 2 is being charged, and whether the spare battery removably stored in the station 3 is being charged.
- when actually combining the pet robot 2 with the station 3, the pet robot 2 is first changed into a so-called "down" posture, that is, the leg units 5A to 5D are bent so that the lower surface 4B of the body unit 4 is brought close to the floor (this posture is referred to as the station shift posture).
- the user then lifts the pet robot 2 and places it so that the body unit 4 fits into the recessed space 3AH of the station 3, whereby the first connector half 4C on the body unit 4 side and the second connector half 3C of the station 3 are brought into contact and made conductive.
- at this time, the leg units 5A to 5D do not hinder the body unit 4 from being fitted into the recessed space 3AH.
- the legs at the tips of the leg units 5A to 5D are held in contact with both flat surfaces 3AR and 3AL formed on the main body 3A of the station 3.
- the body unit 4 of the pet robot 2 houses a controller 10 for controlling the operation of the entire pet robot 2, a battery 11 serving as the power source of the pet robot 2, and an internal sensor unit 14 including a battery sensor 12 and a heat sensor 13.
- the head unit 6 has a microphone 15 corresponding to the "ears", a CCD (Charge Coupled Device) camera 16 corresponding to the "eyes", a touch sensor 17, and a speaker 18 corresponding to the "mouth", each arranged at a predetermined position.
- the microphone 15 of the head unit 6 collects command sounds such as "chase the ball", given as musical scales from a sound commander (a commander that generates sounds of different musical scales according to the operation content) by the user, and sends the obtained audio signal S1 to the controller 10. Further, the CCD camera 16 captures an image of the surroundings and sends the obtained image signal S2 to the controller 10.
- the touch sensor 17 is provided on the upper part of the head unit 6, as is apparent from FIG. 2, detects pressure applied by physical actions such as "stroking" or "hitting" from the user, and sends the detection result to the controller 10 as a pressure detection signal S3.
- the battery sensor 12 of the body unit 4 detects the remaining amount of the battery 11 in five stages, and sequentially sends the detection result of each stage to the controller 10 as a battery remaining amount detection signal S4.
- the battery sensor 12 classifies the remaining amount of the battery 11 in stages as "Full" when it is 80 [%] or more, "Middle-Full" for 80 to 50 [%], "Middle" for 50 to 25 [%], "Low" for 25 to 20 [%], and "Low-Low" when it is 20 [%] or less.
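The five-stage classification above maps directly to a threshold ladder. The sketch below is illustrative only; the patent gives the ranges but not the boundary handling, so the choice of which stage a boundary value falls into is an assumption.

```python
def classify_battery(remaining_percent: float) -> str:
    """Map remaining battery charge to the five stages named in the
    description. Boundary handling (>= vs >) is an assumption."""
    if remaining_percent >= 80:
        return "Full"
    if remaining_percent >= 50:
        return "Middle-Full"
    if remaining_percent >= 25:
        return "Middle"
    if remaining_percent > 20:
        return "Low"
    return "Low-Low"
```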
- the heat sensor 13 of the body unit 4 detects the internal temperature of the robot 2, and sends the detection result to the controller 10 as a heat detection signal S5.
- the controller 10 determines the surrounding conditions, commands from the user, actions by the user, and the like, based on the audio signal S1, the image signal S2, the pressure detection signal S3, the battery remaining amount detection signal S4, and the heat detection signal S5 provided from the microphone 15, the CCD camera 16, the touch sensor 17, the battery sensor 12, and the heat sensor 13.
- the controller 10 determines the subsequent action based on this determination result and a control program input in advance, and based on the determination result drives the necessary actuators 5AA1 to 5AAK, 5BA1 to 5BAK, 5CA1 to 5CAK, 5DA1 to 5DAK, 6A1 to 6AL, 7A1 and 7A2, thereby causing the pet robot to perform actions such as moving the head unit 6 up, down, left and right, moving the tail unit 7, and driving the leg units 5A to 5D to walk.
- at this time, the controller 10 outputs a sound based on a sound signal S6 to the outside by giving the predetermined sound signal S6 to the speaker 18 as necessary, and turns on, turns off, or blinks LEDs (not shown) provided at predetermined positions.
- in this way, the pet robot 2 can act autonomously on the basis of the surrounding conditions, the control program, and the like.
- in the tail unit 7, one end of the tail 7T is connected to a gear box (not shown) in the base 7B provided on the upper surface of the body unit 4.
- this tail 7T is formed by bending a single wire 7W made of a superelastic alloy into a hairpin shape, substantially U-shaped (or almost V-shaped) at the center, forming a bending mechanism composed of two substantially parallel wire portions 7WA and 7WB.
- a plurality of cylindrical members 7 PA to 7 PD formed of a synthetic resin, light metal, or the like are inserted in series around the outer periphery of the hairpin-shaped wire portions 7 WA and 7 WB.
- an articulated mechanism is constructed by connecting the cylindrical members 7PA to 7PD to one another as so-called spherical bearings (a kind of universal joint) so that they can bend freely.
- a pin 7P is driven in perpendicularly to the axial direction at the leading end of the cylindrical member 7PA, and the bent portion 7WC of the wire 7W is caught on it, so that the wire 7W can be prevented from coming off the cylindrical member 7PA.
- the proximal ends 7WAS and 7WBS of the two wire portions 7WA and 7WB project, bent in substantially L-shapes in opposite directions, and are each connected to a gear box 22 (FIG. 7) described later.
- the pair of actuators 7A1 and 7A2, consisting of geared motors, are fixed to the back plate 20A of a bearing member 20 formed with a U-shaped cross section, and the output shafts 7A1X and 7A2X of the respective actuators 7A1 and 7A2 penetrate inward through through-holes 20AH formed in the back plate 20A.
- the output shafts 7A1X and 7A2X of the actuators 7A1 and 7A2 engage with the gear box 22 via a gear transmission mechanism 21. That is, in this gear transmission mechanism 21, helical gears 23A and 23B anchored to the output shafts 7A1X and 7A2X of the actuators 7A1 and 7A2 are meshed with helical gears 26A and 26B, respectively, which are rotatably mounted about first and second shafts 24 and 25 supported between the top plate 20B and the bottom plate 20C of the bearing member 20.
- at the upper end of the first shaft 24, a gear 27A having the first shaft 24 as its rotation center is integrally connected to the helical gear 26A, and at the lower end of the second shaft 25, a gear 27B having the second shaft 25 as its rotation center is integrally connected to the helical gear 26B.
- gears 29 and 30 are provided at the upper end and lower end of the third shaft 28 supported between the upper plate 20B and the lower plate 20C of the bearing member 20, respectively; the upper gear 29 is meshed with the gear 27A attached to the upper end of the first shaft 24, and the lower gear 30 is meshed with the gear 27B attached to the lower end of the second shaft 25.
- one end of a pair of upper and lower fourth and fifth shafts 31 and 32 which are coaxial with each other is supported on the upper plate 20B and the lower plate 20C of the bearing member 20, respectively.
- a substantially spherical gear box 22 is attached so as to connect the fourth and fifth shafts 31 and 32.
- gears 33 and 34 are mounted on the fourth and fifth shafts 31 and 32 so as to be rotatable about them as centers of rotation, and are meshed with the gears 29 and 30 at the upper and lower ends of the third shaft 28, respectively.
- the gear box 22 has a spherical shell 22A, which is a combination of a pair of hemispherical shell halves 22A1 and 22A2, as shown in FIG. 8.
- the differential gear mechanism 35 is incorporated into the hollow portion 22H in the spherical shell 22A so as to engage with the fourth and fifth shafts 31 and 32. It is configured.
- the differential gear mechanism 35 has a support shaft 36 rotatably supported in the spherical shell 22A so as to be orthogonal to the fourth and fifth shafts 31 and 32.
- a pair of left and right large-diameter bevel gears 37 and 38, rotatably mounted on the support shaft 36, mesh with a pair of upper and lower small-diameter bevel gears 39 and 40, which are coaxially and integrally connected to the gears 33 and 34 attached to the fourth and fifth shafts 31 and 32, respectively.
- a slide guide 41 made of synthetic resin or the like is rotatable about the support shaft 36 so that its longitudinal direction is orthogonal to the support shaft 36.
- a pair of upper and lower sliders 42 A and 42 B are attached to the slide guide 41 so as to be freely slidable in the longitudinal direction so as to sandwich the support shaft 36 therebetween.
- a pair of upper and lower guide rails 41A and 41B are formed on one surface of the slide guide 41, parallel to each other in the longitudinal direction, and engage corresponding guide grooves formed on one surface of each of the sliders 42A and 42B.
- One end of each of the wire portions 7WA and 7WB is embedded in the guide groove of each slider along the groove direction so as to hold the entire tail portion 7T.
- a pair of upper and lower pin-shaped projections 38A and 38B are integrally formed on the inside of one large-diameter bevel gear 38 so as to maintain a positional relationship of 180 [°] with each other.
- in the sliders 42A and 42B, a pair of upper and lower slide grooves 42AH and 42BH are formed parallel to the fourth and fifth shafts 31 and 32, respectively.
- when the large-diameter bevel gears 37 and 38 rotate in opposite directions about the support shaft 36 (the direction of arrow a or the opposite direction), the projections 38A and 38B of the bevel gear 38 press against the sliders while sliding along the slide grooves 42AH and 42BH, respectively, so that the sliders 42A and 42B can be slid in mutually opposite directions along the slide guide 41.
- when the pair of upper and lower gears 33 and 34 attached to the fourth and fifth shafts 31 and 32 via the gear transmission mechanism 21 rotate the pair of upper and lower small-diameter bevel gears 39 and 40 constituting the differential gear mechanism 35 in opposite directions, the two wire portions 7WA and 7WB in the tail 7T are alternately pushed and pulled, so that the tail can be bent.
- the tail portion 7T can be bent in a vertical direction at a desired amplitude, or can be swung in a vertical direction at a desired amplitude and speed.
- at a predetermined position on the front surface of the spherical shell 22A of the gear box 22, an elongated hole 22L of a predetermined size is formed parallel to the fourth and fifth shafts 31 and 32, and the tail portion 7T, drawn from the slide guide 41 in the gear box 22 and the pair of upper and lower sliders 42A and 42B, projects through this elongated hole 22L.
- the tail portion 7T can be bent or swung up and down within the range of both upper and lower edges of the long hole 22L.
- when the pair of upper and lower gears 33 and 34 attached to the fourth and fifth shafts 31 and 32 via the gear transmission mechanism 21 rotate together in the same direction (the direction of arrow b or the opposite direction), the pair of upper and lower small-diameter bevel gears 39 and 40 constituting the differential gear mechanism 35, which mesh with both of the pair of left and right large-diameter bevel gears 37 and 38, cannot rotate relative to them, so the differential gear mechanism 35, that is, the gear box 22, rotates integrally with the pair of upper and lower gears 33 and 34 in the same direction about the fourth and fifth shafts 31 and 32 as the center of rotation.
- in the tail unit 7, as shown in FIG. 10, by rotating the output shafts 7A1X and 7A2X of the pair of actuators 7A1 and 7A2 in the same direction at the same rotational speed, the tail portion 7T can be bent at a desired amplitude in the left-right direction (the direction of arrow c or the direction opposite thereto), or swung at a desired amplitude and speed in the left-right direction, in addition to the bending and swinging in the vertical direction (the direction of arrow d or the direction opposite thereto) described above.
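The two-actuator drive described above can be summarized as a mapping from the pair of output-shaft rotations to the resulting tail motion: equal speeds in the same sense turn the whole gear box (left-right swing), while equal speeds in opposite senses drive the differential and push/pull the wires (vertical bend). The sketch below is a hypothetical classifier of that mapping, not code from the patent; signed speeds and all names are assumptions.

```python
def tail_motion(speed1: float, speed2: float) -> str:
    """Classify the tail motion produced by the two actuator output-shaft
    speeds (signed: positive/negative = opposite rotation senses)."""
    if speed1 == 0 and speed2 == 0:
        return "still"
    if speed1 == speed2:
        # Gear box rotates as a unit about the fourth and fifth shafts.
        return "swing_left_right"
    if speed1 == -speed2:
        # Differential drives the sliders apart, push/pulling the wires.
        return "bend_up_down"
    # Unequal magnitudes superpose both effects.
    return "combined"
```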
- functionally, the controller 10 can be divided into: a sensor input processing unit 50 that recognizes specific external states; an emotion/instinct model unit 51 that expresses the state of emotions and instincts by accumulating the recognition results of the sensor input processing unit 50; an action determination mechanism 52 that determines the next action based on the recognition results of the sensor input processing unit 50; a posture transition mechanism 53 that actually causes the pet robot 2 to express the action based on the determination result of the action determination mechanism 52 and the expression result of the emotion/instinct model unit 51; and a control mechanism 54 that controls the driving of each of the actuators 5AA1 to 7A1 and 7A2.
- the sensor input processing unit 50 detects and recognizes specific external states, specific actions from the user, and instructions from the user based on the audio signal S1, the image signal S2, and the pressure detection signal S3 given from the microphone 15, the CCD camera 16 and the touch sensor 17, converting the recognition results into the auditory, visual, and tactile states a real animal would perceive; it also recognizes the remaining amount of the battery 11 and the internal temperature of the pet robot 2 based on the battery remaining amount detection signal S4 and the heat detection signal S5 provided from the battery sensor 12 and the heat sensor 13 constituting the internal sensor unit 14, and notifies the obtained state recognition information D1 to the emotion/instinct model unit 51 and the action determination mechanism 52.
- the sensor input processing unit 50 constantly monitors the audio signal S1 given from the microphone 15, and when it detects, in the spectrum of the audio signal S1, a spectrum with the same musical scale as a command sound output from the sound commander (not shown) in response to a command such as "walk", "down" or "chase the ball", it recognizes that the command has been given and notifies the recognition result to the emotion/instinct model unit 51 and the action determination mechanism 52.
- the sensor input processing unit 50 constantly monitors the image signal S2 given from the CCD camera 16, and when it detects, for example, a "red round object" or a "plane perpendicular to the ground with a predetermined height or more" in an image based on the image signal S2, it recognizes that "there is a ball" or "there is a wall" and notifies the recognition result to the emotion/instinct model unit 51 and the action determination mechanism 52.
- further, the sensor input processing unit 50 constantly monitors the pressure detection signal S3 given from the touch sensor 17; when it detects, based on the pressure detection signal S3, a pressure at or above a predetermined threshold and of short duration (for example, less than 2 seconds), it recognizes that it has been "hit (scolded)", and when it detects a pressure below the predetermined threshold and of long duration (for example, 2 seconds or more), it recognizes that it has been "stroked (praised)"; it then notifies these recognition results to the emotion/instinct model unit 51 and the action determination mechanism 52.
- the sensor input processing unit 50 constantly monitors the battery remaining amount detection signal S4 provided from the battery sensor 12, and based on it recognizes whether the remaining amount of the battery 11 is 80 [%] or more, 80 to 50 [%], 50 to 25 [%], 25 to 20 [%], or 20 [%] or less, and notifies the recognition result to the emotion/instinct model unit 51 and the action determination mechanism 52.
- the sensor input processing unit 50 constantly monitors the heat detection signal S5 given from the heat sensor 13, and when it detects, based on the heat detection signal S5, that the internal temperature of the pet robot 2 is equal to or higher than a predetermined dangerous temperature, it recognizes this as "dangerous" and notifies the recognition result to the emotion/instinct model unit 51 and the action determination mechanism 52.
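The overheat path above (detect internal temperature, recognize "dangerous", transition to a posture and/or operation) can be sketched as follows. This is an assumed illustration, not the patented procedure: the 60 °C threshold is invented, and the standing-posture response is only one plausible way to keep the intake port on the lower surface of the body unit clear.

```python
DANGER_TEMP_C = 60.0  # illustrative "predetermined dangerous temperature"

def temperature_step(internal_temp_c: float) -> list:
    """Sketch of the internal-temperature adjustment procedure: when the
    detected temperature reaches the danger level, emit the recognition
    result and a posture transition that unblocks the intake port."""
    if internal_temp_c >= DANGER_TEMP_C:
        return ["recognize:dangerous", "transition:standing_posture"]
    return []  # below the danger level: no special action
```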
- the emotion/instinct model unit 51 has control parameters for a basic emotion group 60, composed of a plurality of independent emotion units 60A to 60D, and a basic desire group 61, composed of a plurality of independent desire units 61A to 61D, stored in the memory 10A of the controller 10.
- the emotion unit 60A indicates the emotion of "joy", the emotion unit 60B indicates the emotion of "sadness", and the emotion unit 60C indicates the emotion of "anger".
- the emotion units 60A to 60D represent the degree of the emotion by an intensity in the range of, for example, 0 to 100, and change the intensity of the emotion moment by moment based on the supplied state recognition information D1. The emotion/instinct model unit 51 thus expresses the emotional state of the pet robot 2 by combining the ever-changing intensities of the emotion units 60A to 60D, modeling the temporal change of emotion.
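An emotion unit as described above is essentially a named intensity clamped to the 0-100 range and nudged by each recognition event. The sketch below is a minimal illustration under that reading; the class name, the initial intensity of 50, and the signed-delta update rule are assumptions not given in the patent.

```python
class EmotionUnit:
    """Minimal sketch of an emotion unit whose intensity varies in the
    0-100 range with each piece of state recognition information."""

    def __init__(self, name: str, intensity: float = 50.0):
        self.name = name
        self.intensity = intensity

    def update(self, delta: float) -> None:
        # Clamp to the 0-100 level range given in the description.
        self.intensity = max(0.0, min(100.0, self.intensity + delta))
```

The overall emotional state would then be read out by combining the intensities of all units at a given moment.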
- the desire unit 61A indicates the desire of "appetite"
- the desire unit 61B indicates the desire of "curiosity"
- the desire unit 61C indicates the desire of "exercise".
- the desire units 61A to 61D each represent the degree of a desire as, for example, an intensity from level 0 to 100, and change that intensity from moment to moment based on the supplied state recognition information D1. The emotion/instinct model unit 51 thus expresses the state of the instinct of the pet robot 2 by combining the momentarily changing intensities of the desire units 61A to 61D, modeling the temporal change of instinct.
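A minimal sketch of such a unit, assuming the 0-to-100 intensity range stated above (the class name and update method are our own invention):

```python
class IntensityUnit:
    """An emotion or desire unit whose intensity is clamped to 0..100."""
    def __init__(self, name, intensity=0):
        self.name = name
        self.intensity = intensity

    def update(self, delta):
        """Apply a moment-to-moment change, clamped to the 0-100 range."""
        self.intensity = max(0, min(100, self.intensity + delta))
        return self.intensity

joy = IntensityUnit("joy", 50)
joy.update(70)    # clamped at the upper bound of 100
joy.update(-120)  # clamped at the lower bound of 0
```

The combined emotional or instinct state would then be read off from the set of such intensities.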
- the emotion/instinct model unit 51 changes the intensity of the emotion units 60A to 60D and the desire units 61A to 61D based on the state recognition information D1.
- the emotion/instinct model unit 51 determines the state of emotion by cumulatively combining the changed intensities of the emotion units 60A to 60D, determines the state of instinct by cumulatively combining the changed intensities of the desire units 61A to 61D, and sends the determined emotion and instinct states to the action determination mechanism 52 as emotion/instinct state information D2.
- the emotion/instinct model unit 51 also couples desired emotion units among the basic emotion group 60 to each other in a mutually inhibitory or mutually stimulating manner, so that when the intensity of one of the coupled emotion units 60A to 60D is changed, the intensity of the other emotion unit changes accordingly, realizing a pet robot with natural emotions.
- specifically, the emotion/instinct model unit 51 couples the "joy" emotion unit 60A and the "sadness" emotion unit 60B in a mutually inhibitory manner. When the user praises the pet robot, the intensity of the "joy" emotion unit 60A is increased, and even though no state recognition information D1 that changes the intensity of the "sadness" emotion unit 60B is supplied, the intensity of the "sadness" emotion unit 60B naturally decreases as the intensity of the "joy" emotion unit 60A increases. Similarly, when the intensity of the "sadness" emotion unit 60B increases, the intensity of the "joy" emotion unit 60A naturally decreases.
- the emotion/instinct model unit 51 also couples the "sadness" emotion unit 60B and the "anger" emotion unit 60C in a mutually stimulating manner. When the user hits the pet robot, the intensity of the "anger" emotion unit 60C is increased, and even though no state recognition information D1 that changes the intensity of the "sadness" emotion unit 60B is supplied, the intensity of the "sadness" emotion unit 60B naturally increases as the intensity of the "anger" emotion unit 60C increases. Conversely, when the intensity of the "sadness" emotion unit 60B increases, the emotion/instinct model unit 51 naturally increases the intensity of the "anger" emotion unit 60C accordingly.
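The mutually inhibitory and mutually stimulating couplings above can be sketched as follows. The dict-based representation, the delta values, and the clamp helper are our own assumptions; the coupling behavior follows the text:

```python
def clamp(v):
    """Keep an intensity inside the 0-100 range used by the units."""
    return max(0, min(100, v))

def couple(units, src, dst, delta, mode):
    """Change unit `src` by delta; coupled unit `dst` follows.
    mode="inhibit": dst moves in the opposite direction;
    mode="stimulate": dst moves in the same direction."""
    units[src] = clamp(units[src] + delta)
    units[dst] = clamp(units[dst] + (-delta if mode == "inhibit" else delta))
    return units

e = {"joy": 50, "sadness": 50, "anger": 20}
couple(e, "joy", "sadness", 30, "inhibit")      # praised: joy up, sadness down
couple(e, "sadness", "anger", 10, "stimulate")  # coupled units rise together
```

With these two calls, "joy" rises while "sadness" falls, and a later rise in "sadness" drags "anger" up with it, mirroring the behavior described above.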
- in the same way as it couples the emotion units 60A to 60C, the emotion/instinct model unit 51 couples desired desire units 61A to 61C of the basic desire group 61 to each other, so that when the intensity of one of the coupled desire units is changed, the intensities of the other coupled desire units change accordingly, realizing a pet robot 2 with natural instincts.
- specifically, the emotion/instinct model unit 51 couples the "appetite" desire unit 61A and the "exercise" desire unit 61C in a mutually inhibitory manner. When the battery 11 inside the pet robot 2 runs low, the intensity of the "appetite" desire unit 61A is increased, and even though no state recognition information D1 that changes the intensity of the "exercise" desire unit 61C is supplied, the intensity of the "exercise" desire unit 61C naturally decreases as the intensity of the "appetite" desire unit 61A increases. Conversely, when the intensity of the "exercise" desire unit 61C increases, the emotion/instinct model unit 51 naturally decreases the intensity of the "appetite" desire unit 61A accordingly.
- the emotion/instinct model unit 51 likewise couples the "curiosity" desire unit 61B and the "exercise" desire unit 61C in a mutually stimulating manner, so that the intensity of the "curiosity" desire unit 61B naturally increases as the intensity of the "exercise" desire unit 61C increases, and conversely the emotion/instinct model unit 51 naturally increases the intensity of the "exercise" desire unit 61C in response to an increase in the intensity of the "curiosity" desire unit 61B.
- next, the emotion/instinct model unit 51 is supplied, from the action determination mechanism unit 52 at the latter stage, with behavior information D3 indicating the current or past behavior of the pet robot 2 itself, for example "walked for a long time". Even when the same state recognition information D1 is given, the emotion/instinct model unit 51 generates different emotion/instinct state information D2 according to the behavior of the pet robot 2 indicated by this behavior information D3.
- specifically, intensity increase/decrease functions 65A to 65C, which generate intensity information D4A to D4C for increasing or decreasing the intensities of the emotion units 60A to 60C based on the behavior information D3 indicating the behavior of the pet robot 2 and the state recognition information D1, are provided at the preceding stage of the emotion units 60A to 60C, and the intensities of the emotion units 60A to 60C are increased or decreased according to the intensity information D4A to D4C output from these intensity increase/decrease functions 65A to 65C.
- for example, when the pet robot greets the user and the user pats its head, that is, when behavior information D3 indicating that it greeted the user and state recognition information D1 indicating that its head was patted are given to the intensity increase/decrease function 65A, the emotion/instinct model unit 51 increases the intensity of the "joy" emotion unit 60A. On the other hand, when its head is stroked while it is performing some task, that is, even if behavior information D3 indicating that it is executing a task and state recognition information D1 indicating that it was stroked are given to the intensity increase/decrease function 65A, the emotion/instinct model unit 51 does not change the intensity of the "joy" emotion unit 60A.
- in this way, the emotion/instinct model unit 51 determines the intensity of each of the emotion units 60A to 60C by referring not only to the state recognition information D1 but also to the behavior information D3 indicating the current or past behavior of the pet robot 2. This avoids unnatural emotions, such as increasing the intensity of the "joy" emotion unit 60A when the user strokes the robot's head as a prank while it is performing some task.
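The head-patting example above can be sketched as a behavior-sensitive intensity function in the spirit of 65A. The delta values and string encodings are invented for illustration:

```python
def joy_intensity_delta(state_info, behavior_info):
    """Hypothetical intensity increase/decrease function: the same state
    recognition ("patted on head") yields a different change in "joy"
    depending on the behavior information supplied alongside it."""
    if state_info == "patted on head" and behavior_info == "greeting user":
        return +10  # a pat while greeting raises "joy"
    if state_info == "patted on head" and behavior_info == "executing task":
        return 0    # a pat mid-task leaves "joy" unchanged
    return 0
```

The returned delta would then be applied to the 0-100 intensity of the "joy" emotion unit 60A.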
- similarly, the emotion/instinct model unit 51 increases or decreases the intensity of each of the desire units 61A to 61C based on the supplied state recognition information D1 and behavior information D3.
- specifically, intensity increase/decrease functions 66A to 66C, which generate intensity information D4D to D4F for increasing or decreasing the intensities of the desire units 61A to 61C based on the behavior information D3 indicating the behavior of the pet robot 2 and the state recognition information D1, are provided at the preceding stage of the desire units 61A to 61C, and the intensities of the desire units 61A to 61C are increased or decreased according to the intensity information D4D to D4F output from these intensity increase/decrease functions 66A to 66C.
- for example, when behavior information D3 indicating that the robot is looking at a favorite color and state recognition information D1 indicating that it has been still for a while are given to the intensity increase/decrease function 66B, the emotion/instinct model unit 51 increases the intensity of the "curiosity" desire unit 61B. On the other hand, when behavior information D3 indicating that the robot has just walked and is taking a break and state recognition information D1 indicating that it has been still for a while in a featureless environment are given to the intensity increase/decrease function 66B, the emotion/instinct model unit 51 does not change the intensity of the "curiosity" desire unit 61B.
- in this way, the emotion/instinct model unit 51 determines the intensity of each of the desire units 61A to 61C by referring not only to the state recognition information D1 but also to the behavior information D3 indicating the current or past behavior of the pet robot 2. This avoids unnatural instincts, such as increasing the intensity of the "curiosity" desire unit 61B when the robot looks at a favorite color while merely resting after a walk.
- the intensity increase / decrease functions 65 A to 65 C and 66 A to 66 C When the knowledge information D1 and the behavior information D3 are input, the function is such that intensity information D4A to D4F is generated and output in accordance with a preset parameter. -By setting the evening to a different value for each robot 2, the personal robot 2 has a personality, for example, an offspring robot 2 or a bright robot 2 Can be made.
- the action determination mechanism 52 determines the next action based on the state recognition information D1 and the emotion/instinct state information D2, and sends the content of the determined action to the posture transition mechanism 53 as action command information D5.
- specifically, the action determination mechanism 52 represents the history of the state recognition information D1 supplied in the past as an operation state (hereinafter simply referred to as a state), and uses an algorithm called a finite state automaton 70, which has a finite number of states and determines the next action by transitioning from the current state to another state based on the currently supplied state recognition information D1 and the state at that time.
- the action determination mechanism 52 transitions the state each time the state recognition information D1 is supplied and determines the action according to the state after the transition; it thus determines the action by referring not only to the current state recognition information D1 but also, through the state, to past state recognition information D1.
- when the action determination mechanism 52 detects that a predetermined trigger has occurred, it transitions the current state to the next state.
- such triggers include, for example, the time during which the action in the current state has been executed reaching a certain value, specific state recognition information D1 being supplied, or the intensity of a desired unit among the emotion units 60A to 60C and desire units 61A to 61C indicated by the emotion/instinct state information D2 supplied from the emotion/instinct model unit 51 exceeding a predetermined threshold.
- at that time, the action determination mechanism 52 selects the transition destination state based on whether or not the intensity of a desired unit among the emotion units 60A to 60C and desire units 61A to 61C indicated by the emotion/instinct state information D2 supplied from the emotion/instinct model unit 51 exceeds the predetermined threshold. Consequently, even when the same state recognition information D1 is input, the action determination mechanism 52 transitions to different states depending on the intensities of the emotion units 60A to 60C and the desire units 61A to 61C.
- for example, when the action determination mechanism 52 detects, based on the supplied state recognition information D1, that a palm has been held out in front of its eyes, detects, based on the emotion/instinct state information D2, that the intensity of the "anger" emotion unit 60C is at or below a predetermined threshold, and detects, based on the state recognition information D1, that it is "not hungry", that is, that the battery voltage is at or above a predetermined threshold, it generates action command information D5 for performing a "give a paw" operation in response to the palm being held out, and sends this to the posture transition mechanism 53.
- when the action determination mechanism 52 detects, for example, that a palm is held out in front of its eyes, that the intensity of the "anger" emotion unit 60C is at or below the predetermined threshold, and that it is "hungry", that is, that the battery voltage is below the predetermined threshold, it generates action command information D5 for performing an operation such as "licking the palm" and sends this to the posture transition mechanism 53.
- when the action determination mechanism 52 detects, for example, that a palm is held out in front of its eyes and that the intensity of the "anger" emotion unit 60C is at or above the predetermined threshold, it generates action command information D5 for performing an operation such as "looking away", regardless of whether it is "not hungry", that is, regardless of whether the battery voltage is at or above the predetermined threshold, and sends this to the posture transition mechanism 53. Further, when the action determination mechanism 52 detects, based on the supplied state recognition information D1, that the remaining amount of the battery 11 is at or below a predetermined threshold, it generates, based on the emotion/instinct state information D2, action command information D5 for performing a "charge" operation and sends this to the posture transition mechanism 53.
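The palm examples above reduce to a small decision rule over the recognized situation, the "anger" intensity, and the battery state. The threshold values and string labels below are invented for illustration; the branch structure follows the text:

```python
def decide_action(palm_shown, anger_intensity, battery_voltage,
                  anger_threshold=50, voltage_threshold=6.5):
    """Sketch of the action decisions for the palm-in-front-of-eyes examples."""
    if not palm_shown:
        return None
    if anger_intensity >= anger_threshold:
        return "look away"       # angry: ignore the palm, hunger irrelevant
    if battery_voltage >= voltage_threshold:
        return "give a paw"      # calm and not hungry
    return "lick the palm"       # calm but hungry
```

The chosen string would correspond to the action command information D5 sent to the posture transition mechanism 53.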
- the action determination mechanism 52 also determines, based on the intensities of desired units among the emotion units 60A to 60C and desire units 61A to 61C indicated by the emotion/instinct state information D2 supplied from the emotion/instinct model unit 51, the parameters of the action performed in the transition destination state, such as the walking speed, the magnitude and speed of limb movements, and the pitch and loudness of the sound it produces, and generates action command information D5 corresponding to these action parameters and sends it to the posture transition mechanism 53.
- incidentally, the state recognition information D1 provided from the sensor input processing unit 50 is input not only to the emotion/instinct model unit 51 but also to the action determination mechanism 52, because the emotions and instincts it produces differ depending on the timing at which it is input.
- for example, when state recognition information D1 indicating "patted on the head" is supplied, the controller 10 causes the emotion/instinct model unit 51 to generate emotion/instinct state information D2 indicating "happy", and supplies this emotion/instinct state information D2 to the action determination mechanism 52. In this state, when state recognition information D1 indicating "a hand is in front of the eyes" is supplied, the action determination mechanism 52 generates action command information D5 of "gladly give a paw" based on the "happy" emotion/instinct state information D2 and the "a hand is in front of the eyes" state recognition information D1, and sends it to the posture transition mechanism 53.
- meanwhile, when state recognition information D1 indicating that "the battery 11 has almost no remaining capacity" is supplied, the controller 10 causes the emotion/instinct model unit 51 to generate emotion/instinct state information D2 indicating "hungry", and supplies this emotion/instinct state information D2 to the action determination mechanism 52. The action determination mechanism 52 then generates action command information D5 of "stand up and appeal" based on the "hungry" emotion/instinct state information D2 and the state recognition information D1, and sends it to the posture transition mechanism 53.
- based on the action command information D5 supplied from the action determination mechanism unit 52, the posture transition mechanism unit 53 generates posture transition information D6 for transitioning from the current posture to the next posture and sends it to the control mechanism 54.
- the postures to which the robot can transition next from the current posture are determined by physical characteristics of the pet robot 2, such as the shape and weight of the body, hands and feet and the connection state of each part, and by the mechanisms of the actuators 5AA1 to 7A1 and 7A2, such as the directions and angles in which the joints bend. Such transitionable postures are classified into postures that can be reached directly from the current posture and postures that cannot.
- for example, a four-legged pet robot 2 can transition directly from a state of lying down with its limbs sprawled out to a prone position, but cannot transition directly to a standing posture; it requires a two-step operation of first drawing its limbs in to a prone position and then standing up. There are also postures that cannot be executed safely. For example, a four-legged pet robot easily falls over if it tries to raise both front legs in a "banzai" while standing.
- the posture transition mechanism unit 53 therefore registers the transitionable postures in advance. When the action command information D5 supplied from the action determination mechanism unit 52 indicates a posture that can be reached directly, the posture transition mechanism unit 53 sends that action command information D5 as-is to the control mechanism unit 54 as posture transition information D6; when it indicates a posture that cannot be reached directly, it generates posture transition information D6 that first transitions to another reachable posture and then to the target posture, and sends this to the control mechanism 54. The pet robot 2 can thereby avoid forcibly attempting an impossible posture transition or falling over.
- specifically, the posture transition mechanism unit 53 registers in advance the postures that the pet robot 2 can take, and records which pairs of postures allow transition between them.
- for example, using an algorithm called a directed graph 80, the posture transition mechanism 53 expresses the postures that the pet robot 2 can take as nodes NODE1 to NODE5, and connects the transitionable postures, that is, the nodes NODE1 to NODE5, by directed arcs ARC1 to ARC10.
- when the action command information D5 is supplied, the posture transition mechanism 53 searches for a route from the node NODE corresponding to the current posture to the node NODE corresponding to the posture to be taken next, following the directions of the directed arcs ARC, and plans the posture transition by recording in order the nodes NODE on the searched route. The pet robot 2 can thereby carry out the action instructed by the action determination mechanism unit 52 while avoiding forcibly attempting an impossible posture transition or falling over.
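A route search over such a directed graph can be sketched with a breadth-first search. The posture names and arc set below are illustrative, loosely following the lying-down/prone/standing example in the text, not the patent's actual NODE/ARC table:

```python
from collections import deque

# Directed arcs between transitionable postures (hypothetical example graph).
ARCS = {
    "lying":    ["sitting", "prone"],   # sprawled out: can sit or tuck in
    "prone":    ["standing"],
    "sitting":  ["standing"],
    "standing": ["walking", "sitting"],
    "walking":  ["standing"],
}

def plan(current, target):
    """Breadth-first search along directed arcs; returns the posture
    sequence from the current node to the target node, or None."""
    queue = deque([[current]])
    seen = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in ARCS.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

Recording the nodes on the found route in order corresponds to the posture transition plan sent on as posture transition information D6.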
- for example, when the action command information D5 of "sit" is supplied while the node NODE2 indicating a "lying down" posture is current, the posture transition mechanism unit 53 exploits the fact that a direct transition is possible from the node NODE2 to the node NODE5 indicating a "sitting" posture, and gives the posture transition information D6 of "sit" to the control mechanism unit 54.
- in contrast, when action command information D5 of "walk" is supplied, the posture transition mechanism unit 53 searches for a route from the "lying down" node NODE2 to the "walking" node NODE4 and makes a posture transition plan; as a result, it generates posture transition information D6 that issues a "stand up" instruction followed by a "walk" instruction, and sends this to the control mechanism 54.
- the control mechanism unit 54 generates a control signal S10 for driving the actuators 5AA1 to 7A1 and 7A2 based on the posture transition information D6, and sends it to the actuators 5AA1 to 7A1 and 7A2 to drive them, causing the pet robot 2 to perform the desired operation.
- when in operation, the controller 10 in the pet robot 2 enters the charging request processing procedure RT1 shown in FIG. 19 from step SP0, and in the following step SP1 detects the remaining amount of the battery 11 built into the pet robot 2 using the battery sensor 12.
- the controller 10 then proceeds to step SP2 and determines whether or not the remaining amount of the battery 11 is "Low-Low", that is, 20 [%] or less. If a negative result is obtained in step SP2, this means that the remaining capacity of the battery 11 is "Full" (80 [%] or more), "Middle-Full" (80 to 50 [%]), "Middle" (50 to 25 [%]) or "Low" (25 to 20 [%]). At this time, the controller 10 controls the action of the pet robot 2 according to the respective stage and then returns to step SP1.
- specifically, the controller 10 controls the pet robot to walk with a relatively large stride and speed when the remaining amount of the battery 11 is "Full" (80 [%] or more), and when the remaining amount of the battery 11 is 80 to 20 [%], that is, from "Middle-Full" through "Middle" to "Low", controls the units 5A to 5D, 6 and 7 so that the momentum of the pet robot 2 decreases as the remaining amount decreases. On the other hand, if an affirmative result is obtained in step SP2, the controller 10 proceeds to step SP4, shifts the pet robot 2 to the station transfer posture described above, appeals to the user for charging, and then proceeds to step SP5 to end the charging request processing procedure RT1.
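One pass of the RT1 loop above can be sketched as a function of the detected battery percentage. The function name and behavior strings are our own paraphrases; the stage boundaries follow the text:

```python
def charging_request_step(battery_percent):
    """One pass of the charging-request procedure RT1 (hypothetical sketch)."""
    if battery_percent <= 20:                 # "Low-Low": step SP4
        return "shift to station transfer posture and appeal for charging"
    if battery_percent >= 80:                 # "Full"
        return "walk with relatively large stride and speed"
    # "Middle-Full" down through "Middle" to "Low"
    return "reduce momentum as remaining charge decreases"
```

In the actual procedure, the non-appeal branches return to step SP1 and the battery level is sampled again.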
- at this time, the controller 10 drives and controls the leg units 5A to 5D so that the pet robot 2 assumes the station transfer posture, drives the tail unit 7 so that the tail swings in a predetermined direction with a predetermined amplitude and speed, and drives and controls the head unit 6 so that the LED provided at the position of the "eyes" lights in a predetermined light emission pattern, while outputting a predetermined warning sound, for example "I'm hungry!", from the speaker 18 built in at the position of the "ears".
- when the pet robot 2 actually operates, the controller 10 in the pet robot 2 enters the internal temperature adjustment processing procedure RT2 shown in FIG. 20 from step SP10, and in the following step SP11 detects the internal temperature of the body unit 4 of the pet robot 2 using the heat sensor 13.
- the controller 10 then proceeds to step SP12 and determines whether or not the internal temperature of the pet robot 2 has reached a predetermined dangerous temperature (for example, 60 [°C], that is, the heat resistance temperature at which a general secondary battery can be used); it returns to step SP11 when a negative result is obtained, and proceeds to step SP13 when an affirmative result is obtained.
- in step SP13, the controller 10 determines whether the pet robot 2 is currently in a "lying down" or "prone" posture. If an affirmative result is obtained, the controller 10 proceeds to step SP14, drives and controls the leg units 5A to 5D of the pet robot 2 to shift it to a "standing" posture, and then returns to step SP11 to repeat the same processing as described above.
- if, on the other hand, a negative result is obtained in step SP13, this means that the battery 11 built into the body unit 4 of the pet robot 2 has failed, or that something is stuck to or wrapped around the body unit 4 so that the outlet 4AX and/or the inlet 4BX is blocked. At this time, the controller 10 proceeds to step SP15, appeals the danger to the user, then returns to step SP11 and repeats the same processing as described above.
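One pass of the RT2 loop above can likewise be sketched as a function of the measured internal temperature and current posture. Behavior strings and posture labels paraphrase the text; the 60 °C default follows it:

```python
def temperature_step(internal_temp_c, posture, danger_temp_c=60.0):
    """One pass of the internal temperature adjustment procedure RT2 (sketch)."""
    if internal_temp_c < danger_temp_c:
        return "keep monitoring"          # negative at SP12: back to SP11
    if posture in ("lying down", "prone"):
        return "stand up to ventilate"    # affirmative at SP13: step SP14
    return "appeal danger to user"        # negative at SP13: step SP15
```

Standing up opens the outlet 4AX and inlet 4BX to airflow, which is why the "lying down"/"prone" branch shifts the posture instead of warning the user.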
- at this time, the controller 10 drives and controls the leg units 5A to 5D and the tail unit 7 so that the pet robot 2 stands upright and immobile, and also drives and controls the head unit 6 so that the LED provided at the "eye" position flashes in a predetermined light emission pattern while a predetermined warning sound is output from the speaker 18 built in at the "ear" position.
- in the above configuration, the user can act on this pet robot 2 by "hitting" or "patting" it, issue commands such as "walk" or "lie down" by voice or using a sound commander, or place objects with distinctive colors and shapes in its action area; the pet robot 2 converts these recognition results into the various emotions held by real animals.
- the pet robot 2 then converts the emotions of "joy", "sadness" and "anger" into drive control of the actuators 5AA1 to 7A1 and 7A2 of the leg units 5A to 5D, the head unit 6 and the tail unit 7 according to their intensities, thereby performing the same postures and movements as the emotional expressions of a real animal.
- in particular, the tail 7T connected to the gearbox 22 in the base 7B can be moved not only vertically and horizontally but also in a turning direction, and can also be held stationary while bent to a desired degree in the vertical direction. The tail unit 7 can therefore produce a variety of emotional expressions based on the swinging or bending direction of the tail 7T, the amplitude and speed of the swing, the stationary position when bent, and instantaneous movements.
- for example, in the case of "joy", the tail unit 7 swings the tail 7T in the left-right direction with an amplitude and speed proportional to the degree of "joy"; in the case of "sadness", the tail unit 7 keeps the tail 7T still, bent so as to hang downward; and in the case of "anger", the tail unit 7 swings the tail 7T in the vertical direction with an amplitude and speed proportional to the degree of "anger".
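The emotion-to-tail mapping above can be sketched as a small lookup that scales amplitude and speed with the 0-100 intensity. The function name, tuple layout, and linear scaling are our own assumptions; the gesture directions follow the text:

```python
def tail_gesture(emotion, intensity):
    """Map an emotion and its 0-100 intensity to a tail 7T gesture,
    returned as (motion, amplitude, speed) with amplitude/speed in 0..1."""
    s = max(0, min(100, intensity)) / 100.0
    if emotion == "joy":
        return ("swing left-right", s, s)
    if emotion == "anger":
        return ("swing up-down", s, s)
    if emotion == "sadness":
        return ("hang down, stationary", 0.0, 0.0)
    return None
```

The returned amplitude and speed would be realized by the actuators 7A1 and 7A2 in the base unit 7B.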
- further, in this pet robot system 1, the remaining amount of the battery 11 incorporated in the body unit 4 is detected in stages while the pet robot 2 is operating, and by reducing the robot's momentum as the remaining amount decreases, the power consumption of the battery 11 can be saved.
- when the remaining amount of the battery 11 falls to 20 [%] or less, the leg units 5A to 5D are driven and controlled to shift the pet robot 2 to the station transfer posture, and the head unit 6 and/or the tail unit 7 are driven and controlled so as to blink the LED in a predetermined light emission pattern, output a predetermined warning sound from the speaker 18, and swing the tail 7T in a predetermined direction with a predetermined amplitude and speed. The pet robot 2 can thereby directly notify the user that the battery 11 is almost empty. As a result, the pet robot 2 can appeal its "appetite" to the user like a real animal, while being prevented from suddenly stopping as if dead.
- further, in this pet robot system 1, the internal temperature of the body unit 4 is detected during the operation of the pet robot 2, and when the internal temperature reaches the predetermined dangerous temperature while the pet robot 2 is currently in a "lying down" or "prone" posture, the leg units 5A to 5D are driven and controlled to shift the pet robot 2 to a "standing" posture, so that the inside of the body unit 4 is spontaneously ventilated. As a result, the pet robot 2 can be prevented from breaking down through its own operation.
- otherwise, the head unit 6 and/or the tail unit 7 are driven and controlled so as to flash the LED in a predetermined light emission pattern, output a predetermined warning sound from the speaker 18, and swing the tail 7T in a predetermined direction with a predetermined amplitude and speed, so that the pet robot 2 can directly notify the user that it is in danger, and the possibility of breakdown can be significantly reduced.
- according to the above configuration, the pet robot 2 converts the user's actions, command inputs and its own behavior into emotions like those of a real animal, and reflects those emotions in the movement and posture of the tail 7T of the tail unit 7. This gives the user a greater sense of intimacy and satisfaction, thereby realizing a pet robot 2 whose amusement value is significantly improved.
- also according to the above configuration, when it is detected during the operation of the pet robot 2 that the battery 11 built into the body unit 4 is almost exhausted, the pet robot 2 is shifted to the station transfer posture and the head unit 6 and/or the tail unit 7 are driven and controlled to express this to the user, notifying the user of its "appetite" as a real animal would. This gives the user a sense of intimacy and satisfaction, thereby realizing a pet robot system 1 whose amusement value is significantly improved.
- further according to the above configuration, when the internal temperature of the body unit 4 reaches the predetermined dangerous temperature during the operation of the pet robot 2, the robot shifts to a "standing" posture if it is currently in a "lying down" or "prone" posture, and otherwise drives the head unit 6 and/or the tail unit 7 to express the danger to the user. The battery 11 inside the pet robot 2 can thereby be prevented from failing, realizing a pet robot system 1 that can ensure safety.

(8) Other embodiments
- in the above embodiment, the present invention is applied to the tail 7T of the tail unit 7 in the pet robot 2 as a movable portion whose one end is rotatably connected in at least one axis. However, the present invention is not limited to this; it may be applied, for example, to the antennae of an insect-like robot, and can be widely applied to other movable portions capable of various operations such as bending, swinging and/or turning.
- in the above embodiment, the operation control means for operating the tail 7T of the tail unit 7 comprises the controller 10, the actuators 7A1 and 7A2 in the base unit 7B driven under its control, and the gear transmission mechanism 21 and gearbox 22 in the base portion 7B. However, the present invention is not limited to this, and various other configurations can be applied.
- in the above embodiment, the state recognition information D1 consisting of the recognition results of the microphone 15, the CCD camera 16 and the touch sensor 17 is applied as the input information for determining the operation of the tail 7T. However, the present invention is not limited to this; other information, such as information on actions from the user or information on the environment around the pet robot 2, may be applied as the input information.
- in the above embodiment, the controller 10 provided in the body unit 4 is applied as the emotion/instinct model changing means that determines the operation of the tail 7T of the tail unit 7 by changing the emotion/instinct model.
- in this case, emotion units indicating emotions such as "surprise", "fear" and "disgust" may be added to the emotion units 60A to 60C constituting the basic emotion group 60 described above.
- in this case, when expressing "surprise", the tail unit 7 may swing the tail 7T left-right or up-down for a moment with an amplitude proportional to the degree of "surprise"; when expressing "fear", the tail 7T may be stopped while extended straight in the horizontal direction; and when expressing "disgust", the tail unit 7 may turn the tail 7T with an amplitude and speed proportional to the degree of "disgust".
- the present invention may also be applied not only to emotional expressions but to instinct expressions. For example, when expressing "hunger" or "sleepiness", the tail 7T may be put in the same posture as for "sadness" described above; when expressing "fullness" or "desire for exercise", the tail 7T may be operated in proportion to the degree, as for "joy" described above. In this way, the expressive power of the pet robot 2 can be significantly increased.
- the contents expressed by the movement and posture of the tail 7T of the tail unit 7 are emotions and instincts, like those of a real animal, determined using the microphone 15, the CCD camera 16 and the touch sensor 17.
- the present invention is not limited to this, and may be applied to the state expression of various hardware constituting the pet robot 2.
- the state of the various hardware constituting the pet robot 2 may be expressed based on the swinging or bending direction of the tail 7T, the amplitude and speed at the time of swinging, the stationary position at the time of bending, instantaneous movements, and the like.
- a feeling of closeness and satisfaction can be given to a user by presenting an animal-like reaction to the user.
- an external air temperature measuring unit (not shown) may be provided in the pet robot 2, and the tail part 7T may be made to perform a desired posture and operation according to the measurement result.
- the controller 10 in the body unit 4 may cause the tail portion 7T to perform a desired posture and operation when writing or reading a program to or from the memory 10A.
- a communication means (not shown) may be provided in the body unit 4, and only during data transmission/reception via the communication means, the tail 7T may be moved to a desired posture or made to perform an operation such as shaking in accordance with the data transmission amount and the communication state.
- the present invention is applied to a quadruped walking robot configured as shown in FIG. 2.
- the present invention is not limited to this.
- the present invention can be widely applied to robots having various other configurations such as a robot having a built-in power source.
- at least one antenna may be provided in the head unit 6 and operated in the same manner as the tail 7T.
- an action may be determined using an algorithm called a state machine.
- a new state may be generated each time the state recognition information D1 is supplied, and the action may be determined according to the generated state.
- a plurality of states may be selected as transition destination candidates based on the currently supplied state recognition information D1 and the state at that time, and the transition destination state may be determined at random from among the selected states by random numbers; that is, the action may be determined using an algorithm called a probabilistic finite automaton.
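The probabilistic-finite-automaton idea above can be sketched as follows: the current state and the recognition input select a set of candidate destinations, and one destination is drawn at random by weight. The state names, inputs, and probabilities here are invented for illustration and are not from the patent.

```python
import random

# Hypothetical transition table: (current state, input) -> weighted candidates.
TRANSITIONS = {
    ("idle", "ball seen"): [("approach", 0.7), ("bark", 0.2), ("idle", 0.1)],
    ("idle", "patted"): [("wag tail", 0.8), ("sit", 0.2)],
    ("approach", "ball seen"): [("kick", 0.6), ("approach", 0.4)],
}

def next_state(state: str, recognition: str) -> str:
    """Select candidate destinations, then draw one at random by weight."""
    # With no matching entry, stay in the current state.
    candidates = TRANSITIONS.get((state, recognition), [(state, 1.0)])
    states, weights = zip(*candidates)
    return random.choices(states, weights=weights)[0]

state = "idle"
state = next_state(state, "ball seen")
print(state)  # one of "approach", "bark", or "idle"
```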
- the present invention is not limited to this, and other various configurations may be applied.
- the controller controls the pet robot 2 to move to a predetermined posture and / or to perform a predetermined operation.
- the present invention is not limited to this; as long as the head unit 6 and/or the tail unit 7 are driven and controlled based on the emotion and instinct model unit 51 read from the memory 10A, various other configurations may be widely applied.
- the pet robot 2 is caused to transition to the station transfer posture when the remaining amount of the battery 11 becomes equal to or lower than a predetermined level.
- the present invention is not limited to this; any other posture suitable for charging may be used.
- the remaining amount of the battery 11 is detected at five levels, and the operation of the pet robot 2 is reduced in accordance with each of the five levels.
- the present invention is not limited to this, and the remaining amount of the battery 11 for causing the pet robot 2 to transition to a predetermined posture and / or to perform a predetermined operation may be freely set.
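The five-level battery handling described above can be sketched as a threshold table: as the remaining amount falls, the robot's activity is reduced, and at the lowest level it transitions to the charging posture. The percentage thresholds and behavior names below are assumptions for illustration; the patent leaves them freely settable.

```python
# Hypothetical remaining-battery thresholds (percent) and the behavior
# assigned to each of the five levels.
LEVELS = [
    (80, "full activity"),
    (60, "normal activity"),
    (40, "reduced movement"),
    (20, "minimal movement"),
    (0,  "station transfer posture"),  # transition to the charging posture
]

def battery_behavior(remaining_percent: float) -> str:
    """Return the behavior for the first threshold the remaining amount meets."""
    for threshold, behavior in LEVELS:
        if remaining_percent >= threshold:
            return behavior
    return LEVELS[-1][1]

print(battery_behavior(45))  # "reduced movement"
print(battery_behavior(15))  # "station transfer posture"
```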
- the thermal sensor 13 constituting the internal sensor unit 14 of the body unit 4 is applied as a temperature detecting unit for detecting the internal temperature of the body unit 4.
- the present invention is not limited to this, and various other configurations may be applied.
- the case where the controller 10 is applied as the motion control means for causing the pet robot 2 to transition to a predetermined posture and/or to exhibit a predetermined operation has been described; however, the present invention is not limited to this, and as long as the leg units 5A to 5D are drive-controlled based on the emotion and instinct model unit 51 read from the memory 10A and the user can thereby be notified by a posture, various other configurations may be widely applied.
- when the internal temperature of the body unit 4 becomes equal to or higher than a predetermined temperature, the pet robot 2 is caused to transition to an operation in which the body unit 4 is raised (that is, a “standing” posture); however, transition to various other operations may be performed as long as the ventilation path formed of the exhaust port 4AX and the intake port 4BX in the body unit 4 is opened.
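The internal-temperature handling can be sketched as a simple threshold test with hysteresis: above the limit, the robot transitions to a posture that opens the ventilation path; once standing, it stays up until it has cooled by a margin. The threshold and margin values are assumptions for illustration, and the hysteresis is an added design choice, not stated in the patent.

```python
# Hypothetical temperature limit and cool-down margin (degrees Celsius).
TEMP_LIMIT_C = 55.0
COOL_MARGIN_C = 5.0

def needs_ventilation(internal_temp_c: float, currently_standing: bool) -> bool:
    """Decide whether the robot should be in the vent-opening posture."""
    if internal_temp_c >= TEMP_LIMIT_C:
        return True  # raise the body, opening the exhaust/intake path
    # Hysteresis: once standing, stay up until cooled below the margin.
    return currently_standing and internal_temp_c > TEMP_LIMIT_C - COOL_MARGIN_C

print(needs_ventilation(60.0, False))  # True: over the limit, stand up
print(needs_ventilation(52.0, True))   # True: still cooling, stay standing
```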
- the present invention is not limited to this, and the behavior of the pet robot 2 may be controlled according to various other internal states of the pet robot 2 or the external environment.
- the gear ratio or the rotation speed of each of the actuators 5AA1 to 7A2 may be selected so as to be in the most efficient state, and the pet robot 2 may be operated accordingly.
- the action determination mechanism 52 may transition to the most energy efficient process among the state transition processes in the finite state automaton.
- the pet robot 2 may be caused to walk in a relatively cool direction when the external temperature rises.
- the robot apparatus and the control method thereof can be applied to a pet robot, an amusement robot, and the like.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/743,293 US6415203B1 (en) | 1999-05-10 | 2000-05-10 | Toboy device and method for controlling the same |
JP2000616969A JP4512963B2 (ja) | 1999-05-10 | 2000-05-10 | ロボット装置及びその制御方法 |
KR1020017000437A KR20010053488A (ko) | 1999-05-10 | 2000-05-10 | 로봇 장치 및 그 제어방법 |
EP00925595A EP1112821A4 (en) | 1999-05-10 | 2000-05-10 | TOY AND METHOD FOR CONTROLLING THEREOF |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP11/165756 | 1999-05-10 | ||
JP12927599 | 1999-05-10 | ||
JP16575699 | 1999-05-10 | ||
JP11/129275 | 1999-05-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000067960A1 true WO2000067960A1 (fr) | 2000-11-16 |
Family
ID=26464720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2000/002988 WO2000067960A1 (fr) | 1999-05-10 | 2000-05-10 | Dispositif jouet et sa technique de commande |
Country Status (6)
Country | Link |
---|---|
US (1) | US6415203B1 (ja) |
EP (1) | EP1112821A4 (ja) |
JP (1) | JP4512963B2 (ja) |
KR (1) | KR20010053488A (ja) |
CN (1) | CN1205002C (ja) |
WO (1) | WO2000067960A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002154083A (ja) * | 2000-11-17 | 2002-05-28 | Honda Motor Co Ltd | 人間型ロボットの電装品ボックス冷却構造 |
US7162331B2 (en) | 2002-07-24 | 2007-01-09 | Fujitsu Limited | Power supply control device and method for mobile robot |
JP2008307668A (ja) * | 2007-06-18 | 2008-12-25 | Honda Motor Co Ltd | 移動ロボットの駆動装置 |
US8072426B2 (en) | 2004-08-11 | 2011-12-06 | Pixart Imaging Inc. | Interactive device capable of improving image processing |
JP2012245612A (ja) * | 2011-05-30 | 2012-12-13 | Samsung Electronics Co Ltd | ロボット及びその制御方法 |
US9024880B2 (en) | 2004-08-11 | 2015-05-05 | Pixart Imaging Inc. | Interactive system capable of improving image processing |
WO2016002673A1 (ja) * | 2014-07-01 | 2016-01-07 | シャープ株式会社 | 姿勢制御装置、ロボット、およびプログラム |
JP2017196691A (ja) * | 2016-04-27 | 2017-11-02 | パナソニックIpマネジメント株式会社 | ロボット |
JPWO2019175936A1 (ja) * | 2018-03-12 | 2020-12-03 | 株式会社ソニー・インタラクティブエンタテインメント | ロボット |
Families Citing this family (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10289006A (ja) * | 1997-04-11 | 1998-10-27 | Yamaha Motor Co Ltd | 疑似感情を用いた制御対象の制御方法 |
US8788092B2 (en) | 2000-01-24 | 2014-07-22 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8412377B2 (en) | 2000-01-24 | 2013-04-02 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US6956348B2 (en) | 2004-01-28 | 2005-10-18 | Irobot Corporation | Debris sensor for cleaning apparatus |
JP2002127059A (ja) * | 2000-10-20 | 2002-05-08 | Sony Corp | 行動制御装置および方法、ペットロボットおよび制御方法、ロボット制御システム、並びに記録媒体 |
TWI236610B (en) * | 2000-12-06 | 2005-07-21 | Sony Corp | Robotic creature device |
US7571511B2 (en) | 2002-01-03 | 2009-08-11 | Irobot Corporation | Autonomous floor-cleaning robot |
US6690134B1 (en) | 2001-01-24 | 2004-02-10 | Irobot Corporation | Method and system for robot localization and confinement |
JP2002222033A (ja) * | 2001-01-29 | 2002-08-09 | Nec Corp | 省電力システム、タスク省電力処理方法、及びそのプログラム |
US8396592B2 (en) | 2001-06-12 | 2013-03-12 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US7663333B2 (en) | 2001-06-12 | 2010-02-16 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US6507773B2 (en) * | 2001-06-14 | 2003-01-14 | Sharper Image Corporation | Multi-functional robot with remote and video system |
JP4689107B2 (ja) * | 2001-08-22 | 2011-05-25 | 本田技研工業株式会社 | 自律行動ロボット |
KR100624403B1 (ko) * | 2001-10-06 | 2006-09-15 | 삼성전자주식회사 | 인체의 신경계 기반 정서 합성 장치 및 방법 |
US9128486B2 (en) | 2002-01-24 | 2015-09-08 | Irobot Corporation | Navigational control system for a robotic device |
KR20030068886A (ko) * | 2002-02-18 | 2003-08-25 | (주)로보옵틱스 | 이동로봇의 충전제어장치 및 방법 |
DE10231391A1 (de) * | 2002-07-08 | 2004-02-12 | Alfred Kärcher Gmbh & Co. Kg | Bodenbearbeitungssystem |
US8428778B2 (en) | 2002-09-13 | 2013-04-23 | Irobot Corporation | Navigational control system for a robotic device |
US8386081B2 (en) | 2002-09-13 | 2013-02-26 | Irobot Corporation | Navigational control system for a robotic device |
US7805220B2 (en) | 2003-03-14 | 2010-09-28 | Sharper Image Acquisition Llc | Robot vacuum with internal mapping system |
US20050234592A1 (en) * | 2004-01-15 | 2005-10-20 | Mega Robot, Inc. | System and method for reconfiguring an autonomous robot |
US7332890B2 (en) | 2004-01-21 | 2008-02-19 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US20060020369A1 (en) * | 2004-03-11 | 2006-01-26 | Taylor Charles E | Robot vacuum cleaner |
DE112005000738T5 (de) | 2004-03-29 | 2007-04-26 | Evolution Robotics, Inc., Pasadena | Verfahren und Vorrichtung zur Positionsbestimmung unter Verwendung von reflektierten Lichtquellen |
KR101142564B1 (ko) | 2004-06-24 | 2012-05-24 | 아이로보트 코퍼레이션 | 자동 로봇 장치용의 원격 제어 스케줄러 및 방법 |
US8972052B2 (en) | 2004-07-07 | 2015-03-03 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US7706917B1 (en) | 2004-07-07 | 2010-04-27 | Irobot Corporation | Celestial navigation system for an autonomous robot |
KR101240732B1 (ko) | 2005-02-18 | 2013-03-07 | 아이로보트 코퍼레이션 | 습식 및 건식 청소를 위한 자동 표면 청소 로봇 |
US8392021B2 (en) | 2005-02-18 | 2013-03-05 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US7620476B2 (en) | 2005-02-18 | 2009-11-17 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US8930023B2 (en) | 2009-11-06 | 2015-01-06 | Irobot Corporation | Localization by learning of wave-signal distributions |
WO2007041295A2 (en) | 2005-09-30 | 2007-04-12 | Irobot Corporation | Companion robot for personal interaction |
EP2816434A3 (en) | 2005-12-02 | 2015-01-28 | iRobot Corporation | Autonomous coverage robot |
ATE534941T1 (de) | 2005-12-02 | 2011-12-15 | Irobot Corp | Abdeckungsrobotermobilität |
ES2334064T3 (es) | 2005-12-02 | 2010-03-04 | Irobot Corporation | Robot modular. |
ES2522926T3 (es) | 2005-12-02 | 2014-11-19 | Irobot Corporation | Robot Autónomo de Cubrimiento |
EP2544065B1 (en) | 2005-12-02 | 2017-02-08 | iRobot Corporation | Robot system |
EP2394553B1 (en) | 2006-05-19 | 2016-04-20 | iRobot Corporation | Removing debris from cleaning robots |
US8417383B2 (en) | 2006-05-31 | 2013-04-09 | Irobot Corporation | Detecting robot stasis |
EP2123336A4 (en) * | 2007-01-23 | 2010-07-28 | Robo Catcher Partners Co Ltd | PREMIUM SLOW-MAKING AUTOMAT AND HUMANOIDER TWO-DIFFERENT SCRIPTURES |
KR101301834B1 (ko) | 2007-05-09 | 2013-08-29 | 아이로보트 코퍼레이션 | 소형 자율 커버리지 로봇 |
CN101406756A (zh) * | 2007-10-12 | 2009-04-15 | 鹏智科技(深圳)有限公司 | 表达情感的电子玩具及其情感表达方法、发光单元控制装置 |
JP4536134B2 (ja) * | 2008-06-02 | 2010-09-01 | 株式会社コナミデジタルエンタテインメント | ネットワークを利用したゲームシステム、ゲームプログラム、ゲーム装置、およびネットワークを利用したゲーム制御方法 |
US10163175B2 (en) | 2009-02-25 | 2018-12-25 | Humana Inc. | System and method for improving healthcare through social robotics |
EP3192419B1 (en) | 2010-02-16 | 2021-04-07 | iRobot Corporation | Vacuum brush |
US8483873B2 (en) * | 2010-07-20 | 2013-07-09 | Innvo Labs Limited | Autonomous robotic life form |
US20120256752A1 (en) * | 2011-04-06 | 2012-10-11 | James William Musser | System and method to extend operating life of rechargable batteries using battery charge management |
CN104440925B (zh) * | 2014-11-27 | 2016-05-04 | 国家康复辅具研究中心 | 一种宠物型陪护机器人及系统 |
CN108885436B (zh) | 2016-01-15 | 2021-12-14 | 美国iRobot公司 | 自主监视机器人系统 |
CN107225577A (zh) * | 2016-03-25 | 2017-10-03 | 深圳光启合众科技有限公司 | 应用在智能机器人上的触觉感知方法以及触觉感知装置 |
JP6343800B2 (ja) * | 2016-06-14 | 2018-06-20 | Groove X株式会社 | 涼しさを求める自律行動型ロボット |
WO2017218234A1 (en) * | 2016-06-15 | 2017-12-21 | Irobot Corporation | Systems and methods to control an autonomous mobile robot |
US10100968B1 (en) | 2017-06-12 | 2018-10-16 | Irobot Corporation | Mast systems for autonomous mobile robots |
CN111093913A (zh) * | 2017-09-11 | 2020-05-01 | Groove X 株式会社 | 注视对方的行为自主型机器人 |
CN110653813A (zh) * | 2018-06-29 | 2020-01-07 | 深圳市优必选科技有限公司 | 一种机器人控制方法、机器人及计算机存储介质 |
CN108901911A (zh) * | 2018-07-24 | 2018-11-30 | 深圳市必发达科技有限公司 | 宠物远程抚慰方法 |
US11110595B2 (en) | 2018-12-11 | 2021-09-07 | Irobot Corporation | Mast systems for autonomous mobile robots |
JP2024048495A (ja) * | 2022-09-28 | 2024-04-09 | カシオ計算機株式会社 | 機器の制御装置、機器の制御方法及びプログラム |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6224988A (ja) * | 1985-07-23 | 1987-02-02 | 志井田 孝 | 感情をもつロボツト |
JPH06117393A (ja) * | 1992-10-06 | 1994-04-26 | Mitsubishi Electric Corp | ロボットの制御装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59201115A (ja) * | 1983-04-28 | 1984-11-14 | Sanyo Electric Co Ltd | 自走式作業ロボツトの充電装置 |
US4679152A (en) * | 1985-02-20 | 1987-07-07 | Heath Company | Navigation system and method for a mobile robot |
US4698775A (en) * | 1985-05-17 | 1987-10-06 | Flexible Manufacturing Systems, Inc. | Self-contained mobile reprogrammable automation device |
JP2906773B2 (ja) * | 1991-09-26 | 1999-06-21 | 株式会社安川電機 | 産業用ロボットの制御装置 |
JPH05111891A (ja) * | 1991-10-21 | 1993-05-07 | Mitsubishi Electric Corp | ロボツト制御装置 |
JP3293952B2 (ja) * | 1992-05-26 | 2002-06-17 | 本田技研工業株式会社 | 衝撃吸収手段を備えた脚式歩行ロボット |
JPH07325620A (ja) * | 1994-06-02 | 1995-12-12 | Hitachi Ltd | 知能ロボット装置及び知能ロボットシステム |
JP3413694B2 (ja) * | 1995-10-17 | 2003-06-03 | ソニー株式会社 | ロボット制御方法およびロボット |
JPH10289006A (ja) * | 1997-04-11 | 1998-10-27 | Yamaha Motor Co Ltd | 疑似感情を用いた制御対象の制御方法 |
JP3655056B2 (ja) * | 1997-08-04 | 2005-06-02 | 本田技研工業株式会社 | 脚式移動ロボットの制御装置 |
JPH11126017A (ja) * | 1997-08-22 | 1999-05-11 | Sony Corp | 記憶媒体、ロボット、情報処理装置、並びに電子ペットシステム |
US6249780B1 (en) * | 1998-08-06 | 2001-06-19 | Yamaha Hatsudoki Kabushiki Kaisha | Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object |
DE60130822T2 (de) * | 2000-01-11 | 2008-07-10 | Yamaha Corp., Hamamatsu | Vorrichtung und Verfahren zur Erfassung der Bewegung eines Spielers um interaktives Musikspiel zu steuern |
2000
- 2000-05-10 JP JP2000616969A patent/JP4512963B2/ja not_active Expired - Fee Related
- 2000-05-10 CN CNB00800823XA patent/CN1205002C/zh not_active Expired - Fee Related
- 2000-05-10 KR KR1020017000437A patent/KR20010053488A/ko not_active Application Discontinuation
- 2000-05-10 US US09/743,293 patent/US6415203B1/en not_active Expired - Fee Related
- 2000-05-10 EP EP00925595A patent/EP1112821A4/en not_active Withdrawn
- 2000-05-10 WO PCT/JP2000/002988 patent/WO2000067960A1/ja not_active Application Discontinuation
Non-Patent Citations (4)
Title |
---|
MASAHIRO FUJITA ET AL.: "Reconfigurable physical agents", PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS, 9 May 1998 (1998-05-09), pages 54 - 61, XP002933751 * |
MASAHIRO FUJITA: "Robot entertainment: Kogata 4 kyaku jiritsu robot", TRANSACTIONS OF NIPPON ROBOT, vol. 16, no. 3, 15 April 1998 (1998-04-15), pages 31 - 32, XP002933752 * |
See also references of EP1112821A4 * |
TETSUYA OGATA ET AL.: "Robot no shintaisei ni motozuku kanjo model to naibu hyosho kakutoku model", PROCEEDINGS DISTRIBUTED AT LECTURE MEETING OF NIPPON ROBOTICS, MECHATRONICS, vol. 1998, no. PT1, 26 June 1998 (1998-06-26), pages 2CII4.3(1) - 2CII4.3(2), XP002933753 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002154083A (ja) * | 2000-11-17 | 2002-05-28 | Honda Motor Co Ltd | 人間型ロボットの電装品ボックス冷却構造 |
US7162331B2 (en) | 2002-07-24 | 2007-01-09 | Fujitsu Limited | Power supply control device and method for mobile robot |
US9024880B2 (en) | 2004-08-11 | 2015-05-05 | Pixart Imaging Inc. | Interactive system capable of improving image processing |
US8072426B2 (en) | 2004-08-11 | 2011-12-06 | Pixart Imaging Inc. | Interactive device capable of improving image processing |
US8760390B2 (en) | 2004-08-11 | 2014-06-24 | Pixart Imaging Inc. | Interactive device capable of improving image processing |
JP2008307668A (ja) * | 2007-06-18 | 2008-12-25 | Honda Motor Co Ltd | 移動ロボットの駆動装置 |
JP2012245612A (ja) * | 2011-05-30 | 2012-12-13 | Samsung Electronics Co Ltd | ロボット及びその制御方法 |
WO2016002673A1 (ja) * | 2014-07-01 | 2016-01-07 | シャープ株式会社 | 姿勢制御装置、ロボット、およびプログラム |
JPWO2016002673A1 (ja) * | 2014-07-01 | 2017-04-27 | シャープ株式会社 | 姿勢制御装置、ロボット、プログラム、および姿勢制御方法 |
US10350763B2 (en) | 2014-07-01 | 2019-07-16 | Sharp Kabushiki Kaisha | Posture control device, robot, and posture control method |
JP2017196691A (ja) * | 2016-04-27 | 2017-11-02 | パナソニックIpマネジメント株式会社 | ロボット |
WO2017187965A1 (ja) * | 2016-04-27 | 2017-11-02 | パナソニックIpマネジメント株式会社 | ロボット |
JPWO2019175936A1 (ja) * | 2018-03-12 | 2020-12-03 | 株式会社ソニー・インタラクティブエンタテインメント | ロボット |
Also Published As
Publication number | Publication date |
---|---|
KR20010053488A (ko) | 2001-06-25 |
JP4512963B2 (ja) | 2010-07-28 |
CN1205002C (zh) | 2005-06-08 |
CN1304346A (zh) | 2001-07-18 |
EP1112821A1 (en) | 2001-07-04 |
US6415203B1 (en) | 2002-07-02 |
EP1112821A4 (en) | 2005-10-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000067960A1 (fr) | Dispositif jouet et sa technique de commande | |
US20050216121A1 (en) | Robot apparatus and emotion representing method therefor | |
WO2017191765A1 (ja) | ロボット | |
WO2002030628A1 (fr) | Appareil robotise et procede permettant de le commander | |
US8057275B2 (en) | Toy with pivoting portions capable of rolling over and methods thereof | |
JP2001025984A (ja) | ロボット装置及びその制御方法並びに記録媒体 | |
JP4617428B2 (ja) | 動作生成システム | |
JP2003071773A (ja) | ロボット装置 | |
CN110254554B (zh) | 一种老年人关怀机器人 | |
KR20010101235A (ko) | 로봇 장치 충전 시스템, 로봇 장치, 충전 장치, 로봇장치의 충전 방법 및 기록 매체 | |
JP4757979B2 (ja) | 電子玩具 | |
JP2004130427A (ja) | ロボット装置及びロボット装置の動作制御方法 | |
CN207950641U (zh) | 可编程全自动变形机器人 | |
JP2003191187A (ja) | ロボット装置及びその制御方法 | |
CN209221483U (zh) | 一种主动攻击式的娱乐机器人 | |
JP4649806B2 (ja) | ロボット装置及びロボット装置の衝撃吸収方法 | |
JP2001157983A (ja) | ロボット装置及びロボット装置の性格判別方法 | |
JP2002178282A (ja) | ロボット装置及びその制御方法 | |
JP2000218065A (ja) | 情報処理装置及び情報処理方法 | |
JP2002370185A (ja) | ロボット装置、玩具装置、並びに、ロボット装置の行動制御システム及び行動制御方法、並びに、制御プログラム及び記録媒体 | |
CN108043042B (zh) | 一种节奏运动球体装置 | |
JP4411503B2 (ja) | ロボット装置及びその制御方法 | |
JP2003136439A (ja) | ロボット装置及びロボット装置の歩行制御方法並びにロボット装置の歩行制御プログラム | |
JP4524524B2 (ja) | ロボット装置及びその制御方法 | |
JP2003266350A (ja) | ロボット装置及び内部状態表出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 00800823.X Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2000925595 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1020017000437 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09743293 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 1020017000437 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2000925595 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2000925595 Country of ref document: EP |
|
WWR | Wipo information: refused in national office |
Ref document number: 1020017000437 Country of ref document: KR |