WO2006077868A1 - Robot toy - Google Patents

Robot toy

Info

Publication number
WO2006077868A1
WO2006077868A1 (PCT/JP2006/300617)
Authority
WO
WIPO (PCT)
Prior art keywords
emotion
robot toy
value
light
level
Prior art date
Application number
PCT/JP2006/300617
Other languages
French (fr)
Japanese (ja)
Inventor
Yuichi Aochi
Wakana Yamana
Eiji Takuma
Yoshiyuki Endo
Fujio Nobata
Original Assignee
Sega Toys Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Sega Toys Co., Ltd.
Publication of WO2006077868A1
Priority to US11/775,133 (published as US20070270074A1)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/004 - Artificial life, i.e. computing arrangements simulating life
    • G06N 3/008 - Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 11/00 - Self-movable toy figures
    • A63H 33/00 - Other toys
    • A63H 33/22 - Optical, colour, or shadow toys
    • A63H 2200/00 - Computerized interactive toys, e.g. dolls

Definitions

  • The present invention relates to a robot toy, and in particular to an inexpensive robot toy.
  • Japanese Patent No. 3277500 discloses arranging LEDs (Light Emitting Diodes), each having a shape and emission color corresponding to an emotion such as "joy" or "anger", on the head of a robot toy, covering them with a semi-transparent cover so that they can be seen from the outside only when lit, and driving these LEDs to blink according to the emotion of the robot toy.
  • Patent Document 1: Japanese Patent No. 3277500
  • The present invention has been made in view of such problems, and its object is to provide a robot toy that can express emotion effectively while being constructed inexpensively.
  • The robot toy of the present invention comprises a first light source capable of emitting light of at least two colors simultaneously or separately, five or more second light sources arranged around the first light source, a memory that stores at least two kinds of emotion parameters constituting an emotion, and control means that increases or decreases the values of the emotion parameters based on operation inputs from the outside and causes the first and/or second light sources to emit light in a pattern corresponding to the combination of those values.
  • According to the present invention, a variety of emotion expressions can be produced with a small number of light sources, so emotion can be expressed effectively while the toy is constructed inexpensively.
  • FIG. 1 is a perspective view showing an appearance configuration of a robot toy according to the present embodiment.
  • FIG. 2 is a block diagram showing an internal configuration of a robot toy according to the present embodiment.
  • FIG. 3 is an exploded perspective view showing a specific configuration of the head.
  • FIG. 4 is a perspective view showing the configuration of the drive mechanism.
  • FIG. 5 is a perspective view for explaining a pendulum gear.
  • FIG. 6 is a schematic plan view for describing a cam of a first cam gear.
  • Fig. 7 is a perspective view showing the structure of a lift member.
  • FIG. 8 is a perspective view showing the configuration of an ear driving member.
  • FIG. 9 is a schematic diagram for explaining the movement of the head.
  • FIG. 10 is a perspective view showing the light emission state of the LEDs as seen from the outside.
  • FIG. 11 is a plan view showing an arrangement relationship of the LEDs in the LED display unit.
  • FIG. 12 is a conceptual diagram for explaining the emotion of the robot toy.
  • FIG. 13 is a conceptual diagram for explaining the emotion of the robot toy.
  • FIG. 14 is a conceptual diagram for explaining the basic light emission pattern.
  • FIG. 15 is a flowchart showing an emotion generation expression processing procedure.
  • FIG. 16 is a flowchart showing a first emotion parameter change processing procedure.
  • FIG. 17 is a flowchart showing a second emotion parameter change processing procedure.
  • FIG. 18 is a flowchart showing a third emotion parameter change processing procedure.
  • FIG. 19 is a flowchart showing a first emotion expression processing procedure.
  • FIG. 20 is a flow chart showing a second emotion expression processing procedure.
  • FIG. 21 is a flowchart showing a third emotion expression processing procedure.
  • FIG. 22 is a schematic diagram for describing another embodiment.
  • FIG. 1 shows a robot toy 1 according to the present embodiment.
  • The robot toy 1 as a whole has an appearance resembling a sitting animal such as a dog. Specifically, the head 3 is rotatably coupled to a shaft extending vertically from the front end of the body 2, and ears 4A and 4B are movably attached to the upper left and right sides of the head 3.
  • Rounded, rod-shaped front legs 5A and 5B are movably connected to the front ends of the left and right side faces of the body 2, and protruding rear legs 6A and 6B, shorter than the front legs, are molded integrally with the body 2 at the rear ends of the left and right side faces.
  • a joystick-like tail switch 7 is movably attached in the vicinity of the upper rear end of the body 2.
  • The body 2 is formed in a rounded rectangular parallelepiped shape as a whole and, as shown in FIG. 2, houses therein a substrate 13 on which are formed the control unit 10 that governs the operation of the entire robot toy 1, the motor drive unit 11, the sound amplifier unit 12 and the like, together with devices such as the microphone 14 and the speaker 15.
  • An earphone jack is disposed in one rear leg 6B of the body 2.
  • The head 3 is formed in a rounded rectangular parallelepiped shape that is flatter than the body 2 as a whole, and houses therein devices such as a light sensor 20 and an LED display unit 21 having a plurality of LEDs, as well as a drive mechanism 34 (FIG. 3), described later, that drives the head 3 and the ears 4A and 4B using a motor 22 as its power source. A head switch 23 consisting of a push switch is disposed at the upper end of the head 3, and a nose switch 24 consisting of a push switch is disposed at the position of the nose on the head 3.
  • Since the front legs 5A and 5B are longer than the rear legs 6A and 6B, one face of the head 3 points obliquely upward with respect to the surface on which the toy is placed, and by arranging the display surface of the LED display unit 21 on this face, the display is made easy for the user to see.
  • The microphone 14 of the body 2 picks up surrounding sounds and sends the resulting audio signal S1 to the control unit 10.
  • The light sensor 20 of the head 3 detects the ambient brightness and sends the detection result to the control unit 10 as a brightness detection signal S2.
  • The tail switch 7, the head switch 23 and the nose switch 24 each detect a physical action applied by the user, such as "tilting" or "pressing", and send the detection results to the control unit 10 as operation detection signals S3A to S3C.
  • The control unit 10 is also supplied with an audio signal S4 from an external audio device via the earphone jack 16.
  • The control unit 10 has a microcomputer configuration including a CPU (Central Processing Unit) 10A, a memory 10B, an analog/digital conversion circuit 10C and the like, and recognizes the surrounding situation and the presence or absence of an action from the user based on the audio signal S1 from the microphone 14, the brightness detection signal S2 from the light sensor 20, and the operation detection signals S3A to S3C from the tail switch 7, the head switch 23 and the nose switch 24.
  • Based on this recognition result and a program stored in advance in the memory 10B, the control unit 10 sends a motor drive signal S4 to the motor drive unit 11 to drive the motor 22, thereby tilting the head 3 of the robot toy 1 and opening and closing the ears 4A and 4B.
  • The control unit 10 also, as necessary, supplies a predetermined audio signal S5 to the speaker 15 via the sound amplifier unit 12 so that sound or music based on the audio signal S5 is output from the speaker 15, and supplies a predetermined drive signal S6 to the LED display unit 21 so that the LEDs of the LED display unit 21 blink in a predetermined light emission pattern.
  • A specific configuration of the head 3 is shown in FIG. 3.
  • The head 3 of the robot toy 1 is constructed by housing, inside a housing composed of a first housing half 30 that forms the external shape of the back side of the head 3, a second housing half 31 that forms the external shape of the front side, and U-shaped first and second housing side members 32 that respectively form the left and right side faces of the head 3, the drive mechanism 34, a cover 35 for the drive mechanism 34, the LED display unit 21, a light leakage prevention member 36 and a filter cover 37, stacked in sequence in the front direction indicated by arrow a.
  • In the drive mechanism 34, as shown in FIG. 4, a worm gear (not shown) is attached to the output shaft of the motor 22, and this worm gear meshes with a gear 42 via a gear 40 and a gear 41 formed coaxially and integrally with the gear 40.
  • As shown in FIG. 5, a movable member 45 having a weight 44 attached to one end is rotatably attached to the shaft 43 to which the gear 42 is attached, and a pendulum gear 46 rotatably attached to the other end of the movable member 45 meshes with a gear 47 formed coaxially and integrally with the gear 42.
  • First and second connecting gears 48 and 49 are disposed on the left and right sides of the pendulum gear 46 so that the pendulum gear 46 can mesh with either of them when the movable member 45 rotates about the shaft 43.
  • When the motor 22 is driven to rotate in the forward direction, the rotational force is transmitted from the worm gear on the output shaft of the motor 22 to the gear 47 via the gear train up to the gear 42, and the gear 47 rotates the movable member 45 together with the pendulum gear 46 clockwise in FIG. 4, so that the pendulum gear 46 meshes with the first connecting gear 48. Conversely, when the motor 22 is driven in reverse, the gear 47 rotates the movable member 45 counterclockwise in FIG. 4, so that the pendulum gear 46 meshes with the second connecting gear 49.
  • The first connecting gear 48 meshes with a first cam gear 50, and a cam 50A of a predetermined shape is formed on the lower surface side of the first cam gear 50 as shown in FIG. 6.
  • The cam 50A is fitted into an engagement hole 51A of a lifting member 51, shown in FIG. 7, that is arranged to be movable up and down as indicated by arrow b, so that by rotating the first cam gear 50 the lifting member 51 can be raised and lowered by the cam 50A.
  • At the upper end of the lifting member 51, first and second shafts 53A and 53B are implanted so as to be positioned symmetrically with respect to a shaft 52 (FIG. 4) implanted in the first housing half 30.
  • An ear drive member 54 is, as shown in FIG. 8, formed by integrally and flexibly connecting, via a cylindrical portion 61, the root portions of tweezer-shaped first and second spring portions 60A and 60B, each of which is an elastic member.
  • The ear drive member 54 is attached to the lifting member 51 in such a way that holes 62AX and 62BX of first and second engaging portions 62A and 62B, which are disposed near the root portions of the first and second spring portions 60A and 60B, fit onto the corresponding first and second shafts 53A and 53B of the lifting member 51, and the cylindrical portion 61 fits onto the shaft 52.
  • Thus, when the lifting member 51 is raised and lowered, the first and second shafts 53A and 53B of the lifting member 51 move vertically with respect to the shaft 52, and the first and second spring portions 60A and 60B are thereby driven to open and close together with the first and second engaging portions 62A and 62B of the ear drive member 54.
  • The lower ends of the corresponding ears 4A and 4B are fitted into the first and second spring portions 60A and 60B of the ear drive member 54, and each ear 4A, 4B is pivotally supported near its lower end on the upper left and right sides of the first housing half 30, so that the ears 4A and 4B can be driven to open and close in conjunction with the opening and closing of the first and second spring portions 60A and 60B of the ear drive member 54.
  • Accordingly, when the motor 22 is driven to rotate in the forward direction, the pendulum gear 46 (FIG. 5) meshes with the first connecting gear 48 and the rotational output of the motor 22 is transmitted to the first cam gear 50, and the cam 50A of the first cam gear 50 raises and lowers the lifting member 51 based on that rotational output so as to open and close the first and second spring portions 60A and 60B of the ear drive member 54, whereby the ears 4A and 4B can be made to open and close.
  • On the other hand, the second connecting gear 49 meshes with a second cam gear 63, and a cam 63A of a predetermined shape is formed on the lower surface side of the second cam gear 63 as shown in FIG. 9(A). Below the second cam gear 63 extend two parallel arm portions 65A and 65B of a bifurcated member 65 that is fixed to a shaft body 64, which is in turn fixed to the body 2, and the cam 63A of the second cam gear 63 is fitted between the two arm portions 65A and 65B of the bifurcated member 65.
  • In the LED display unit 21, as shown in FIG. 3, seven LEDs 21A to 21G are arranged on a substrate 66 in a predetermined positional relationship. The positional relationship and the emission colors of the seven LEDs 21A to 21G will be described later.
  • the light leakage prevention member 36 is formed of, for example, a black resin material or rubber material which does not transmit light, and is attached in close contact with the substrate 66 of the LED display unit 21.
  • the pressing portion 24A and the contact portion 24B of the nose switch 24 are fixed to the lower end portion of the light leakage preventing member 36.
  • the light leakage prevention member 36 is provided with a total of seven holes 36A to 36G corresponding to the respective LEDs 21A to 21G of the LED display unit 21.
  • When the light leakage prevention member 36 is attached to the substrate 66 of the LED display unit 21, the corresponding LEDs 21A to 21G of the LED display unit 21 are exposed toward the filter cover 37 through the respective holes 36A to 36G.
  • The thickness of the light leakage prevention member 36 is set to the same size as the gap from the substrate 66 of the LED display unit 21 to the filter cover 37, whereby the light emitted from each of the LEDs 21A to 21G of the LED display unit 21 can be directed toward the filter cover 37 without mixing with the light emitted from the other LEDs 21A to 21G.
  • The filter cover 37 is formed of a translucent resin material, while the second housing half 31 is formed of a transparent resin material.
  • As a result, when the LEDs 21A to 21G of the LED display unit 21 are not lit they cannot be recognized (are invisible) from the outside, whereas when the LEDs 21A to 21G are lit, the LEDs 21A to 21G (that is, their emitted light) can be recognized from the outside.
  • By providing the filter cover 37 in this way, when the LEDs 21A to 21G are lit, the light is diffused so that it is not only the spots directly over the lit LEDs 21A to 21G that brighten in a pinpoint manner; instead, the whole of each corresponding hole 36A to 36G provided in the light leakage prevention member 36 lights up with uniform brightness.
  • The filter cover 37 is formed in white so that the light emitted from each of the LEDs 21A to 21G of the LED display unit 21, seen from the outside through the filter cover 37, is softened by the filter cover 37 and appears to glow gently in white.
  • In addition to the configuration described above, the robot toy 1 is provided with an emotion generation and expression function whereby the control unit 10 (FIG. 2) generates the current emotion of the robot toy 1 based on the content and the presence or absence of the user's actions, and expresses the type and degree of that emotion.
  • FIG. 11 shows an arrangement relationship of the LEDs 21A to 21G of the LED display unit 21 used for expressing the emotion of the robot toy 1 as described above.
  • As shown in FIG. 11, in the LED display unit 21 one LED 21A is disposed at a predetermined position on the center line of the head 3 in the width direction, and the remaining six LEDs 21B to 21G are disposed on a circle centered on the LED 21A, equidistant from the LED 21A and at equal intervals from one another.
  • In other words, the six surrounding LEDs 21B to 21G are arranged so as to lie at the vertices of a regular hexagon centered on the central LED 21A.
  • As the central LED 21A, an LED capable of emitting the three colors green, red and blue simultaneously or separately is used, and as each of the surrounding LEDs 21B to 21G, an LED capable of emitting the two colors green and red simultaneously or separately is used; the surrounding LEDs 21B to 21G can be made to emit orange light by lighting green and red at the same time.
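The hexagonal arrangement and the color capabilities just described can be pictured with a small sketch. The following Python snippet is only an illustration of the geometry: the unit radius, the coordinate convention and the data structures are assumptions of this sketch, not values taken from the patent.

```python
import math

# Colors described for the LED display unit 21: the central LED 21A can emit
# green, red and blue; the peripheral LEDs 21B-21G can emit green and red,
# and orange by lighting green and red at the same time.
CENTER_COLORS = ("green", "red", "blue")
PERIPHERAL_COLORS = ("green", "red", "orange")

def led_layout(radius=1.0):
    """Return {led_name: (x, y)} with 21A at the origin and 21B-21G on a
    regular hexagon around it (equidistant from 21A, 60 degrees apart)."""
    layout = {"21A": (0.0, 0.0)}
    for i, name in enumerate(["21B", "21C", "21D", "21E", "21F", "21G"]):
        angle = math.radians(60 * i)
        layout[name] = (radius * math.cos(angle), radius * math.sin(angle))
    return layout

if __name__ == "__main__":
    for name, (x, y) in led_layout().items():
        print(f"{name}: ({x:.3f}, {y:.3f})")
```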
  • In this robot toy 1, the emotion of the robot toy 1 is defined by two parameters (hereinafter referred to as emotion parameters) representing an "excitation level" and a "love level", respectively.
  • These emotion parameters are stored in the memory 10B and each takes a value in the range of "-8" to "8".
  • When the values of the "excitation level" and "love level" emotion parameters are both negative, the emotions of "dejection" and "loneliness" are associated with them; when the "excitation level" value is 0 or positive and the "love level" value is negative, the emotions of "anger" and "dislike" are associated; when the "excitation level" value is negative and the "love level" value is 0 or positive, the emotions of "normal" and "calm" are associated; and when both values are 0 or positive, the emotions of "joy" and "love" are associated.
  • The degree of each emotion is expressed by the magnitude of the values of the "excitation level" and "love level" emotion parameters.
  • The control unit 10 changes the emotion by increasing or decreasing the values of the "excitation level" and "love level" emotion parameters within the range of "-8" to "8" according to the user's actions, based on the operation detection signals S3A to S3C supplied from the tail switch 7, the head switch 23 and the nose switch 24, respectively.
  • For example, by pressing the nose switch 24 of the robot toy 1, the user can shift the emotion of the robot toy 1 toward "joy" and "love" or increase the degree of those emotions, and by pressing the head switch 23, the user can shift the emotion of the robot toy 1 toward "normal" and "calm" or increase the degree of those emotions.
  • Further, when the tail switch 7 is tilted, the control unit 10 increases the value of the "excitation level" emotion parameter by 1 and decreases the value of the "love level" emotion parameter by 1, each within the range of "-8" to "8"; therefore, by swinging the tail switch 7 of the robot toy 1, the user can shift the emotion of the robot toy 1 toward "anger" and "dislike" or increase the degree of those emotions.
  • Furthermore, when no user action such as "tilting" or "pressing" has been applied to the tail switch 7, the head switch 23 or the nose switch 24 for a certain period of time (for example, 30 seconds), the control unit 10 decreases the values of both the "excitation level" and "love level" emotion parameters by 1; therefore, at this time, the emotion of the robot toy 1 shifts toward "dejection" and "loneliness", or the degree of those emotions increases.
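The emotion model described in the last few paragraphs can be summarized in a short sketch: two integer parameters in the stated -8 to 8 range, a four-way mapping from their signs to the emotion groups, and a degree given by the magnitude of the values. The class and method names below are illustrative, and taking the larger of the two magnitudes as the degree is an assumption of this sketch.

```python
from dataclasses import dataclass

MIN_VALUE, MAX_VALUE = -8, 8  # stated range of each emotion parameter

@dataclass
class EmotionState:
    excitation: int = 0  # "excitation level" emotion parameter, MIN_VALUE..MAX_VALUE
    love: int = 0        # "love level" emotion parameter, MIN_VALUE..MAX_VALUE

    def emotion(self):
        """Map the sign combination of the two parameters to an emotion group."""
        if self.excitation >= 0 and self.love >= 0:
            return "joy / love"
        if self.excitation < 0 and self.love >= 0:
            return "normal / calm"
        if self.excitation >= 0 and self.love < 0:
            return "anger / dislike"
        return "dejection / loneliness"

    def degree(self):
        """Degree of the emotion, expressed by the magnitude of the values."""
        return max(abs(self.excitation), abs(self.love))

# Example: excitation -2, love 3 corresponds to a mild "normal"/"calm" state.
assert EmotionState(-2, 3).emotion() == "normal / calm"
```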
  • each of the LEDs 21A to 21G of the LED display unit 21 is made to blink in a light emission pattern according to the emotion and its degree.
  • Specifically, the control unit 10 determines the current emotion of the robot toy 1 and its degree from the values of the "excitation level" and "love level" emotion parameters, and causes the LEDs 21A to 21G of the LED display unit 21 to blink in the light emission pattern associated with that emotion and its degree.
  • Which light emission pattern the LEDs 21A to 21G of the LED display unit 21 blink in for each emotion and degree is determined in advance, and a program (hereinafter referred to as the LED drive program) defining which of the LEDs 21A to 21G is to blink at which timing for each of these light emission patterns is stored in advance in the memory 10B.
  • In accordance with this LED drive program, the control unit 10 causes the LEDs 21A to 21G of the LED display unit 21 to blink in the light emission pattern corresponding to the current emotion and its degree.
  • As the light emission patterns of the LEDs 21A to 21G in the LED display unit 21, a pattern is mainly used in which, as shown in FIG. 14, the peripheral LEDs 21B to 21G arranged around the central LED 21A are turned on and off one after another, lighting passing repeatedly from each LED 21B to 21G to the adjacent LED; this light emission pattern is hereinafter called the basic light emission pattern.
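A minimal sketch of this basic light emission pattern is shown below: the six peripheral LEDs are lit one at a time in order, so the light appears to travel around the central LED 21A. The rotation period corresponds to the 1 second and 0.5 second values used in the expression procedures later on; dividing the period equally among the six LEDs is an assumption of this sketch.

```python
PERIPHERAL_LEDS = ["21B", "21C", "21D", "21E", "21F", "21G"]

def basic_light_emission_pattern(rotation_period):
    """Yield (led_name, on_time_seconds) pairs for one rotation of the
    peripheral LEDs 21B-21G around the central LED 21A."""
    step = rotation_period / len(PERIPHERAL_LEDS)  # assumed equal time per LED
    for name in PERIPHERAL_LEDS:
        yield name, step

# Example: one rotation per second lights each peripheral LED for about 0.167 s.
if __name__ == "__main__":
    for name, on_time in basic_light_emission_pattern(1.0):
        print(f"light {name} for {on_time:.3f} s")
```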
  • This basic light emission pattern is used for expressing the emotions other than "dejection" and "loneliness". For "joy" and "love" (the "excitation level" and "love level" emotion parameter values are both 0 or positive), the corresponding LEDs 21B to 21G are driven to emit orange light; for "normal" and "calm" (the "excitation level" value is negative and the "love level" value is 0 or positive), the corresponding LEDs 21B to 21G are driven to emit green light; and for "anger" and "dislike" (the "excitation level" value is 0 or positive and the "love level" value is negative), the corresponding LEDs 21B to 21G are driven to emit red light.
  • For "dejection" and "loneliness" (both emotion parameter values are negative), the surrounding LEDs 21B to 21G do not emit light and only the central LED 21A is driven to blink blue.
  • The light emission patterns of the LEDs 21A to 21G in the LED display unit 21 are further defined such that, for the emotions of "joy" and "love", "normal" and "calm", and "anger" and "dislike", the blinking cycle of the surrounding LEDs 21B to 21G becomes shorter so that the light appears to rotate faster as the degree of the emotion becomes greater, and, for the emotions of "dejection" and "loneliness", the blinking cycle of the central LED 21A becomes shorter as the degree of the emotion becomes greater.
  • Thus, the current emotion of the robot toy 1 can be recognized visually from the outside based on the emission colors of the LEDs 21A to 21G in the LED display unit 21, and the degree of that emotion can be recognized visually based on the blinking speed of the LEDs 21A to 21G.
  • The control unit 10 executes the emotion generation processing of the robot toy 1 described above and the processing for expressing the generated emotion (the control processing of the LED display unit 21 based on the above-mentioned LED drive program) in accordance with the emotion generation and expression processing procedure RT1 shown in FIG. 15.
  • That is, the control unit 10 starts this emotion generation and expression processing procedure RT1 at step SP0 and, in the following step SP1, sets the values of the "love level" and "excitation level" emotion parameters of the robot toy 1 to the initial value "0".
  • The control unit 10 thereafter proceeds to step SP3 and determines whether the head switch 23 (FIG. 2) or the nose switch 24 (FIG. 2) has been pressed or the tail switch 7 (FIG. 2) has been tilted.
  • If the control unit 10 obtains a negative result in this step SP3, it proceeds to step SP4 and reads the count value of an internal timer (not shown), and in the following step SP5 it determines, based on the count value read in step SP4, whether a predetermined time (for example, 30 seconds) has elapsed since the values of the "love level" and "excitation level" emotion parameters were set to their initial values or since the head switch 23, the nose switch 24 or the tail switch 7 was last pressed or tilted.
  • If the control unit 10 obtains a negative result in this step SP5, it returns to step SP3 and thereafter repeats the loop of steps SP3-SP4-SP5-SP3 until it obtains a positive result in step SP3 or step SP5.
  • When the control unit 10 eventually obtains an affirmative result in step SP3 because the user has pressed the head switch 23 of the robot toy 1, it proceeds to step SP6 and changes the values of the "love level" and "excitation level" emotion parameters in accordance with the first emotion parameter change processing procedure RT2 shown in FIG. 16.
  • In practice, in step SP6 the control unit 10 starts this first emotion parameter change processing procedure RT2 at step SP20 and, in the following step SP21, determines whether the value of either the "love level" or the "excitation level" emotion parameter is the maximum value ("8" in this embodiment).
  • If the control unit 10 obtains a negative result in this step SP21, it proceeds to step SP22 and increments the values of the "love level" and "excitation level" emotion parameters by 1 each. The control unit 10 then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
  • On the other hand, when the control unit 10 determines in step SP21 that the value of the "love level" emotion parameter is the maximum value, it proceeds to step SP23 and increases only the value of the "excitation level" emotion parameter by 1, then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
  • Likewise, when the control unit 10 determines in step SP21 that the value of the "excitation level" emotion parameter is the maximum value, it proceeds to step SP24 and increases only the value of the "love level" emotion parameter by 1, then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
  • When the control unit 10 obtains an affirmative result in step SP3 of the emotion generation and expression processing procedure RT1 because the user has pressed the nose switch 24 of the robot toy 1, it proceeds to step SP7 and changes the values of the "love level" and "excitation level" emotion parameters in accordance with the second emotion parameter change processing procedure RT3 shown in FIG. 17.
  • In practice, in step SP7 the control unit 10 starts this second emotion parameter change processing procedure RT3 at step SP30 and, in the following step SP31, determines whether the value of the "love level" emotion parameter is the maximum value or the value of the "excitation level" emotion parameter is the minimum value ("-8" in this embodiment).
  • If the control unit 10 obtains a negative result in this step SP31, it proceeds to step SP32 and increases the value of the "love level" emotion parameter by 1 while decreasing the value of the "excitation level" emotion parameter by 1. The control unit 10 then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
  • On the other hand, when the control unit 10 determines in step SP31 that the value of the "love level" emotion parameter is the maximum value, it proceeds to step SP33 and decreases only the value of the "excitation level" emotion parameter by 1, then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
  • Likewise, when the control unit 10 determines in step SP31 that the value of the "excitation level" emotion parameter is the minimum value, it proceeds to step SP34 and increases only the value of the "love level" emotion parameter by 1, then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
  • Further, when the control unit 10 obtains an affirmative result in step SP3 of the emotion generation and expression processing procedure RT1 because the user has tilted the tail switch 7 of the robot toy 1, it proceeds to step SP8 and changes the values of the "love level" and "excitation level" emotion parameters in accordance with the third emotion parameter change processing procedure RT4 shown in FIG. 18.
  • In practice, in step SP8 the control unit 10 starts this third emotion parameter change processing procedure RT4 at step SP40 and, in the following step SP41, determines whether the value of the "love level" emotion parameter is the minimum value ("-8" in this embodiment) or the value of the "excitation level" emotion parameter is the maximum value.
  • If the control unit 10 obtains a negative result in this step SP41, it proceeds to step SP42 and decreases the value of the "love level" emotion parameter by 1 while increasing the value of the "excitation level" emotion parameter by 1. The control unit 10 then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
  • On the other hand, when the control unit 10 determines in step SP41 that the value of the "love level" emotion parameter is the minimum value, it proceeds to step SP43 and increases only the value of the "excitation level" emotion parameter by 1, then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
  • Likewise, when the control unit 10 determines in step SP41 that the value of the "excitation level" emotion parameter is the maximum value, it proceeds to step SP44 and decreases only the value of the "love level" emotion parameter by 1, then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4 and moves on to step SP13 of the emotion generation and expression processing procedure RT1.
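Taken together with the idle-timeout decrease described next (steps SP9 to SP12), the procedures RT2 to RT4 amount to a single update rule: each input nudges the two emotion parameters by +1 or -1, and a parameter that has already reached the limit it would cross is left unchanged while the other one still moves. The following sketch expresses that rule in Python; the event names and the function signature are assumptions, but the deltas and the boundary behavior follow the flowchart descriptions above.

```python
MIN_VALUE, MAX_VALUE = -8, 8  # same limits as for the emotion parameters above

# (excitation delta, love delta) for each input, per RT2, RT3 and RT4.
SWITCH_DELTAS = {
    "head_switch":  (+1, +1),   # RT2: head switch 23 pressed
    "nose_switch":  (-1, +1),   # RT3: nose switch 24 pressed
    "tail_switch":  (+1, -1),   # RT4: tail switch 7 tilted
    "idle_timeout": (-1, -1),   # no action for the predetermined time (e.g. 30 s)
}

def apply_input(excitation, love, event):
    """Return the updated (excitation, love) pair for one input event.

    If a parameter already sits at the limit it would cross, only the other
    parameter changes, mirroring steps SP23/SP24, SP33/SP34 and SP43/SP44.
    """
    d_exc, d_love = SWITCH_DELTAS[event]
    exc_blocked = (d_exc > 0 and excitation >= MAX_VALUE) or (d_exc < 0 and excitation <= MIN_VALUE)
    love_blocked = (d_love > 0 and love >= MAX_VALUE) or (d_love < 0 and love <= MIN_VALUE)
    if not exc_blocked:
        excitation += d_exc
    if not love_blocked:
        love += d_love
    return excitation, love

# Example: pressing the head switch at (8, 3) leaves "excitation level" at its
# maximum and only raises "love level", as in step SP24.
assert apply_input(8, 3, "head_switch") == (8, 4)
```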
  • On the other hand, when the control unit 10 obtains a positive result in step SP5 of the emotion generation and expression processing procedure RT1 because the predetermined time has elapsed since the values of the "love level" and "excitation level" emotion parameters were set to their initial values or since the head switch 23, the nose switch 24 or the tail switch 7 was last pressed or tilted, it proceeds to step SP9 and determines whether the value of either the "love level" or the "excitation level" emotion parameter is the minimum value.
  • If the control unit 10 obtains a negative result in this step SP9, it proceeds to step SP10, decreases the values of the "love level" and "excitation level" emotion parameters by 1 each, and then proceeds to step SP13.
  • On the other hand, when the control unit 10 determines in step SP9 that the value of the "love level" emotion parameter is the minimum value, it proceeds to step SP11, decreases only the value of the "excitation level" emotion parameter by 1, and then proceeds to step SP13.
  • Likewise, when the control unit 10 determines in step SP9 that the value of the "excitation level" emotion parameter is the minimum value, it proceeds to step SP12, decreases only the value of the "love level" emotion parameter by 1, and then proceeds to step SP13.
  • Then, based on the values of the "excitation level" and "love level" emotion parameters updated in steps SP3 to SP12 according to the presence or absence and the content of the user's actions as described above, the control unit 10 expresses the current emotion of the robot toy 1 and its degree by means of the light emission patterns of the LEDs 21A to 21G in the LED display unit 21 in steps SP13 to SP19.
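Before the individual expression procedures RT5 to RT7 are described, the overall flow of RT1 sketched in the steps above can be pictured as a simple polling loop: initialize both parameters to 0, wait for a switch operation or for the idle timeout, update the parameters, and then express the result. The sketch below reuses apply_input from the previous example; the polling interval, the callback signatures and the timer handling are assumptions made only for illustration.

```python
import time

IDLE_TIMEOUT = 30.0  # the "predetermined time" of, for example, 30 seconds

def emotion_generation_loop(poll_switches, express, poll_interval=0.1):
    """Sketch of the emotion generation and expression processing procedure RT1.

    poll_switches: callable returning "head_switch", "nose_switch",
                   "tail_switch" or None when nothing was operated.
    express:       callable taking (excitation, love) and driving the LED
                   display unit accordingly (steps SP13 to SP19).
    """
    excitation, love = 0, 0                   # step SP1: initial values
    last_event_time = time.monotonic()
    while True:                               # loop of steps SP3-SP4-SP5
        event = poll_switches()               # step SP3
        if event is not None:                 # steps SP6 to SP8
            excitation, love = apply_input(excitation, love, event)
            last_event_time = time.monotonic()
            express(excitation, love)         # steps SP13 to SP19
        elif time.monotonic() - last_event_time >= IDLE_TIMEOUT:
            # steps SP9 to SP12: no operation for the predetermined time
            excitation, love = apply_input(excitation, love, "idle_timeout")
            last_event_time = time.monotonic()
            express(excitation, love)         # steps SP13 to SP19
        time.sleep(poll_interval)
```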
  • In practice, in step SP13 the control unit 10 reads the changed values of the "excitation level" and "love level" emotion parameters from the memory 10B (FIG. 2) and judges these values. When the "excitation level" and "love level" emotion parameters are both 0 or positive, the control unit 10 proceeds to step SP14 and, in accordance with the first emotion expression processing procedure RT5 shown in FIG. 19, expresses the current emotion of the robot toy 1 ("joy" and "love") and its degree as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP14 the control unit 10 starts this first emotion expression processing procedure RT5 at step SP50 and, in the following step SP51, determines the values of the "excitation level" and "love level" emotion parameters. When the values of the "excitation level" and "love level" emotion parameters are both within the range of 0 to 4, the control unit 10 proceeds to step SP52 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in orange so that the orange light makes one rotation per second; otherwise it proceeds to step SP53 and blinks the surrounding LEDs 21B to 21G sequentially in orange so that the orange light makes one rotation in 0.5 seconds.
  • When the control unit 10 finishes the processing of step SP52 or step SP53, it proceeds to step SP54 to end this first emotion expression processing procedure RT5, and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • On the other hand, when the control unit 10 determines in step SP13 of the emotion generation and expression processing procedure RT1 that the value of the "excitation level" emotion parameter is 0 or positive and the value of the "love level" emotion parameter is negative, it proceeds to step SP15 and, in accordance with the second emotion expression processing procedure RT6 shown in FIG. 20, expresses the current emotion of the robot toy 1 ("anger" and "dislike") and its degree as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP15 the control unit 10 starts this second emotion expression processing procedure RT6 at step SP60 and, in the following step SP61, determines the values of the "excitation level" and "love level" emotion parameters. When the value of the "excitation level" emotion parameter is within the range of 0 to 4 and the value of the "love level" emotion parameter is within the range of -4 to -1, the control unit 10 proceeds to step SP62 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in red so that the light makes one rotation per second; otherwise it proceeds to step SP63 and blinks the surrounding LEDs 21B to 21G sequentially in red so that the light makes one rotation in 0.5 seconds.
  • When the control unit 10 finishes the processing of step SP62 or step SP63, it proceeds to step SP64 to end this second emotion expression processing procedure RT6, and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • Further, when the control unit 10 determines in step SP13 that the value of the "excitation level" emotion parameter is negative and the value of the "love level" emotion parameter is 0 or positive, it proceeds to step SP16 and, in accordance with the third emotion expression processing procedure RT7 shown in FIG. 21, expresses the current emotion of the robot toy 1 ("normal" and "calm") and its degree as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP16 the control unit 10 starts this third emotion expression processing procedure RT7 at step SP70 and, in the following step SP71, determines the values of the "excitation level" and "love level" emotion parameters. When the value of the "love level" emotion parameter is within the range of 1 to 4 and the value of the "excitation level" emotion parameter is within the range of -4 to -1, the control unit 10 proceeds to step SP72 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in green so that the green light makes one rotation per second; otherwise it proceeds to step SP73 and blinks the surrounding LEDs 21B to 21G sequentially in green so that the green light makes one rotation in 0.5 seconds.
  • When the control unit 10 finishes the processing of step SP72 or step SP73, it proceeds to step SP74 to end this third emotion expression processing procedure RT7, and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • Further, when the control unit 10 determines in step SP13 that the values of the "excitation level" and "love level" emotion parameters are both negative, it proceeds to step SP17 and determines the values of the "excitation level" and "love level" emotion parameters. When these values indicate a small degree of emotion, the control unit 10 proceeds to step SP18 and blinks only the central LED 21A in the LED display unit 21 in blue once per second; otherwise it proceeds to step SP19 and blinks only the central LED 21A once every 0.5 seconds.
  • In this way, the control unit 10 blinks the LEDs 21A to 21G of the LED display unit 21 in a light emission pattern according to the current emotion of the robot toy 1 and its degree.
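The selection made in steps SP13 to SP19 (procedures RT5 to RT7 and the "dejection"/"loneliness" case) can be condensed into a small decision function that returns a pattern, a color and a blink period instead of driving hardware. In the sketch below, the 0-to-4 magnitude threshold is taken from RT5 and assumed to apply to the other branches as well, and red is used for "anger"/"dislike" in line with the color assignment described earlier; the function name and the returned tuple format are assumptions of this sketch.

```python
def select_expression(excitation, love):
    """Return (pattern, color, period_s) for the given emotion parameter values.

    pattern is "rotate" (peripheral LEDs 21B-21G lit in sequence) or
    "center_blink" (only the central LED 21A blinks).
    """
    small_degree = -4 <= excitation <= 4 and -4 <= love <= 4  # assumed shared threshold
    period = 1.0 if small_degree else 0.5  # stronger emotion means faster blinking
    if excitation >= 0 and love >= 0:        # RT5: "joy" / "love"
        return ("rotate", "orange", period)
    if excitation >= 0 and love < 0:         # RT6: "anger" / "dislike"
        return ("rotate", "red", period)
    if excitation < 0 and love >= 0:         # RT7: "normal" / "calm"
        return ("rotate", "green", period)
    return ("center_blink", "blue", period)  # "dejection" / "loneliness"

# Example: (6, 7) expresses strong "joy"/"love" as a fast orange rotation.
assert select_expression(6, 7) == ("rotate", "orange", 0.5)
```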
  • In the above configuration, the robot toy 1 expresses the emotions of "joy" and "love", the emotions of "normal" and "calm", and the emotions of "anger" and "dislike" by blinking the surrounding LEDs 21B to 21G among the LEDs 21A to 21G in the LED display unit 21 one after another in the clockwise direction, and expresses the emotions of "dejection" and "loneliness" by blinking only the central LED 21A in blue.
  • Accordingly, the emotion of the robot toy 1 can be expressed with a greater variety of light emission patterns than in the case where LEDs of fixed shapes are simply blinked, and since the LED display unit 21 can be built with far fewer LEDs than when a large number of LEDs are arranged in a matrix, emotion can be expressed effectively while the robot toy 1 is constructed inexpensively.
  • In the embodiment described above, the case has been described in which an LED capable of emitting the three colors green, red and blue simultaneously or separately is used as the central LED 21A (first light source) of the LED display unit 21 and LEDs capable of emitting the two colors green and red simultaneously or separately are used as the surrounding LEDs 21B to 21G (second light sources); however, the present invention is not limited to this, and the types and numbers of emission colors of the central LED 21A and of the surrounding LEDs 21B to 21G may be other than these.
  • Further, in the embodiment described above, the remaining six LEDs 21B to 21G are arranged so as to be equidistant from the central LED 21A and equally spaced from one another; however, other arrangements of the LEDs 21A to 21G can also be applied.
  • For example, as shown in FIGS. 22(A-1) and (A-2), three LEDs 70B to 70D may be arranged around a central LED 70A; as shown in FIGS. 22(B-1) and (B-2), four LEDs 71B to 71E may be arranged around a central LED 71A; as shown in FIGS. 22(C-1) and (C-2), five LEDs 72B to 72F may be arranged around a central LED 72A; or eight LEDs 73B to 73I, or an even larger number of LEDs, may be arranged around a central LED 73A. In these cases as well, the same effects as in the embodiment described above can be obtained.
  • Further, in the embodiment described above, the case has been described in which two types of emotion parameters, "excitation level" and "love level", are used; however, the number and types of emotion parameters may be other than these.
  • In that case, for example, the same number of emotion types as there are emission colors of the LEDs 21A to 21G may be prepared, and the LEDs 21A to 21G may be blinked in light emission patterns in which the emotions are associated with the emission colors of the LEDs 21A to 21G.
  • Further, in the embodiment described above, light emission patterns have been described in which only the central LED 21A of the LED display unit 21 blinks, or only the peripheral LEDs 21B to 21G blink one by one clockwise around the central LED 21A; however, other light emission patterns may be used for blinking the LEDs 21A to 21G.
  • For example, the peripheral LEDs 21B to 21G may be blinked one by one counterclockwise around the central LED 21A, or several of the peripheral LEDs 21B to 21G may be lit at the same time and turned on and off in sequence toward the adjacent LEDs 21B to 21G, clockwise or counterclockwise around the central LED 21A.
  • the “excitation level” and the “love level” are provided to facilitate understanding.
  • the force of changing the value of each emotional parameter in the range of 8 to 8 Various ranges beyond this are applicable.
  • the contents of the emotion generation and expression processing procedure RT1 can be appropriately changed. For example, in an actual computer process, it is processed as a number in the range of 8 to 8 and a range of numbers ⁇ to 15 (0 to F in hexadecimal notation). Emotion parameters such as those described above for step SP13 are positive or negative and are not deviated!
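As a concrete illustration of that remark, a signed emotion parameter can be stored as an unsigned 4-bit value with a fixed offset, after which the "positive or negative" test of step SP13 becomes a comparison against the offset. The helper below is only an assumed encoding; note that a +8 offset maps -8..7 exactly onto 0..15, so the upper end of the -8..8 range used in the embodiment is clamped here.

```python
OFFSET = 8  # assumed offset for storing signed values in a 0x0-0xF nibble

def encode(value):
    """Store a signed emotion parameter (-8..8) as an unsigned nibble (0..15)."""
    return max(0, min(15, value + OFFSET))

def decode(raw):
    """Recover the signed value from the stored nibble."""
    return raw - OFFSET

def is_negative(raw):
    """Step SP13 style test performed directly on the encoded value."""
    return raw < OFFSET

# Example: -3 is stored as 5 (0x5), and 5 < 8 marks it as a negative parameter.
assert encode(-3) == 5 and decode(5) == -3 and is_negative(encode(-3))
```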
  • Similarly, in step SP14, or in steps SP15 to SP17, the range in which the value of each emotion parameter falls (for example, 0 to 3, or outside it) may be determined, and the LEDs 21A to 21G may be made to emit light with the color and the light emission pattern associated with that range. The point is that various methods can be applied as long as the LEDs 21A to 21G are made to emit light in a light emission pattern corresponding to the combination of the emotion parameter values.

Industrial applicability
  • the present invention can be widely applied to various robot toys.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Robotics (AREA)
  • Toys (AREA)
  • Manipulator (AREA)

Abstract

A robot toy comprises a first light source emitting light of at least two colors simultaneously or individually, five or more second light sources arranged around the first light source, a memory for storing at least two types of emotion parameters constituting emotion, and a control means for increasing/decreasing the value of emotion parameters based on an external operation input. The control means controls the first and second light sources to emit light in an emission pattern corresponding to a combination of the values of emotion parameters, thus realizing an inexpensive robot toy capable of effectively representing emotion.

Description

Specification

Robot toy

Technical field

[0001] The present invention relates to a robot toy, and in particular to an inexpensive robot toy.

Background art

[0002] In recent years, robot toys have been proposed and commercialized that generate their own emotion based on actions from the user, such as "hitting" or "stroking", and on the surrounding environment, and that express the generated emotion by light emission patterns of light emitting elements or by sounds such as music (see, for example, Patent Document 1).

[0003] For example, Japanese Patent No. 3277500 discloses arranging LEDs (Light Emitting Diodes) for expressing emotions such as "joy" and "anger", each having a shape and emission color corresponding to the emotion, on the head of a robot toy, covering them with a semi-transparent cover so that they can be seen from the outside only when lit, and driving these LEDs to blink according to the emotion of the robot toy.

[0004] This publication also discloses expressing the emotion of a robot toy by arranging a large number of light emitting elements in a matrix, in place of LEDs having shapes and emission colors corresponding to emotions such as "joy" and "anger", and selectively driving these light emitting elements to blink.

Patent Document 1: Japanese Patent No. 3277500
Disclosure of the invention

Problems to be solved by the invention

[0005] When the emotion of a robot toy is expressed by the light emission pattern of light emitting elements, adopting the approach described above of arranging a large number of light emitting elements in a matrix and selectively driving them to blink according to the emotion makes it possible to express emotion with a greater variety of light emission patterns than preparing individual LEDs with shapes and emission colors corresponding to each of the various emotions such as "joy" and "anger".

[0006] However, in an inexpensive robot toy it is difficult, in terms of cost, to arrange light emitting elements in a matrix, and some ingenuity is required so that the emotion of the robot toy can be expressed effectively by the light emission pattern of the light emitting elements within a limited cost. If the emotion of the robot toy can be expressed effectively by the light emission pattern of the light emitting elements in this way, it is considered that the user's interest in the robot toy can be heightened and its commercial value as a "toy" can be improved.

[0007] The present invention has been made in view of these problems, and its object is to provide a robot toy that can express emotion effectively while being constructed inexpensively.

Means for solving the problems

[0008] In order to solve the above problems, the robot toy of the present invention comprises a first light source capable of emitting light of at least two colors simultaneously or separately, five or more second light sources arranged around the first light source, a memory that stores at least two kinds of emotion parameters constituting an emotion, and control means that increases or decreases the values of the emotion parameters based on operation inputs from the outside, and the control means causes the first and/or second light sources to emit light in a light emission pattern corresponding to the combination of the values of the emotion parameters.

Effects of the invention

[0009] According to the present invention, a variety of emotion expressions can be produced with a small number of light sources, and thus emotion can be expressed effectively while the toy is constructed inexpensively.
Brief description of the drawings

[0010]
[FIG. 1] A perspective view showing the external configuration of the robot toy according to the present embodiment.
[FIG. 2] A block diagram showing the internal configuration of the robot toy according to the present embodiment.
[FIG. 3] An exploded perspective view showing the specific configuration of the head.
[FIG. 4] A perspective view showing the configuration of the drive mechanism.
[FIG. 5] A perspective view used to explain the pendulum gear.
[FIG. 6] A schematic plan view used to explain the cam of the first cam gear.
[FIG. 7] A perspective view showing the configuration of the lifting member.
[FIG. 8] A perspective view showing the configuration of the ear drive member.
[FIG. 9] A schematic diagram used to explain the movement of the head.
[FIG. 10] A perspective view showing the light emission state of the LEDs as seen from the outside.
[FIG. 11] A plan view showing the arrangement of the LEDs in the LED display unit.
[FIG. 12] A conceptual diagram used to explain the emotion of the robot toy.
[FIG. 13] A conceptual diagram used to explain the emotion of the robot toy.
[FIG. 14] A conceptual diagram used to explain the basic light emission pattern.
[FIG. 15] A flowchart showing the emotion generation and expression processing procedure.
[FIG. 16] A flowchart showing the first emotion parameter change processing procedure.
[FIG. 17] A flowchart showing the second emotion parameter change processing procedure.
[FIG. 18] A flowchart showing the third emotion parameter change processing procedure.
[FIG. 19] A flowchart showing the first emotion expression processing procedure.
[FIG. 20] A flowchart showing the second emotion expression processing procedure.
[FIG. 21] A flowchart showing the third emotion expression processing procedure.
[FIG. 22] A schematic diagram used to explain other embodiments.
Explanation of reference numerals

[0011] 1: robot toy; 3: head; 7: tail switch; 10: control unit; 21: LED display unit; 21A to 21G, 70A to 70D, 71A to 71E, 72A to 72F, 73A to 73G: LEDs; 23: head switch; 24: nose switch.
Best mode for carrying out the invention

[0012] The present embodiment will be described below with reference to the drawings.

[0013] (1) Configuration of the robot toy

FIG. 1 shows a robot toy 1 according to the present embodiment. The robot toy 1 as a whole has an appearance resembling a sitting animal such as a dog. Specifically, in the robot toy 1 the head is rotatably coupled to a shaft extending vertically from the front end of a body 2, and ears 4A and 4B are movably attached to the upper left and right sides of the head 3. Rounded, rod-shaped front legs 5A and 5B are connected to the front ends of the left and right side faces of the body 2 so that they can be moved by hand, and protruding rear legs 6A and 6B, shorter than the front legs, are molded integrally with the body 2 at the rear ends of the left and right side faces. Furthermore, a joystick-like tail switch 7 is movably attached near the rear end of the upper surface of the body 2.

[0014] The body 2 is formed in a rounded rectangular parallelepiped shape as a whole and, as shown in FIG. 2, houses therein a substrate 13 on which a control unit 10 that governs the operation control of the entire robot toy 1, a motor drive unit 11, a sound amplifier unit 12 and the like are formed, together with devices such as a microphone 14 and a speaker 15. An earphone jack is disposed in one rear leg 6B of the body 2.
[0015] The head 3 is formed in a rounded rectangular parallelepiped shape that is flatter than the body 2 as a whole, and houses therein devices such as a light sensor 20 and an LED display unit 21 having a plurality of LEDs, as well as a drive mechanism 34 (FIG. 3), described later, that drives the head 3 and the ears 4A and 4B using a motor 22 as its power source. A head switch 23 consisting of a push switch is disposed at the upper end of the head 3, and a nose switch 24 consisting of a push switch is disposed at the position of the nose on the head 3. In the robot toy 1 as a whole, since the front legs 5A and 5B are longer than the rear legs 6A and 6B, one face of the head 3 points obliquely upward with respect to the surface on which the toy is placed, and by arranging the display surface of the LED display unit 21 on this face, the display is made easy for the user to see.
[0016] The microphone 14 of the body 2 picks up surrounding sounds and sends the resulting audio signal S1 to the control unit 10. The light sensor 20 of the head 3 detects the ambient brightness and sends the detection result to the control unit 10 as a brightness detection signal S2. Further, the tail switch 7, the head switch 23 and the nose switch 24 each detect a physical action applied by the user, such as "tilting" or "pressing", and send the detection results to the control unit 10 as operation detection signals S3A to S3C. The control unit 10 is also supplied with an audio signal S4 from an external audio device via an earphone jack 16.
[0017] The control unit 10 is configured as a microcomputer including a CPU (Central Processing Unit) 10A, a memory 10B and an analog/digital conversion circuit 10C, and recognizes the surrounding situation and the presence or absence of actions from the user based on the audio signal S1 from the microphone 14, the brightness detection signal S2 from the light sensor 20, and the operation detection signals S3A to S3C from the tail switch 7, the head switch 23 and the nose switch 24.
[0018] Based on this recognition result and a program stored in advance in the memory 10B, the control unit 10 sends a motor drive signal S4 to the motor drive unit 11 to drive the motor 22, thereby making the robot toy 1 tilt its head 3 or open and close its ears 4A and 4B. The control unit 10 also gives a predetermined audio signal S5 to the speaker 15 via the sound amplifier unit 12 as necessary, so that sound or music based on the audio signal S5 is output from the speaker 15, and gives a predetermined drive signal S6 to the LED display unit 21, so that the LEDs of the LED display unit 21 are driven to blink in a predetermined light emission pattern.
[0019] FIG. 3 shows the specific configuration of the head 3. As is apparent from FIG. 3, the head 3 of the robot toy 1 comprises a housing made up of a first housing half 30 forming the external shape of the rear side of the head 3, a second housing half 31 forming the external shape of the front side, and U-shaped first and second housing side members 32 forming the left and right side surfaces of the head 3. Inside this housing, the drive mechanism 34, a cover 35 for the drive mechanism 34, the LED display unit 21, a light leakage prevention member 36 and a filter cover 37 are housed so as to be stacked in this order in the front direction indicated by arrow a.
[0020] In the drive mechanism 34, as shown in FIG. 4, a worm gear (not shown) is attached to the output shaft of the motor 22 and meshes with a gear 42 via a gear 40 and a gear 41 formed coaxially and integrally with the gear 40. As shown in FIG. 5, a movable member 45 with a weight 44 attached to one end is rotatably mounted on the shaft 43 to which the gear 42 is attached, and a pendulum gear 46 rotatably mounted on the other end of the movable member 45 meshes with a gear 47 formed coaxially and integrally with the gear 42. First and second connecting gears 48 and 49 are arranged on the left and right sides of the pendulum gear 46 so that the pendulum gear 46 can mesh with either of them when the movable member 45 rotates about the shaft 43.
[0021] Thus, in the drive mechanism 34, when the motor 22 is driven in the forward direction, its torque is transmitted from the worm gear attached to the motor output shaft through the gear train up to the gear 42 and on to the gear 47; based on this torque, the gear 47 rotates the movable member 45, together with the pendulum gear 46, clockwise in FIG. 4, so that the pendulum gear 46 meshes with the first connecting gear 48. Conversely, when the motor 22 is driven in reverse, the gear 47 rotates the movable member 45 counterclockwise in FIG. 4, so that the pendulum gear 46 meshes with the second connecting gear 49.
[0022] In this case, the first connecting gear 48 meshes with a first cam gear 50, on whose lower surface a cam 50A of a predetermined shape is formed as shown in FIG. 6. This cam 50A is fitted into an engagement hole 51A of a lifting member 51, shown in FIG. 7, which is arranged so as to move up and down in the direction of arrow b in FIG. 4; accordingly, by rotating the first cam gear 50, the lifting member 51 can be raised and lowered by the cam 50A.
[0023] Furthermore, as is also apparent from FIG. 7, first and second shafts 53A and 53B are implanted in the upper end of the lifting member 51 so as to lie at symmetrical positions with respect to a shaft 52 (FIG. 4) implanted in the first housing half 30, and an ear drive member 54 is attached to these first and second shafts 53A and 53B as shown in FIG. 4.
[0024] As shown in FIG. 8, the ear drive member 54 is formed by connecting the root portions of tweezers-shaped first and second spring portions 60A and 60B, each made of an elastic material, integrally and flexibly via a cylindrical portion 61. The ear drive member 54 is attached to the lifting member 51 by fitting the holes 62AX and 62BX of first and second engaging portions 62A and 62B, provided near the roots of the first and second spring portions 60A and 60B, onto the corresponding first and second shafts 53A and 53B of the lifting member 51, and by fitting the cylindrical portion 61 onto the shaft 52.
[0025] Thus, in the drive mechanism 34, when the lifting member 51 is driven up and down, its first and second shafts 53A and 53B move vertically with respect to the shaft 52, so that the first and second spring portions 60A and 60B are driven to open and close together with the first and second engaging portions 62A and 62B of the ear drive member 54.
[0026] The lower ends of the corresponding ears 4A and 4B are fitted into the first and second spring portions 60A and 60B of the ear drive member 54, and each of the ears 4A and 4B is pivotally supported near its lower end on the upper left or right side of the first housing half 30, so that the ears 4A and 4B can be driven to open and close in conjunction with the opening and closing of the first and second spring portions 60A and 60B of the ear drive member 54.
[0027] Accordingly, in the drive mechanism 34, when the motor 22 is driven in the forward direction, the pendulum gear 46 (FIG. 5) meshes with the first connecting gear 48 and transmits the rotational output of the motor 22 to the first cam gear 50; based on this rotational output, the cam 50A of the first cam gear 50 raises and lowers the lifting member 51 to open and close the first and second spring portions 60A and 60B of the ear drive member 54, so that the ears 4A and 4B are made to open and close.
[0028] By contrast, the second connecting gear 49 meshes with a second cam gear 63, on whose lower surface a cam 63A of a predetermined shape is formed as shown in FIG. 9(A). Below the second cam gear 63 extend two parallel arms 65A and 65B of a forked member 65 that is fixed to a shaft 64 fixed to the body 2, and the cam 63A of the second cam gear 63 is fitted between these two arms 65A and 65B.
[0029] Accordingly, in the drive mechanism 34, when the motor 22 is driven in reverse, the pendulum gear 46 (FIG. 5) meshes with the second connecting gear 49 and transmits the rotational output of the motor 22 to the second cam gear 63; based on this rotational output, the cam 63A of the second cam gear 63 pushes the arms 65A and 65B of the forked member 65 as shown in FIGS. 9(B) and 9(C), and by the resulting reaction the head 3 is made to swing left and right about the shaft 64. Note that the forward and reverse rotation of the motor 22 need only be rotations in mutually opposite directions; forward and reverse rotation are not limited to being clockwise or counterclockwise respectively.
[0030] Meanwhile, the LED display unit 21 is constituted by arranging seven LEDs 21A to 21G in a predetermined positional relationship on a board 66, as shown in FIG. 3. The positional relationship of these seven LEDs 21A to 21G and their respective emission colors will be described later.
[0031] The light leakage prevention member 36 is formed of a material that does not transmit light, for example a black resin material or rubber material, and is attached in close contact with the board 66 of the LED display unit 21. The pressing portion 24A and the contact portion 24B of the nose switch 24 are fixed to the lower end of this light leakage prevention member 36.
[0032] The light leakage prevention member 36 is provided with a total of seven holes 36A to 36G corresponding to the respective LEDs 21A to 21G of the LED display unit 21, so that when the light leakage prevention member 36 is attached to the board 66 of the LED display unit 21, the corresponding LEDs 21A to 21G of the LED display unit 21 are exposed toward the filter cover 37 through the respective holes 36A to 36G.
[0033] The thickness of the light leakage prevention member 36 is set equal to the gap between the board 66 of the LED display unit 21 and the filter cover 37, so that the light emitted from each of the LEDs 21A to 21G of the LED display unit 21 can exit toward the filter cover 37 without mixing with the light emitted from the other LEDs 21A to 21G.
[0034] Furthermore, the filter cover 37 is formed of a translucent resin material and the second housing half 31 is formed of a transparent resin material. As a result, when the LEDs 21A to 21G of the LED display unit 21 are off, they cannot be recognized (seen) from outside, and when the LEDs 21A to 21G are lit, they can be recognized from outside (the light emitted from them can be seen), as shown in FIG. 10. Providing the filter cover 37 in this way also diffuses the light so that, when one of the LEDs 21A to 21G lights, the spot of the lit LED does not simply brighten in a pinpoint manner; instead, the entire shape of the corresponding hole 36A to 36G provided in the light leakage prevention member 36 takes on a uniform brightness.
[0035] In this case, the filter cover 37 is formed in white, so that the light emitted from each of the LEDs 21A to 21G of the LED display unit 21 and seen from outside through the filter cover 37 is not dulled by the filter cover 37 but appears to glow softly within the white surface.
[0036] (2) Emotional expression of the robot toy
(2-1) Emotional expression of the robot toy
Next, the emotional expression of this robot toy 1 will be described. The robot toy 1 is provided with an emotion expression function in which the control unit 10 (FIG. 2) generates the emotion of the robot toy 1 at a given time based on the content and the presence or absence of actions from the user, and expresses that emotion by blinking the LEDs 21A to 21G of the LED display unit 21 in a light emission pattern corresponding to the type and degree of the emotion.
[0037] FIG. 11 shows the arrangement of the LEDs 21A to 21G of the LED display unit 21 used for such emotional expression of the robot toy 1. As is apparent from FIG. 11, in the LED display unit 21, one LED 21A is arranged at a predetermined position on the center line of the head 3 in the width direction, and the remaining six LEDs 21B to 21G are arranged on a concentric circle around this LED 21A so as to be equidistant from the LED 21A and at equal intervals from one another. In other words, the six surrounding LEDs 21B to 21G are positioned at the vertices of a regular hexagon centered on the central LED 21A.
[0038] Of these, the central LED 21A is one that can emit three colors, green, red and blue, simultaneously or separately, and the other surrounding LEDs 21B to 21G are ones that can emit two colors, green and red, simultaneously or separately. Accordingly, the surrounding LEDs 21B to 21G can be made to emit orange light by lighting green and red simultaneously.
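Purely as an illustration (not part of the described embodiment), the stated layout, in which the six surrounding LEDs are equidistant from the central LED and equally spaced from one another, is simply the set of vertices of a regular hexagon. The following minimal sketch computes such positions; the radius value and the function name are arbitrary.

```python
import math

def hexagon_led_positions(radius: float = 1.0) -> list[tuple[float, float]]:
    """Return the (x, y) offsets of six surrounding LEDs placed at the vertices
    of a regular hexagon centered on the central LED at (0, 0)."""
    return [
        (radius * math.cos(math.radians(60 * k)),
         radius * math.sin(math.radians(60 * k)))
        for k in range(6)
    ]

if __name__ == "__main__":
    # Each position is the same distance from the center and 60 degrees apart.
    for k, (x, y) in enumerate(hexagon_led_positions()):
        print(f"surrounding LED {k + 1}: ({x:+.3f}, {y:+.3f})")
```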
[0039] Meanwhile, as shown in FIG. 12, the emotion of the robot toy 1 is defined by two parameters (hereinafter called emotion parameters) representing an "excitement level" and an "affection level" respectively. These two emotion parameters are held in the memory 10B and each take a value in the range of -8 to 8.
[0040] The emotions "joy" and "love" are associated with the range in which the values of both the "excitement level" and "affection level" emotion parameters are 0 or positive, and the emotions "normal" and "calm" are associated with the range in which the "affection level" parameter is 0 or positive and the "excitement level" parameter is negative.
[0041] The emotions "lonely" and "depressed" are associated with the range in which the values of both the "excitement level" and "affection level" parameters are negative, and the emotions "anger" and "dislike" are associated with the range in which the "excitement level" parameter is 0 or positive and the "affection level" parameter is negative.
[0042] Furthermore, the degree (strength) of an emotion such as "joy" is expressed by the magnitude of the values of the "excitement level" and "affection level" emotion parameters: the larger the absolute values of these parameter values, the greater the degree of that emotion.
[0043] Accordingly, when, for example, the values of the "excitement level" and "affection level" emotion parameters at a given time are both 8, both values are positive, so the robot toy 1 holds the emotions "joy" and "love", and since both parameter values are at their maximum, the degree of those emotions of "joy" and "love" is at its maximum.
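To make the mapping above concrete, the following sketch classifies a pair of parameter values into the four emotion regions just described and reports a degree. Treating the degree as the larger of the two absolute values is an assumption made here for illustration; the text only states that larger absolute values mean a stronger emotion.

```python
def classify_emotion(excitement: int, affection: int) -> tuple[str, int]:
    """Map the two emotion parameters (each in -8..8) to an emotion region and
    a degree; the degree is assumed here to be max(|excitement|, |affection|)."""
    if excitement >= 0 and affection >= 0:
        emotion = "joy / love"
    elif excitement < 0 and affection >= 0:
        emotion = "normal / calm"
    elif excitement < 0 and affection < 0:
        emotion = "depressed / lonely"
    else:  # excitement >= 0 and affection < 0
        emotion = "anger / dislike"
    return emotion, max(abs(excitement), abs(affection))

if __name__ == "__main__":
    print(classify_emotion(8, 8))   # ('joy / love', 8): both parameters at maximum
    print(classify_emotion(-3, 2))  # ('normal / calm', 3)
```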
[0044] Based on the operation detection signals S3A to S3C given from the tail switch 7, the head switch 23 and the nose switch 24, the control unit 10 increases or decreases the values of the "excitement level" and "affection level" emotion parameters within the range of -8 to 8, and thereby changes the emotion, when the user tilts or presses the tail switch 7, the head switch 23 or the nose switch 24, or when no such action has been made for a fixed time.
[0045] In this case, how the "excitement level" and "affection level" emotion parameters are increased or decreased in response to such user actions is determined in advance; for example, when the nose switch 24 is pressed, the values of the "excitement level" and "affection level" emotion parameters are each increased by 1, and when the head switch is pressed, the value of the "excitement level" emotion parameter is decreased by 1 and the value of the "affection level" emotion parameter is increased by 1.
[0046] Accordingly, by pressing the nose switch 24 of the robot toy 1, the user can shift the emotion of the robot toy 1 to "joy" and "love" or increase the degree of those emotions of "joy" and "love", and by pressing the head switch 23, the user can shift the emotion of the robot toy 1 to "normal" or "calm" or increase the degree of those emotions of "normal" or "calm".
[0047] When the tail switch 7 is tilted, the control unit 10 increases the value of the "excitement level" emotion parameter by 1, while moving the value of the "affection level" emotion parameter back and forth within the range of -8 to 8, increasing or decreasing it by 1 at a time. Accordingly, by tilting the tail switch 7 of the robot toy 1, the user can shift the emotion of the robot toy 1 to either "anger" and "dislike" or "depressed" and "lonely", or increase the degree of those emotions.
[0048] Furthermore, when no action of tilting or pressing the tail switch 7, the head switch 23 or the nose switch 24 has been given by the user for a fixed period (for example, 30 seconds), the control unit 10 decreases the values of both the "excitement level" and "affection level" emotion parameters by 1. Accordingly, at such times the robot toy 1 gradually shifts its own emotion to "depressed" and "lonely" and increases the degree of those emotions.
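Paragraphs [0044] to [0048] amount to a small set of update rules plus clamping to the -8 to 8 range. The sketch below captures those rules as stated (nose switch: both parameters +1; head switch: excitement -1 and affection +1; no interaction for the fixed period: both -1), with the back-and-forth sweep of the affection level for the tail switch reduced to a single alternating step. The class and method names are illustrative and are not taken from the patent.

```python
EMO_MIN, EMO_MAX = -8, 8

def clamp(value: int) -> int:
    return max(EMO_MIN, min(EMO_MAX, value))

class EmotionState:
    """Holds the two emotion parameters and applies the stimulus rules."""

    def __init__(self) -> None:
        self.excitement = 0
        self.affection = 0
        self._sweep_dir = 1  # direction of the back-and-forth affection sweep

    def on_nose_switch(self) -> None:
        # Nose switch pressed: both parameters increase by 1.
        self.excitement = clamp(self.excitement + 1)
        self.affection = clamp(self.affection + 1)

    def on_head_switch(self) -> None:
        # Head switch pressed: excitement -1, affection +1.
        self.excitement = clamp(self.excitement - 1)
        self.affection = clamp(self.affection + 1)

    def on_tail_switch(self) -> None:
        # Tail switch tilted: excitement +1, affection sweeps up and down by 1.
        self.excitement = clamp(self.excitement + 1)
        nxt = self.affection + self._sweep_dir
        if nxt > EMO_MAX or nxt < EMO_MIN:  # reverse at either end of the range
            self._sweep_dir = -self._sweep_dir
            nxt = self.affection + self._sweep_dir
        self.affection = nxt

    def on_idle_timeout(self) -> None:
        # No interaction for the fixed period: both parameters decrease by 1.
        self.excitement = clamp(self.excitement - 1)
        self.affection = clamp(self.affection - 1)
```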
[0049] When the control unit 10 has changed the values of the "excitement level" and "affection level" emotion parameters in this way, it blinks the LEDs 21A to 21G of the LED display unit 21 in a light emission pattern corresponding to the emotion of the robot toy 1 determined by the changed values of these two emotion parameters and to the degree of that emotion.
[0050] In practice, when the control unit 10 has changed the values of the "excitement level" and "affection level" emotion parameters as described above, it reads the current values of these parameters from the memory 10B and, by examining the two values thus read, determines which of the following the emotion of the robot toy 1 at that time corresponds to: "joy" and "love" (the values of both the "excitement level" and "affection level" parameters are 0 or positive), "normal" and "calm" (the "affection level" parameter is 0 or positive and the "excitement level" parameter is negative), "depressed" and "lonely" (both parameters are negative), or "anger" and "dislike" (the "excitement level" parameter is 0 or positive and the "affection level" parameter is negative).
[0051] The control unit 10 then further examines the values of the "excitement level" and "affection level" emotion parameters to determine the degree of the emotion of the robot toy 1 at that time and, based on these determination results, controls the LED display unit 21 so that the LEDs 21A to 21G of the LED display unit 21 blink in the light emission pattern associated with the emotion of the robot toy 1 at that time and its degree.
[0052] As a means for this, as shown for example in FIG. 13, the light emission pattern in which the LEDs 21A to 21G of the LED display unit 21 are to blink is determined in advance for each combination of values of the "excitement level" and "affection level" emotion parameters, and, for all of these light emission patterns, a program defining at what timing which of the LEDs 21A to 21G are to be blinked in order to drive the LEDs 21A to 21G of the LED display unit 21 in that light emission pattern (hereinafter called the LED drive program) is stored in advance in the memory 10B.
[0053] After determining the emotion of the robot toy 1 and its degree as described above, the control unit 10 then blinks the LEDs 21A to 21G of the LED display unit 21, in accordance with this LED drive program, in the light emission pattern corresponding to the emotion of the robot toy 1 and its degree at that time.
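One way to picture the FIG. 13 table and the LED drive program is as a lookup from the emotion region and its degree to a small pattern record: which LEDs participate, what color they show, and how fast the pattern runs. The sketch below is only a plausible reconstruction using the colors and periods stated in the surrounding paragraphs; the record fields and the degree threshold of 4 are assumptions.

```python
from dataclasses import dataclass

@dataclass
class EmissionPattern:
    leds: str        # "outer ring" or "center only"
    color: str
    period_s: float  # one revolution of the ring, or one blink cycle of the center

def select_pattern(emotion: str, degree: int) -> EmissionPattern:
    """Assumed reconstruction of the pattern table; a degree of 4 or less is
    treated as the slower (1 s) case, anything larger as the faster (0.5 s) case."""
    period = 1.0 if degree <= 4 else 0.5
    if emotion == "depressed / lonely":
        return EmissionPattern("center only", "blue", period)
    if emotion == "joy / love":
        return EmissionPattern("outer ring", "orange", period)
    # "normal / calm" and "anger / dislike" are both stated to use green.
    return EmissionPattern("outer ring", "green", period)
```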
[0054] As is also apparent from FIG. 13, the light emission patterns of the LEDs 21A to 21G in the LED display unit 21 are based on a light emission pattern in which, as shown in FIG. 14, only the surrounding LEDs 21B to 21G are blinked one at a time in the clockwise direction around the central LED 21A, each of the LEDs 21B to 21G being turned on and off in turn after its neighboring LED 21B to 21G. Hereinafter, this light emission pattern is called the basic light emission pattern.
[0055] This basic light emission pattern is used as the light emission pattern for expressing emotions other than "depressed" and "lonely". For the emotions "joy" and "love" (the values of both the "excitement level" and "affection level" emotion parameters are positive), the corresponding LEDs 21B to 21G are driven to emit orange light; for the emotions "normal" and "calm" (the "excitement level" emotion parameter is negative and the "affection level" emotion parameter is 0 or positive), the corresponding LEDs 21B to 21G are driven to emit green light; and for the emotions "anger" and "dislike" (the "excitement level" emotion parameter is 0 or positive and the "affection level" emotion parameter is negative), the corresponding LEDs 21B to 21G are driven to emit green light. When the emotion of the robot toy is "depressed" and "lonely" (the values of both the "excitement level" and "affection level" emotion parameters are negative), the surrounding LEDs 21B to 21G do not emit light and only the central LED 21A is driven to blink blue.
[0056] Furthermore, for the emotions "joy" and "love", "normal" and "calm", and "anger" and "dislike", the light emission patterns of the LEDs 21A to 21G in the LED display unit 21 are defined so that the greater the degree of the emotion, the faster the light rotates, that is, the shorter the blinking period of the surrounding LEDs 21B to 21G; for the emotions "depressed" and "lonely", they are defined so that the greater the degree of the emotion, the shorter the blinking period of the central LED 21A.
[0057] In this way, with this robot toy 1, the emotion of the robot toy at a given time can be visually recognized from outside on the basis of the emission colors of the LEDs 21A to 21G in the LED display unit 21, and the degree of that emotion can also be visually recognized from outside on the basis of the blinking speed of the LEDs 21A to 21G at that time.
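The basic light emission pattern itself is a timed walk around the six surrounding LEDs, with the step interval shrinking as the degree of emotion grows, plus a simple blink of the central LED for the remaining case. A minimal driver sketch is shown below; set_led(index, color) is a hypothetical hardware hook standing in for whatever the actual firmware uses.

```python
import time

def run_ring_pattern(set_led, color, revolution_s, revolutions=1):
    """Blink the six surrounding LEDs (indices 1..6) one at a time, clockwise,
    so that one full revolution of the light takes revolution_s seconds."""
    step = revolution_s / 6.0
    for _ in range(revolutions):
        for idx in range(1, 7):      # index 0 is the central LED, 1..6 the ring
            set_led(idx, color)
            time.sleep(step)
            set_led(idx, None)       # turn this LED off before moving on

def run_center_blink(set_led, color, period_s, cycles=1):
    """Blink only the central LED (index 0) once every period_s seconds."""
    for _ in range(cycles):
        set_led(0, color)
        time.sleep(period_s / 2.0)
        set_led(0, None)
        time.sleep(period_s / 2.0)
```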
[0058] (2-2) Emotion generation and expression processing procedure
Here, the control unit 10 executes the emotion generation processing of the robot toy 1 described above and the processing for expressing the generated emotion (the control processing of the LED display unit 21) in accordance with the emotion generation and expression processing procedure RT1 shown in FIG. 15, based on the LED drive program described above.
[0059] That is, when the power of the robot toy 1 is turned on, the control unit 10 starts this emotion generation and expression processing procedure RT1 at step SP0 and, in the following step SP1, sets the values of the "affection level" and "excitement level" emotion parameters representing the emotion of the robot toy 1 to their initial value of 0.
[0060] The control unit 10 then proceeds to step SP3 and determines whether any of the head switch 23 (FIG. 2), the nose switch 24 (FIG. 2) and the tail switch 7 (FIG. 2) has been pressed or tilted. If the control unit 10 obtains a negative result in this step SP3, it proceeds to step SP4 and reads the count value of an internal timer (not shown), and in the following step SP5 it determines, based on the count value read in step SP4, whether a predetermined time (for example, 30 seconds) has elapsed since the values of the "affection level" and "excitement level" emotion parameters were set to their initial values in step SP2, or since any of the head switch 23, the nose switch 24 and the tail switch 7 was last pressed or tilted.
[0061] If the control unit 10 obtains a negative result in this step SP5, it returns to step SP3 and thereafter repeats the loop of steps SP3-SP4-SP5-SP3 until it obtains a positive result in step SP3 or step SP5.
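Steps SP1 to SP5 of RT1 describe an ordinary polling loop: initialize the two parameters, then repeatedly check the switches and an idle timer. The sketch below gives a compact analogue of that outer loop, reusing the illustrative EmotionState class from the earlier sketch; read_switch() and the 30-second constant are placeholders, not part of the disclosure.

```python
import time

IDLE_TIMEOUT_S = 30.0  # the "predetermined time" from the text, assumed to be 30 s

def emotion_loop(state, read_switch, express):
    """Simplified analogue of RT1: poll the switches, apply the matching update,
    fall back to the idle rule after the timeout, then hand the updated state
    to the expression stage (corresponding to step SP13 onward)."""
    last_event = time.monotonic()
    while True:
        switch = read_switch()  # returns "head", "nose", "tail" or None
        if switch == "head":
            state.on_head_switch()
        elif switch == "nose":
            state.on_nose_switch()
        elif switch == "tail":
            state.on_tail_switch()
        elif time.monotonic() - last_event >= IDLE_TIMEOUT_S:
            state.on_idle_timeout()
        else:
            time.sleep(0.05)    # nothing to do yet, keep polling
            continue
        last_event = time.monotonic()
        express(state)
```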
[0062] When the user eventually presses the head switch 23 of the robot toy 1 and the control unit 10 thereby obtains a positive result in step SP3, it proceeds to step SP6 and changes the values of the "affection level" and "excitement level" emotion parameters in accordance with the first emotion parameter change processing procedure RT2 shown in FIG. 16.
[0063] That is, on proceeding to step SP6, the control unit 10 starts this first emotion parameter change processing procedure RT2 at step SP20 and, in the following step SP21, determines for each of the "affection level" and "excitement level" whether the value of the emotion parameter is at its maximum value (8 in this embodiment).
[0064] If the control unit 10 obtains a negative result in this step SP21, it proceeds to step SP22 and increases the values of the "affection level" and "excitement level" emotion parameters by 1 each. The control unit 10 then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
[0065] By contrast, if the control unit 10 determines in step SP21 that the value of the "affection level" emotion parameter is at its maximum, it proceeds to step SP23 and increases only the value of the "excitement level" emotion parameter by 1. The control unit 10 then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
[0066] If the control unit 10 determines in step SP21 that the value of the "excitement level" emotion parameter is at its maximum, it proceeds to step SP24 and increases only the value of the "affection level" emotion parameter by 1. The control unit 10 then proceeds to step SP25 to end this first emotion parameter change processing procedure RT2, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
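Expressed as code, RT2 is a three-way branch that protects the upper end of the parameter range: if neither parameter is at the maximum both are incremented, otherwise only the one with headroom is. The sketch below mirrors those branches; the case where both parameters are already at the maximum is not spelled out in the text and is assumed here to leave the values unchanged.

```python
EMO_MAX = 8

def rt2_head_switch(excitement: int, affection: int) -> tuple[int, int]:
    """Reading of the first emotion parameter change procedure (RT2):
    SP21 checks for the maximum value, SP22/SP23/SP24 apply the matching update."""
    affection_at_max = affection >= EMO_MAX
    excitement_at_max = excitement >= EMO_MAX
    if not affection_at_max and not excitement_at_max:   # SP22
        return excitement + 1, affection + 1
    if affection_at_max and not excitement_at_max:       # SP23
        return excitement + 1, affection
    if excitement_at_max and not affection_at_max:       # SP24
        return excitement, affection + 1
    return excitement, affection  # both already at the maximum (assumed unchanged)
```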
[0067] On the other hand, when the user presses the nose switch 24 of the robot toy 1 and the control unit 10 thereby obtains a positive result in step SP3 of the emotion generation and expression processing procedure RT1, it proceeds to step SP7 and changes the values of the "affection level" and "excitement level" emotion parameters in accordance with the second emotion parameter change processing procedure RT3 shown in FIG. 17.
[0068] That is, on proceeding to step SP7, the control unit 10 starts this second emotion parameter change processing procedure RT3 at step SP30 and, in the following step SP31, determines whether the value of the "affection level" emotion parameter is at its maximum value and whether the value of the "excitement level" emotion parameter is at its minimum value (-8 in this embodiment).
[0069] If the control unit 10 obtains a negative result in this step SP31, it proceeds to step SP32 and increases the value of the "affection level" emotion parameter by 1 while decreasing the value of the "excitement level" emotion parameter by 1. The control unit 10 then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
[0070] By contrast, if the control unit 10 determines in step SP31 that the value of the "affection level" emotion parameter is at its maximum, it proceeds to step SP33 and decreases only the value of the "excitement level" emotion parameter by 1. The control unit 10 then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
[0071] If the control unit 10 determines in step SP31 that the value of the "excitement level" emotion parameter is at its minimum, it proceeds to step SP34 and increases only the value of the "affection level" emotion parameter by 1. The control unit 10 then proceeds to step SP35 to end this second emotion parameter change processing procedure RT3, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
[0072] On the other hand, when the user tilts the tail switch 7 of the robot toy 1 and the control unit 10 thereby obtains a positive result in step SP3 of the emotion generation and expression processing procedure RT1, it proceeds to step SP8 and changes the values of the "affection level" and "excitement level" emotion parameters in accordance with the third emotion parameter change processing procedure RT4 shown in FIG. 18.
[0073] That is, on proceeding to step SP8, the control unit 10 starts this third emotion parameter change processing procedure RT4 at step SP40 and, in the following step SP41, determines whether the value of the "affection level" emotion parameter is at its minimum value (-8 in this embodiment) and whether the value of the "excitement level" emotion parameter is at its maximum value.
[0074] If the control unit 10 obtains a negative result in this step SP41, it proceeds to step SP42 and decreases the value of the "affection level" emotion parameter by 1 while increasing the value of the "excitement level" emotion parameter by 1. The control unit 10 then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15).
[0075] By contrast, if the control unit 10 determines in step SP41 that the value of the "affection level" emotion parameter is at its minimum, it proceeds to step SP43 and increases only the value of the "excitement level" emotion parameter by 1. The control unit 10 then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
[0076] If the control unit 10 determines in step SP41 that the value of the "excitement level" emotion parameter is at its maximum, it proceeds to step SP44 and decreases only the value of the "affection level" emotion parameter by 1. The control unit 10 then proceeds to step SP45 to end this third emotion parameter change processing procedure RT4, and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
[0077] On the other hand, when the control unit 10 obtains a positive result in step SP5 of the emotion generation and expression processing procedure RT1 because the predetermined time has elapsed since the values of the "affection level" and "excitement level" emotion parameters were set to their initial values or since any of the head switch 23, the nose switch 24 and the tail switch 7 was last pressed or tilted, it proceeds to step SP9 and determines for each of the "affection level" and "excitement level" whether the value of the emotion parameter is at its minimum value.
[0078] If the control unit 10 obtains a negative result in this step SP9, it proceeds to step SP10, decreases the values of the "affection level" and "excitement level" emotion parameters by 1 each, and then proceeds to step SP13.
[0079] By contrast, if the control unit 10 determines in step SP9 that the value of the "affection level" emotion parameter is at its minimum, it proceeds to step SP11, decreases only the value of the "excitement level" emotion parameter by 1, and then proceeds to step SP13.
[0080] If the control unit 10 determines in step SP9 that the value of the "excitement level" emotion parameter is at its minimum, it proceeds to step SP12, decreases only the value of the "affection level" emotion parameter by 1, and then proceeds to step SP13.
[0081] Then, based on the values of the "excitement level" and "affection level" emotion parameters updated in steps SP3 to SP12 according to the presence or absence and the content of the user's actions as described above, the control unit 10 subsequently expresses, in steps SP13 to SP19, the emotion of the robot toy 1 at that time and its degree by means of the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
[0082] That is, on proceeding to step SP13, the control unit 10 reads the changed values of the "excitement level" and "affection level" emotion parameters from the memory 10B (FIG. 2) and judges the values of these emotion parameters. When the values of the "excitement level" and "affection level" emotion parameters are both 0 or positive, the control unit 10 proceeds to step SP14 and, in accordance with the first emotion expression processing procedure RT5 shown in FIG. 19, expresses the emotion of the robot toy 1 at that time ("joy" and "love") and its degree as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
[0083] In practice, on proceeding to step SP14, the control unit 10 starts this first emotion expression processing procedure RT5 at step SP50 and, in the following step SP51, judges the values of the "excitement level" and "affection level" emotion parameters. When the values of both emotion parameters are within the range of 0 to 4, the control unit 10 proceeds to step SP52 and blinks the surrounding LEDs 21B to 21G of the LED display unit 21 orange in turn so that the orange light makes one revolution per second; otherwise, it proceeds to step SP53 and blinks the surrounding LEDs 21B to 21G orange in turn so that the orange light makes one revolution in 0.5 seconds.
[0084] When the control unit 10 has finished the processing of step SP52 or step SP53, it proceeds to step SP54 to end this first emotion expression processing procedure RT5, and then returns to step SP3 of the emotion generation and expression processing procedure RT1.
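The only decision RT5 makes beyond choosing orange is how fast the ring turns. A minimal sketch of that choice, with the threshold taken from the text, is given below; the function name is illustrative.

```python
def rt5_revolution_period(excitement: int, affection: int) -> float:
    """SP51: if both parameters lie in the range 0 to 4, the orange light takes
    1 second per revolution (SP52); otherwise it takes 0.5 seconds (SP53)."""
    mild = 0 <= excitement <= 4 and 0 <= affection <= 4
    return 1.0 if mild else 0.5
```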
[0085] Meanwhile, when the judgment in step SP13 of the emotion generation and expression processing procedure RT1 shows that the value of the "excitement level" emotion parameter is 0 or positive and the "affection level" emotion parameter is negative, the control unit 10 proceeds to step SP15 and, in accordance with the second emotion expression processing procedure RT6 shown in FIG. 20, expresses the emotion of the robot toy 1 at that time ("anger" and "dislike") as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
[0086] That is, on proceeding to step SP15, the control unit 10 starts this second emotion expression processing procedure RT6 at step SP60 and, in the following step SP61, judges the values of the "excitement level" and "affection level" emotion parameters. When the values of both emotion parameters are within the range of 0 to 4, the control unit 10 proceeds to step SP62 and blinks the surrounding LEDs 21B to 21G of the LED display unit 21 orange in turn so that the orange light makes one revolution per second; otherwise, it proceeds to step SP63 and blinks the surrounding LEDs 21B to 21G orange in turn so that the orange light makes one revolution in 0.5 seconds.
[0087] When the control unit 10 has finished the processing of step SP62 or step SP63, it proceeds to step SP64 to end this second emotion expression processing procedure RT6, and then returns to step SP3 of the emotion generation and expression processing procedure RT1.
[0088] On the other hand, when the judgment in step SP13 of the emotion generation and expression processing procedure RT1 shows that the value of the "excitement level" emotion parameter is negative and the "affection level" emotion parameter is 0 or positive, the control unit 10 proceeds to step SP16 and, in accordance with the third emotion expression processing procedure RT7 shown in FIG. 21, expresses the emotion of the robot toy 1 at that time ("normal" and "calm") as the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
[0089] That is, on proceeding to step SP16, the control unit 10 starts this third emotion expression processing procedure RT7 at step SP70 and, in the following step SP71, judges the values of the "excitement level" and "affection level" emotion parameters. When the value of the "affection level" emotion parameter is within the range of -1 to -4 and the value of the "excitement level" emotion parameter is within the range of 1 to 4, the control unit 10 proceeds to step SP72 and blinks the surrounding LEDs 21B to 21G of the LED display unit 21 green in turn so that the green light makes one revolution per second; otherwise, it proceeds to step SP73 and blinks the surrounding LEDs 21B to 21G green in turn so that the green light makes one revolution in 0.5 seconds.
[0090] When the control unit 10 has finished the processing of step SP72 or step SP73, it proceeds to step SP74 to end this third emotion expression processing procedure RT7, and then returns to step SP3 of the emotion generation and expression processing procedure RT1.
[0091] On the other hand, when the judgment in step SP13 of the emotion generation and expression processing procedure RT1 shows that the values of both the "excitement level" and "affection level" emotion parameters are negative, the control unit 10 proceeds to step SP17 and judges the values of the "excitement level" and "affection level" emotion parameters.
[0092] When the values of both the "excitement level" and "affection level" emotion parameters are within the range of -1 to -4, the control unit 10 proceeds to step SP18 and blinks only the central LED 21A of the LED display unit 21 at a period of once per second; when the values of the "excitement level" and "affection level" emotion parameters are otherwise, it proceeds to step SP19 and blinks only the central LED 21A of the LED display unit 21 at a period of once per 0.5 seconds. When the control unit 10 has finished the processing of step SP18 or step SP19, it returns to step SP3.
[0093] In this way, the control unit 10 blinks the LEDs 21A to 21G of the LED display unit 21 in the light emission pattern corresponding to the emotion of the robot toy 1 at that time and its degree.
[0094] (3) Effects
As described above, when expressing the emotions "joy" and "love", "normal" and "calm", or "anger" and "dislike", the robot toy 1 of the present embodiment blinks only the surrounding LEDs 21B to 21G among the LEDs 21A to 21G of the LED display unit 21, one at a time in the clockwise direction, and when expressing the emotions "depressed" and "lonely", it blinks only the central LED 21A in blue.
[0095] With such an expression method, the emotion of the robot toy 1 can be expressed with more varied light emission patterns than, for example, when LEDs whose shapes are fixed in advance are simply blinked, while the LED display unit 21 can be constructed with fewer LEDs than when a plurality of LEDs are arranged in a matrix, so that emotion can be expressed effectively while the robot toy 1 is constructed inexpensively.
[0096] (4)他の実施形態  (4) Other Embodiments
なお上記実施形態では、 LED表示部 21の中央の LED21A (第 1の光源)として、 緑色、赤色及び青色の 3色を同時又は別個に発光し得るものを用い、他の周囲の L ED21B〜12G (第 2の光源)として、緑色及び赤色の 2色を同時又は別個に発光し 得るものを用いるようにしたが、中央の LED21Aの発光色の種類及びその数と、他 の周囲の LED21B〜12Gの発光色の種類及びその数とについては、これ以外であ つてもよい。 In the above embodiment, as the LED 21A (first light source) at the center of the LED display unit 21, one capable of emitting three colors of green, red and blue simultaneously or separately is used, and other surrounding L ED21B to 12G are used. As a (second light source), it emits two colors of green and red simultaneously or separately Although what was obtained was used, the types and the number of luminescent colors of the central LED 21A and the types and the number of the luminescent colors of the other surrounding LEDs 21B to 12G may be other than this.
[0097] In the above embodiment, the remaining six LEDs 21B to 21G are arranged so as to be equidistant from the central LED 21A and equally spaced from one another, but arrangements other than this can be applied to the LEDs 21A to 21G. For example, three LEDs 70B to 70D may be arranged around a central LED 70 as in FIGS. 22(A-1) and (A-2), four LEDs 71B to 71E may be arranged around a central LED 71A as in FIGS. 22(B-1) and (B-2), or five LEDs 72B to 72F may be arranged around a central LED 72A as in FIGS. 22(C-1) and (C-2). Six LEDs 73B to 73G may also be arranged around a central LED 73A as shown in FIG. 22(D), and eight LEDs 73B to 73I, or even more LEDs, may be arranged around the central LED 73A as shown in FIG. 22(E). The same effects as in the above embodiment can be obtained in these cases as well.
[0098] Furthermore, in the above embodiment, the emotion parameters are of the two types "excitation level" and "love level", but the number and types of emotion parameters may be other than these. For example, as many emotion types as there are emission colors of the LEDs 21A to 21G may be prepared, and the LEDs 21A to 21G may be blinked in light emission patterns that associate each emotion with an emission color of the LEDs 21A to 21G.
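As a hypothetical illustration of this alternative, one emotion type could be paired with each available emission color through a small lookup table. The emotion names and colors below are placeholders chosen for the sketch, not terms defined in the specification.

#include <stdio.h>

/* Assumed emotion types, one per available emission color. */
typedef enum { EMOTION_JOY, EMOTION_ANGER, EMOTION_LONELINESS, EMOTION_COUNT } emotion_kind_t;

typedef struct {
    const char *name;
    const char *color;   /* color used when blinking LEDs 21A to 21G for this emotion */
} emotion_color_t;

static const emotion_color_t emotion_colors[EMOTION_COUNT] = {
    [EMOTION_JOY]        = { "joy",        "green" },
    [EMOTION_ANGER]      = { "anger",      "red"   },
    [EMOTION_LONELINESS] = { "loneliness", "blue"  },
};

int main(void)
{
    for (int k = 0; k < EMOTION_COUNT; k++) {
        printf("%s -> blink in %s\n", emotion_colors[k].name, emotion_colors[k].color);
    }
    return 0;
}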
[0099] Furthermore, in the above embodiment, the LEDs 21A to 21G are blinked either in a light emission pattern in which only the central LED 21A of the LED display unit 21 is blinked, or in a light emission pattern in which only the surrounding LEDs 21B to 21G are blinked one at a time in the clockwise direction with respect to the central LED 21A; however, other light emission patterns may be used. For example, the surrounding LEDs 21B to 21G may be controlled so that adjacent LEDs 21B to 21G are sequentially and repeatedly turned on and off, such that only the surrounding LEDs 21B to 21G blink one at a time in the counterclockwise direction with respect to the central LED 21A, or such that only the surrounding LEDs 21B to 21G blink a plurality at a time in the clockwise or counterclockwise direction with respect to the central LED 21A.
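The rotating patterns of this paragraph amount to stepping an index around the ring of six peripheral LEDs. The C sketch below is a hypothetical illustration: the set_led() driver call, the index-to-LED mapping, and the direction flag are assumptions and are not part of the specification.

#include <stdio.h>

#define NUM_PERIPHERAL 6   /* LEDs 21B to 21G arranged around the central LED 21A */

/* Placeholder for the toy's LED driver; index 0 to 5 is taken to map to LEDs 21B to 21G. */
static void set_led(int index, int on)
{
    printf("LED 21%c -> %s\n", 'B' + index, on ? "on" : "off");
}

/* Advance the lit LED by one position per tick. dir = +1 steps clockwise and
   dir = -1 steps counterclockwise, one reading of the patterns described above. */
static int step_ring(int current, int dir)
{
    set_led(current, 0);                                      /* turn off the current LED */
    int next = (current + dir + NUM_PERIPHERAL) % NUM_PERIPHERAL;
    set_led(next, 1);                                         /* turn on the adjacent LED */
    return next;
}

int main(void)
{
    int pos = 0;
    set_led(pos, 1);                  /* start with LED 21B lit */
    for (int tick = 0; tick < 6; tick++) {
        pos = step_ring(pos, +1);     /* one full clockwise sweep */
    }
    return 0;
}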
[0100] Furthermore, in the above embodiment, the values of the "excitation level" and "love level" emotion parameters are varied within the range of -8 to 8 for ease of understanding, but the range is not limited to this, and various other ranges can be applied. The contents of the emotion generation and expression processing procedure RT1 can also be changed accordingly. For example, in actual computer processing, a number in the range of -8 to 8 is handled as a number in the range of 0 to 15 (0 to F in hexadecimal), so the determination of whether the emotion parameters are positive or negative, as described above for step SP13 of the emotion generation and expression processing procedure RT1, need not be performed; instead, together with step SP14 (or steps SP15 to SP17), it may be determined which of the ranges 0 to 3, 4 to 7, 8 to B, and C to F the values of the emotion parameters fall into, and the LEDs 21A to 21G may be made to emit light in the color and light emission pattern associated with that range. In short, any of various methods can be applied as long as the LEDs 21A to 21G are made to emit light in a light emission pattern corresponding to the combination of the values of the emotion parameters.
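To make the hexadecimal handling concrete: a signed value of -8 to 8 can be folded into a 4-bit code of 0 to 15 and then bucketed into the four ranges named above. The offset-and-clamp conversion used in this C sketch is an assumption (the specification does not state the exact encoding), so both the conversion and the bucket handling should be read as illustrative only.

#include <stdio.h>

/* Assumed conversion of a signed emotion value (-8 to 8) into a 4-bit code (0 to F).
   The specification only says the value is processed as 0 to 15; this clamped
   "value + 8" offset is one possible encoding. */
static unsigned to_hex_code(int value)
{
    int code = value + 8;
    if (code < 0)  code = 0;
    if (code > 15) code = 15;
    return (unsigned)code;
}

/* Buckets 0-3, 4-7, 8-B, and C-F; each bucket would be associated with a color
   and a light emission pattern. */
static int bucket_of(unsigned code)
{
    return (int)(code / 4);   /* 0, 1, 2, or 3 */
}

int main(void)
{
    for (int v = -8; v <= 8; v += 4) {
        unsigned code = to_hex_code(v);
        printf("value %3d -> code 0x%X -> bucket %d\n", v, code, bucket_of(code));
    }
    return 0;
}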
Industrial Applicability

The present invention can be widely applied to various robot toys.

Claims

[1] A robot toy comprising:
a first light source capable of emitting light of at least two colors simultaneously or separately;
five or more second light sources arranged around the first light source;
a memory that stores at least two types of emotion parameters defining emotions; and
control means for increasing or decreasing the values of the emotion parameters based on an external operation input,
wherein the control means causes the first and/or second light sources to emit light in a light emission pattern corresponding to the combination of the values of the emotion parameters.
[2] The robot toy according to claim 1, wherein the control means blinks the first and/or second light sources at a faster cycle as the degree of the emotion, which is determined by the value of at least one of the emotion parameters, becomes larger.
[3] The robot toy according to claim 1, wherein the control means blinks the first and/or second light sources in a color corresponding to the type of emotion determined by the combination of the values of the emotion parameters.
[4] The robot toy according to claim 2 or claim 3, wherein the second light sources are arranged at equal intervals around the first light source and are controlled by the control means so that adjacent second light sources are sequentially and repeatedly turned on and off.
[5] The robot toy according to claim 1, wherein the first light source is capable of emitting a greater number of colors than the number of colors that the second light sources can separately emit.
[6] The robot toy according to claim 1 or claim 5, wherein, when the type of emotion determined by the combination of the values of the emotion parameters corresponds to a predetermined type, the control means causes the first light source to emit light in a predetermined color that the second light sources cannot emit.
[7] The robot toy according to claim 1, further comprising a translucent cover arranged so as to cover the first and second light sources.
[8] The robot toy according to claim 1, wherein the robot toy has two or more operation input switches, and the control means increases the values of the respective emotion parameters in response to the operation inputs of the respective operation input switches, and decreases the values of the emotion parameters when there is no operation input for a predetermined time.
PCT/JP2006/300617 2005-01-18 2006-01-18 Robot toy WO2006077868A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/775,133 US20070270074A1 (en) 2005-01-18 2007-07-09 Robot Toy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-010771 2005-01-18
JP2005010771A JP2006198017A (en) 2005-01-18 2005-01-18 Robot toy

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/775,133 Continuation US20070270074A1 (en) 2005-01-18 2007-07-09 Robot Toy

Publications (1)

Publication Number Publication Date
WO2006077868A1 true WO2006077868A1 (en) 2006-07-27

Family

ID=36692257

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/300617 WO2006077868A1 (en) 2005-01-18 2006-01-18 Robot toy

Country Status (3)

Country Link
US (1) US20070270074A1 (en)
JP (1) JP2006198017A (en)
WO (1) WO2006077868A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515092B2 (en) 2009-12-18 2013-08-20 Mattel, Inc. Interactive toy for audio output
CN105676740A (en) * 2016-03-01 2016-06-15 深圳前海勇艺达机器人有限公司 Method and apparatus enabling robot to have illumination function

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWM285388U (en) * 2005-10-05 2006-01-11 Wen-Bin Shiu Pet toy combining with MP3 player
US8292689B2 (en) 2006-10-02 2012-10-23 Mattel, Inc. Electronic playset
US8062089B2 (en) 2006-10-02 2011-11-22 Mattel, Inc. Electronic playset
CN101406756A (en) * 2007-10-12 2009-04-15 鹏智科技(深圳)有限公司 Electronic toy for expressing emotion and method for expressing emotion, and luminous unit control device
CN101411946B (en) * 2007-10-19 2012-03-28 鸿富锦精密工业(深圳)有限公司 Toy dinosaur
US20100305448A1 (en) * 2009-05-26 2010-12-02 Anne Cecile Dagonneau Apparatus and method for indicating ultrasound probe orientation and activation status
US8438233B2 (en) 2011-03-23 2013-05-07 Color Labs, Inc. Storage and distribution of content for a user device group
US20120320077A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Communicating status and expression
US9656392B2 (en) * 2011-09-20 2017-05-23 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US8412772B1 (en) 2011-09-21 2013-04-02 Color Labs, Inc. Content sharing via social networking
WO2013093757A1 (en) * 2011-12-21 2013-06-27 Koninklijke Philips Electronics N.V. Peel and stick cpr assistance device
CN102980103A (en) * 2012-11-27 2013-03-20 华南理工大学 Machine vision LED (light emitting diode) illumination source
US20160025326A1 (en) * 2014-07-28 2016-01-28 Sergei GONCHAR Sparkly childrens products
US20170326443A1 (en) 2016-05-13 2017-11-16 Universal Entertainment Corporation Gaming machine
JP6572943B2 (en) 2017-06-23 2019-09-11 カシオ計算機株式会社 Robot, robot control method and program
JP6936081B2 (en) * 2017-08-30 2021-09-15 パナソニック株式会社 robot
JP1622874S (en) * 2017-12-29 2019-01-28 robot
JPWO2020009098A1 (en) * 2018-07-02 2021-08-05 Groove X株式会社 robot
KR102012968B1 (en) * 2018-08-07 2019-08-27 주식회사 서큘러스 Method and server for controlling interaction robot
KR102415997B1 (en) * 2020-11-05 2022-07-05 (주)로보티즈 Companion robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3277500B2 (en) * 1999-05-10 2002-04-22 ソニー株式会社 Robot device
JP2002154081A (en) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, its facial expression method and detecting method for step difference and lifting-up state

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337552B1 (en) * 1999-01-20 2002-01-08 Sony Corporation Robot apparatus
US4556932A (en) * 1983-03-28 1985-12-03 Lehrer Bradley D Lighted novelty item
JPS60128699U (en) * 1984-02-07 1985-08-29 株式会社トミー radio controlled toy
US5141464A (en) * 1991-01-23 1992-08-25 Mattel, Inc. Touch responsive animated toy figure
US5402702A (en) * 1992-07-14 1995-04-04 Jalco Co., Ltd. Trigger circuit unit for operating light emitting members such as leds or motors for use in personal ornament or toy in synchronization with music
US5461188A (en) * 1994-03-07 1995-10-24 Drago; Marcello S. Synthesized music, sound and light system
US5668333A (en) * 1996-06-05 1997-09-16 Hasbro, Inc. Musical rainbow toy
JP4518229B2 (en) * 1999-05-10 2010-08-04 ソニー株式会社 Robot device
JP2001191284A (en) * 1999-10-25 2001-07-17 Sony Corp Robot device and its learning method
USD448433S1 (en) * 1999-11-02 2001-09-25 Sega Toys Ltd. Robotic dog
USD457203S1 (en) * 1999-11-02 2002-05-14 Sega Toys, Ltd. Robotic dog
JP2001191281A (en) * 1999-12-29 2001-07-17 Sony Corp Editing device, editing method, and storage medium
US6672934B2 (en) * 2000-02-04 2004-01-06 Trendmasters, Inc. Amusement device
AU2001259610A1 (en) * 2000-05-09 2001-11-20 Hasbro, Inc. Self-stabilizing walking apparatus that is capable of being reprogrammed or puppeteered
JP2002018146A (en) * 2000-07-04 2002-01-22 Tomy Co Ltd Interactive toy, reaction behavior generator and reaction behavior pattern generation method
JP2002127059A (en) * 2000-10-20 2002-05-08 Sony Corp Action control device and method, pet robot and control method, robot control system and recording medium
US6507773B2 (en) * 2001-06-14 2003-01-14 Sharper Image Corporation Multi-functional robot with remote and video system
JP2003060745A (en) * 2001-08-22 2003-02-28 Sony Corp Device and method for transmitting information and monitoring device
JP2003071765A (en) * 2001-09-04 2003-03-12 Sony Corp Robot device and input method therefor
JP2004237392A (en) * 2003-02-05 2004-08-26 Sony Corp Robotic device and expression method of robotic device
US7374482B2 (en) * 2003-08-12 2008-05-20 Ghaly Nabil N Interactive slot machine

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3277500B2 (en) * 1999-05-10 2002-04-22 ソニー株式会社 Robot device
JP2002154081A (en) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, its facial expression method and detecting method for step difference and lifting-up state

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"<Hodo Kankei Shiryo> idog Shinshohin & Zensekai Hatsubai Kettei!, Kabushiki Kaisha Sega Toys", 19 January 2005 (2005-01-19), XP003000534, Retrieved from the Internet <URL:http://www.segatoys.co.jp/company_information/press_release/pdf/20050119.pdf> *
KANO M. ET AL.: "Kansei Kaiwagata Robot 'ifbot' no Hyojo Seigyo no Kanjo Kukan eno Mapping", DAI 66 KAI (HEISEI 16 NEN) ZENKOKU TAIKAI KOEN RONBUNSHU (4), INTERFACE COMPUTER TO NINGEN SHAKAI, INFORMATION PROCESSING SOCIETY OF JAPAN, 9 March 2004 (2004-03-09), pages 4-77 - 4-78, XP003000533 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515092B2 (en) 2009-12-18 2013-08-20 Mattel, Inc. Interactive toy for audio output
CN105676740A (en) * 2016-03-01 2016-06-15 深圳前海勇艺达机器人有限公司 Method and apparatus enabling robot to have illumination function

Also Published As

Publication number Publication date
US20070270074A1 (en) 2007-11-22
JP2006198017A (en) 2006-08-03

Similar Documents

Publication Publication Date Title
WO2006077868A1 (en) Robot toy
JP7226872B2 (en) Light Emitting Diode Switch Elements and Arrays
JP2006198017A5 (en)
US7442107B1 (en) Electronic toy, control method thereof, and storage medium
WO2000035548A1 (en) Interactive toy
WO2001038050A1 (en) Insect robot
US20110195632A1 (en) Toy
US20070190894A1 (en) Holiday displays having active figurines
JP4757979B2 (en) Electronic toy
KR200385518Y1 (en) Light radiating ornaments
US20080090489A1 (en) Doll with two conductor tethered remote control
KR20010091876A (en) Electronic toy and method of controlling the same and memory media
JP7458109B1 (en) ornaments
CN204218264U (en) A kind of child intelligence toothbrush
TW200922671A (en) A toy with emotion expression function and method therefor, and a means of mixing light
JPH02136160A (en) Operating device
TWM504915U (en) Wearable dynamic LED light chain
JP6150246B2 (en) Light block that shines in response to vibration
TWM268514U (en) Apparatus of splendor lamp
JP2007125171A (en) Accessory toy
TW201033993A (en) Display apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11775133

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWP Wipo information: published in national office

Ref document number: 11775133

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 06711886

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 6711886

Country of ref document: EP