WO2006077868A1 - Robot Toy - Google Patents

Robot Toy

Info

Publication number
WO2006077868A1
WO2006077868A1 (PCT/JP2006/300617)
Authority
WO
WIPO (PCT)
Prior art keywords
emotion
robot toy
value
light
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2006/300617
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Yuichi Aochi
Wakana Yamana
Eiji Takuma
Yoshiyuki Endo
Fujio Nobata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Toys Co Ltd
Original Assignee
Sega Toys Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Toys Co Ltd filed Critical Sega Toys Co Ltd
Publication of WO2006077868A1
Priority to US11/775,133 (US20070270074A1)
Anticipated expiration
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/004: Artificial life, i.e. computing arrangements simulating life
    • G06N3/008: Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00: Self-movable toy figures
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00: Other toys
    • A63H33/22: Optical, colour, or shadow toys
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00: Computerized interactive toys, e.g. dolls

Definitions

  • The present invention relates to a robot toy, and in particular to an inexpensive robot toy.
  • Japanese Patent No. 3277500 discloses a robot toy in which LEDs (Light Emitting Diodes) whose shapes and emission colors correspond to emotions such as “joy” or “anger” are placed on the head and covered with a semi-transparent cover so that they can be seen from the outside only when lit, and in which these LEDs are driven to blink according to the emotion of the robot toy.
  • Patent Document 1: Japanese Patent No. 3277500
  • The present invention has been made in view of such problems, and an object of the present invention is to provide a robot toy capable of expressing emotion effectively while being constructed inexpensively.
  • The robot toy of the present invention comprises a first light source capable of emitting at least two colors of light simultaneously or separately, and five or more second light sources arranged around the first light source.
  • With this configuration, varied emotion expressions can be performed with a small number of light sources, so that emotion can be expressed effectively while the robot toy is constructed inexpensively.
  • FIG. 1 is a perspective view showing an appearance configuration of a robot toy according to the present embodiment.
  • FIG. 2 is a block diagram showing an internal configuration of a robot toy according to the present embodiment.
  • FIG. 3 is an exploded perspective view showing a specific configuration of the head.
  • FIG. 5 is a perspective view for explaining a pendulum gear.
  • FIG. 6 is a schematic plan view for describing a cam of a first cam gear.
  • FIG. 7 is a perspective view showing the structure of the elevating member.
  • FIG. 8 is a perspective view showing the configuration of an ear driving member.
  • FIG. 9 is a schematic diagram for explaining the movement of the head.
  • FIG. 10 is a perspective view showing the light emission state of the LEDs as seen from the outside.
  • FIG. 11 is a plan view showing an arrangement relationship of the LEDs in the LED display unit.
  • FIG. 12 is a conceptual diagram for explaining the emotions of the robot toy.
  • FIG. 13 is a conceptual diagram for explaining the emotions of the robot toy.
  • FIG. 14 is a conceptual diagram for explaining the basic light emission pattern.
  • FIG. 15 is a flowchart showing an emotion generation expression processing procedure.
  • FIG. 16 is a flowchart showing a first emotion parameter change processing procedure.
  • FIG. 17 is a flowchart showing a second emotion parameter change processing procedure.
  • FIG. 18 is a flowchart showing a third emotion parameter change processing procedure.
  • FIG. 19 is a flowchart showing a first emotion expression processing procedure.
  • FIG. 20 is a flow chart showing a second emotion expression processing procedure.
  • FIG. 21 is a flowchart showing a third emotion expression processing procedure.
  • FIG. 22 is a schematic diagram for describing another embodiment.
  • FIG. 1 shows a robot toy 1 according to the present embodiment.
  • As a whole, the robot toy 1 has an appearance resembling a sitting animal such as a dog. Its head 3 is rotatably coupled to a shaft extending in the vertical direction from the front end of the body 2, and ears 4A and 4B are movably attached to the upper left and right sides of the head 3.
  • Front legs 5A and 5B of rounded rod shape are movably connected to the front ends of the left and right sides of the body 2, while protruding rear legs 6A and 6B, shorter than the front legs, are molded integrally with the rear ends of the left and right sides of the body 2.
  • a joystick-like tail switch 7 is movably attached in the vicinity of the upper rear end of the body 2.
  • The body 2 is formed in a rounded rectangular parallelepiped shape as a whole and, as shown in FIG. 2, houses a substrate 13 on which the control unit 10 that controls the operation of the entire robot toy 1, the motor drive unit 11, the sound amplifier unit 12 and the like are formed, as well as devices such as the microphone 14 and the speaker 15.
  • An earphone jack 16 is disposed in one rear leg 6B of the body 2.
  • The head 3 is formed in a rectangular parallelepiped shape flatter than the body 2 as a whole, and houses devices such as a light sensor 20 and an LED display unit 21 having a plurality of LEDs, as well as a drive mechanism 34 (FIG. 3), described later, for driving the head 3 and the ears 4A and 4B with a motor 22 as the power source. Further, a head switch 23 consisting of a push switch is disposed at the upper end portion of the head 3, and a nose switch 24 consisting of a push switch is disposed at the position of the nose of the head 3.
  • Because the front legs 5A, 5B are longer than the rear legs 6A, 6B, the face of the head 3 on which the display surface of the LED display unit 21 is located is directed obliquely upward with respect to the surface on which the toy is placed, so that the display is easy for the user to view.
  • The microphone 14 of the body 2 collects the surrounding sound and sends the obtained audio signal S1 to the control unit 10.
  • the light sensor 20 of the head 3 detects the ambient brightness and sends the detection result to the control unit 10 as the brightness detection signal S2.
  • The tail switch 7, the head switch 23 and the nose switch 24 detect physical actions given by the user, such as “tilting” or “pressing”, and send the detection results to the control unit 10 as operation detection signals S3A to S3C. The control unit 10 is also supplied with an audio signal S4 from an external audio device via the earphone jack 16.
  • The control unit 10 has a microcomputer configuration including a central processing unit (CPU) 10A, a memory 10B, an analog/digital conversion circuit 10C and the like, and recognizes the surrounding situation and the presence or absence of an action by the user based on the audio signal S1 from the microphone 14, the brightness detection signal S2 from the light sensor 20, and the operation detection signals S3A to S3C from the tail switch 7, the head switch 23 and the nose switch 24.
  • The control unit 10 sends a motor drive signal S4 to the motor drive unit 11 to drive the motor 22, thereby tilting the head 3 of the robot toy 1 and opening and closing the ears 4A and 4B.
  • Further, the control unit 10 gives a predetermined audio signal S5 to the speaker 15 through the sound amplifier unit 12 as necessary, so as to output sound or music based on the audio signal S5 from the speaker 15, or gives a predetermined drive signal S6 to the LED display unit 21, so as to drive the LEDs of the LED display unit 21 to blink in a predetermined light emission pattern.
  • A specific configuration of the head 3 is shown in FIG. 3.
  • The head 3 of the robot toy 1 comprises a housing made up of a first housing half 30 forming the external shape of the back side of the head 3, a second housing half 31 forming the external shape of the front side of the head 3, and first and second U-shaped housing side members 32 forming the left and right sides of the head 3 respectively; inside this housing, the drive mechanism 34, a cover 35 for the drive mechanism 34, the LED display unit 21, the light leakage prevention member 36 and the filter cover 37 are stacked in order in the front direction indicated by arrow a.
  • A worm gear (not shown) is attached to the output shaft of the motor 22; this worm gear meshes with a gear 40, and a gear formed coaxially and integrally with the gear 40 meshes with a gear 42.
  • a movable member 45 having a weight 44 attached to one end thereof is rotatably attached to the shaft 43 to which the gear 42 is attached.
  • A pendulum gear 46 rotatably mounted on the other end of the movable member 45 meshes with a gear 47 formed coaxially and integrally with the gear 42.
  • Further, first and second connecting gears 48 and 49 are provided on the left and right sides of the pendulum gear 46 respectively, so that the pendulum gear 46 can mesh with one of them when the movable member 45 rotates about the shaft 43.
  • When the motor 22 is driven to rotate in the forward direction, its rotational force is transmitted from the worm gear attached to the output shaft of the motor 22 through the gear train up to the gear 42 and on to the gear 47; based on this rotational force, the gear 47 rotates the movable member 45 together with the pendulum gear 46 clockwise in FIG. 4, thereby bringing the pendulum gear 46 into mesh with the first connecting gear 48. Conversely, when the motor 22 is driven in reverse, the gear 47 rotates the movable member 45 counterclockwise in FIG. 4 based on the rotational force, thereby bringing the pendulum gear 46 into mesh with the second connecting gear 49.
  • The first connecting gear 48 meshes with a first cam gear 50, and on the lower surface side of the first cam gear 50, as shown in FIG. 6, a cam 50A of a predetermined shape is formed.
  • The cam 50A is fitted in an engagement hole 51A of an elevating member 51 (FIG. 7) which is disposed so as to be movable up and down as indicated by the arrow b; by rotating the first cam gear 50, the elevating member 51 can thus be raised and lowered by the cam 50A.
  • At the upper end portion of the elevating member 51, first and second shafts 53A, 53B are implanted, as shown in FIG. 4, so as to be positioned symmetrically with respect to a shaft 52 implanted in the first housing half 30.
  • As shown in FIG. 8, the ear drive member 54 is formed by integrally and flexibly connecting, through a cylindrical portion 61, the root portions of first and second tweezers-shaped spring portions 60A and 60B, each of which is an elastic member.
  • The ear drive member 54 is attached to the elevating member 51 so that holes 62AX and 62BX of first and second engaging portions 62A and 62B, disposed in the vicinity of the root portions of the first and second spring portions 60A and 60B respectively, fit over the corresponding first or second shaft 53A, 53B of the elevating member 51, and so that the cylindrical portion 61 fits over the shaft 52.
  • When the elevating member 51 is raised and lowered, its first and second shafts 53A and 53B move vertically with respect to the shaft 52, whereby the first and second spring portions 60A, 60B are driven to open and close integrally with the first and second engaging portions 62A, 62B of the ear drive member 54.
  • The lower end portions of the corresponding ears 4A, 4B are fitted into the first and second spring portions 60A, 60B of the ear drive member 54, and each of the ears 4A, 4B is pivotally supported near its lower end on the upper left or right side of the first housing half 30; the ears 4A, 4B can thereby be driven to open and close in interlock with the opening and closing operation of the first and second spring portions 60A and 60B of the ear drive member 54.
  • Thus, when the pendulum gear 46 (FIG. 5) meshes with the first connecting gear 48, the rotational output of the motor 22 is transmitted to the first cam gear 50, and based on this rotational output the cam 50A of the first cam gear 50 raises and lowers the elevating member 51 to open and close the first and second spring portions 60A and 60B of the ear drive member 54, thereby driving the ears 4A, 4B to open and close.
  • On the other hand, the second connecting gear 49 meshes with a second cam gear 63, and on the lower surface side of the second cam gear 63, as shown in FIG. 9(A), a cam 63A of a predetermined shape is formed. Below the second cam gear 63, two parallel arm portions 65A, 65B of a bifurcated member 65 fixed to a shaft body 64, which in turn is fixed to the body 2, extend, and the cam 63A of the second cam gear 63 is fitted between the two arm portions 65A and 65B.
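In control terms, this pendulum-gear arrangement multiplexes a single motor across two motions: forward rotation engages the first connecting gear 48 (the ear drive described above), while reverse rotation engages the second connecting gear 49, whose cam works the bifurcated member 65 to move the head (FIG. 9). Below is a minimal sketch of how a controller might exploit this; the `set_motor` helper and the durations are hypothetical stand-ins, not anything specified in the patent.

```python
import time

FORWARD, REVERSE, STOP = 1, -1, 0

def set_motor(direction: int) -> None:
    """Hypothetical stand-in for the motor drive unit 11 driving the motor 22."""
    print({1: "motor forward", -1: "motor reverse", 0: "motor stop"}[direction])

def flap_ears(duration_s: float = 1.0) -> None:
    # Forward rotation: pendulum gear 46 meshes with first connecting gear 48.
    set_motor(FORWARD)
    time.sleep(duration_s)
    set_motor(STOP)

def move_head(duration_s: float = 1.0) -> None:
    # Reverse rotation: pendulum gear 46 meshes with second connecting gear 49.
    set_motor(REVERSE)
    time.sleep(duration_s)
    set_motor(STOP)

flap_ears(0.5)
move_head(0.5)
```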
  • In the LED display unit 21, as shown in FIG. 3, seven LEDs 21A to 21G are arranged on a substrate 66 in a predetermined positional relationship. The positional relationship of the seven LEDs 21A to 21G and their respective emission colors will be described later.
  • the light leakage prevention member 36 is formed of, for example, a black resin material or rubber material which does not transmit light, and is attached in close contact with the substrate 66 of the LED display unit 21.
  • The pressing portion 24A and the contact portion 24B of the nose switch 24 are fixed to the lower end portion of the light leakage prevention member 36.
  • the light leakage prevention member 36 is provided with a total of seven holes 36A to 36G corresponding to the respective LEDs 21A to 21G of the LED display unit 21.
  • When the light leakage prevention member 36 is attached to the substrate 66 of the LED display unit 21, the corresponding LEDs 21A to 21G of the LED display unit 21 are exposed toward the filter cover 37 side through the respective holes 36A to 36G.
  • The thickness of the light leakage prevention member 36 is set equal to the gap from the substrate 66 of the LED display unit 21 to the filter cover 37, whereby the light emitted from each of the LEDs 21A to 21G of the LED display unit 21 is emitted toward the filter cover 37 without being mixed with the light emitted from the other LEDs 21A to 21G.
  • The filter cover 37 is formed using a translucent resin material and the second housing half 31 is formed using a transparent resin material, so that when the LEDs 21A to 21G of the LED display unit 21 are not lit they cannot be recognized (are invisible) from the outside, while when the LEDs 21A to 21G are lit their emitted light can be recognized from the outside.
  • By providing the filter cover 37 in this way, when the LEDs 21A to 21G are lit, the portions over the lit LEDs 21A to 21G do not merely brighten in pinpoints; the light is diffused so that the whole of each corresponding hole 36A to 36G provided in the light leakage prevention member 36 appears to glow with uniform brightness.
  • Furthermore, the filter cover 37 is formed in white so that the light emitted from each of the LEDs 21A to 21G of the LED display unit 21, visible from the outside through the filter cover 37, is not dulled by the filter cover 37 and appears to glow softly in white.
  • In addition to the above configuration, the robot toy 1 is provided with an emotion generation and expression function: the control unit 10 (FIG. 2) generates the emotion of the robot toy 1 at that time based on the content of the user's action, or the absence of such action, and expresses the type and degree of the generated emotion.
  • FIG. 11 shows the arrangement of the LEDs 21A to 21G of the LED display unit 21 used for expressing the emotion of the robot toy 1 as described above. One LED 21A is disposed at a predetermined position on the center line in the width direction of the head 3, and the remaining six LEDs 21B to 21G are disposed on a circle centered on the LED 21A so as to be equidistant from the LED 21A and at equal intervals from each other; that is, the six surrounding LEDs 21B to 21G are placed at the vertex positions of a regular hexagon centered on the central LED 21A.
  • For the central LED 21A, an LED capable of emitting three colors (green, red and blue) simultaneously or separately is used, while for the surrounding LEDs 21B to 21G, LEDs capable of emitting two colors (green and red) simultaneously or separately are used; the surrounding LEDs 21B to 21G can therefore also be made to emit orange light by lighting the two colors of green and red simultaneously.
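The positions of the surrounding LEDs follow directly from this regular-hexagon arrangement: vertices at 60 degree steps on a circle around the central LED. A small geometry sketch for illustration (the radius is arbitrary; the patent does not give dimensions):

```python
import math

def hexagon_layout(radius: float = 10.0) -> list[tuple[float, float]]:
    """(x, y) positions of LEDs 21B-21G at the vertices of a regular
    hexagon centered on LED 21A at the origin."""
    return [(radius * math.cos(math.radians(60 * k)),
             radius * math.sin(math.radians(60 * k)))
            for k in range(6)]

# Six positions, equidistant from the center and from each neighbor.
for name, (x, y) in zip("BCDEFG", hexagon_layout()):
    print(f"LED 21{name}: ({x:+.2f}, {y:+.2f})")
```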
  • In the case of this robot toy 1, the emotion of the robot toy 1 is defined by two parameters (hereinafter referred to as emotion parameters) representing the “excitation level” and the “love level”, respectively.
  • These emotion parameters are stored in the memory 10B and each takes a value in the range of “-8” to “8”.
  • The emotions are associated with the signs of these values: when the “excitation level” and “love level” emotion parameter values are both 0 or positive, the emotions of “joy” and “love” are associated; when the “excitation level” value is negative and the “love level” value is 0 or positive, the emotions of “normal” and “calm”; when the “excitation level” value is 0 or positive and the “love level” value is negative, the emotions of “anger” and “disgust”; and when both values are negative, the emotions of “dejection” and “loneliness”.
  • the degree of emotion is expressed by the magnitude of the value of each of the “excitation level” and “love level” emotion parameters.
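Abstractly, the two signed parameters span a plane whose four sign quadrants are the four emotion pairs, with the magnitudes giving the degree. A minimal sketch of that mapping; the function name and the max-of-absolute-values degree measure are illustrative choices, not specified by the patent:

```python
def classify_emotion(excitation: int, love: int) -> tuple[str, int]:
    """Map the two emotion parameters (each in -8..8) to an emotion pair
    and a degree, following the sign convention described above."""
    if excitation >= 0 and love >= 0:
        emotion = "joy/love"
    elif excitation < 0 and love >= 0:
        emotion = "normal/calm"
    elif excitation >= 0 and love < 0:
        emotion = "anger/disgust"
    else:
        emotion = "dejection/loneliness"
    degree = max(abs(excitation), abs(love))  # one plausible degree measure
    return emotion, degree

print(classify_emotion(3, 5))    # ('joy/love', 5)
print(classify_emotion(-2, -7))  # ('dejection/loneliness', 7)
```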
  • Based on the operation detection signals S3A to S3C supplied from the tail switch 7, the head switch 23 and the nose switch 24 respectively, the control unit 10 recognizes the action given by the user and changes the emotion of the robot toy 1 by increasing or decreasing the values of the “excitation level” and “love level” emotion parameters within the range of “-8” to “8”.
  • By pressing the nose switch 24 of the robot toy 1, the user can shift the emotion of the robot toy 1 toward “joy” and “love”, or increase the degree of those emotions; by pressing the head switch 23, the user can shift the emotion of the robot toy 1 toward “normal” or “calm”, or increase the degree of those emotions.
  • Further, when the tail switch 7 is swung, the control unit 10 increases the value of the “excitation level” emotion parameter by 1 while decreasing the value of the “love level” emotion parameter by 1, each within the range of “-8” to “8”. Therefore, by swinging the tail switch 7 of the robot toy 1, the user can shift the emotion of the robot toy 1 toward “anger” and “disgust”, or increase the degree of those emotions.
  • Furthermore, when the control unit 10 receives no user action such as “tilting” or “pressing” on the tail switch 7, the head switch 23 or the nose switch 24 for a certain period of time (for example, 30 seconds), it decreases the values of the “excitation level” and “love level” emotion parameters by 1 each. At this time, therefore, the robot toy 1 shifts its own emotion toward “dejection” and “loneliness”, or increases the degree of those emotions.
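Each stimulus therefore nudges the two parameters in a fixed direction within the clamped -8..8 range. The sketch below summarizes these update rules; note that the machine-translated text is not fully consistent about which of the head and nose switches triggers which update, so the assignment here follows the flowchart descriptions given later (RT2 to RT4) and should be treated as illustrative:

```python
MIN_V, MAX_V = -8, 8

def clamp(v: int) -> int:
    return max(MIN_V, min(MAX_V, v))

# (delta_excitation, delta_love) per stimulus
DELTAS = {
    "head":    (+1, +1),  # RT2: toward "joy"/"love"
    "nose":    (-1, +1),  # RT3: toward "normal"/"calm"
    "tail":    (+1, -1),  # RT4: toward "anger"/"disgust"
    "timeout": (-1, -1),  # ~30 s without action: toward "dejection"/"loneliness"
}

def update(excitation: int, love: int, stimulus: str) -> tuple[int, int]:
    de, dl = DELTAS[stimulus]
    return clamp(excitation + de), clamp(love + dl)

state = (0, 0)
for s in ["head", "head", "tail", "timeout"]:
    state = update(*state, s)
    print(s, "->", state)
```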
  • The control unit 10 then determines the emotion of the robot toy 1 at that time and its degree from the values of the “excitation level” and “love level” emotion parameters, and blinks the LEDs 21A to 21G of the LED display unit 21 in the light emission pattern associated with that emotion and its degree.
  • For each emotion and its degree, it is determined in advance with which light emission pattern the LEDs 21A to 21G of the LED display unit 21 are to blink, and a program (hereinafter referred to as the LED drive program) defining at what timing each of the LEDs 21A to 21G is to be blinked for every such light emission pattern is stored in advance in the memory 10B; the control unit 10 blinks the LEDs 21A to 21G of the LED display unit 21 according to this LED drive program, with the light emission pattern corresponding to the emotion and its degree.
  • As shown in FIG. 14, the light emission patterns of the LEDs 21A to 21G in the LED display unit 21 are based on a pattern in which the peripheral LEDs 21B to 21G around the central LED 21A are turned on and off one after another, each LED handing over to the adjacent LED in sequence so that the light appears to rotate; this is hereinafter called the basic light emission pattern.
  • This basic light emission pattern is used as the light emission pattern for expressing the emotions other than “dejection” and “loneliness”: for “joy” and “love” (the “excitation level” and “love level” emotion parameter values both 0 or positive), the surrounding LEDs 21B to 21G are driven to emit orange light; for “normal” and “calm” (the “excitation level” value negative and the “love level” value 0 or positive), the surrounding LEDs 21B to 21G are driven to emit green light; and for “anger” and “disgust” (the “excitation level” value 0 or positive and the “love level” value negative), the surrounding LEDs 21B to 21G are driven to emit red light. For the emotions of “dejection” and “loneliness” (both values negative), the surrounding LEDs 21B to 21G do not emit light and only the central LED 21A is driven to blink in blue.
  • Further, for the light emission patterns of the LEDs 21A to 21G in the LED display unit 21, it is defined that in the case of the emotions of “joy” and “love”, “normal” and “calm”, and “anger” and “disgust”, the greater the degree of the emotion, the faster the blinking cycle of the surrounding LEDs 21B to 21G, so that the light rotates faster; and that in the case of the emotions of “dejection” and “loneliness”, the greater the degree of the emotion, the faster the blinking cycle of the central LED 21A.
  • Thus, in the robot toy 1, the emotion of the robot toy 1 at that time can be visually recognized from the outside based on the emission colors of the LEDs 21A to 21G in the LED display unit 21, and the degree of that emotion can be visually recognized based on the blinking speed of the LEDs 21A to 21G.
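Pattern selection can thus be summarized as: the sign quadrant picks the color and which LEDs participate, and the degree picks the blink period. A compact sketch under those assumptions; the red color for “anger”/“disgust” is inferred from the available ring colors (green, red, orange) rather than stated unambiguously in the translation, and the 1 s / 0.5 s periods follow the expression procedures described below:

```python
def select_pattern(excitation: int, love: int) -> dict:
    """Choose LEDs, color and period from the two emotion parameters."""
    degree = max(abs(excitation), abs(love))
    period = 0.5 if degree > 4 else 1.0   # larger degree -> faster blinking
    if excitation < 0 and love < 0:       # dejection/loneliness
        return {"leds": "center", "color": "blue", "period": period}
    if excitation >= 0 and love >= 0:
        color = "orange"                  # joy/love
    elif excitation < 0:
        color = "green"                   # normal/calm
    else:
        color = "red"                     # anger/disgust (inferred color)
    return {"leds": "ring", "color": color, "period": period}

print(select_pattern(6, 7))    # ring, orange, 0.5 s rotation
print(select_pattern(-3, -2))  # center, blue, 1.0 s blink
```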
  • The control unit 10 executes the emotion generation processing of the robot toy 1 described above and the expression processing of the generated emotion (the control processing of the LED display unit 21) according to the emotion generation and expression processing procedure RT1 shown in FIG. 15, based on the above-mentioned LED drive program. Specifically, the control unit 10 starts this emotion generation and expression processing procedure RT1 at step SP0 and, in the subsequent step SP1, sets the values of the “love level” and “excitation level” emotion parameters of the robot toy 1 to the initial value “0”.
  • The control unit 10 thereafter proceeds to step SP3 and determines whether the head switch 23 (FIG. 2) or the nose switch 24 (FIG. 2) has been pressed, or the tail switch 7 (FIG. 2) has been swung.
  • If the control unit 10 obtains a negative result in step SP3, it proceeds to step SP4 and reads the count value of an internal timer (not shown); in the following step SP5, based on the count value read in step SP4, it determines whether a predetermined time (for example, 30 seconds) has elapsed since the values of the “love level” and “excitation level” emotion parameters were set to their initial values, or since the head switch 23, the nose switch 24 or the tail switch 7 was last pressed or swung.
  • If the control unit 10 obtains a negative result in step SP5, it returns to step SP3 and thereafter repeats the loop of steps SP3-SP4-SP5-SP3 until a positive result is obtained in step SP3 or step SP5.
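Structurally, RT1 is a poll-with-timeout loop: sample the switches, and if none has been touched for the predetermined time, take the decay branch. A sketch of that outer loop; `read_switch` is a hypothetical stand-in for sampling the three switches, and resetting the timer after a timeout is an assumption about the flowchart's behavior:

```python
import time

IDLE_TIMEOUT_S = 30.0  # the "predetermined time" in the text

def read_switch():
    """Hypothetical stand-in returning 'head', 'nose', 'tail' or None."""
    return None

def emotion_loop(handle_switch, handle_timeout):
    last_event = time.monotonic()      # SP1: parameters initialized, timer started
    while True:                        # SP3-SP5 polling loop
        s = read_switch()              # SP3: switch pressed or swung?
        if s is not None:
            handle_switch(s)           # SP6/SP7/SP8: RT2, RT3 or RT4
            last_event = time.monotonic()
        elif time.monotonic() - last_event >= IDLE_TIMEOUT_S:  # SP4/SP5
            handle_timeout()           # SP9-SP12: decay toward dejection
            last_event = time.monotonic()
        time.sleep(0.01)
```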
  • When the control unit 10 eventually obtains an affirmative result in step SP3 because the user has pressed the head switch 23 of the robot toy 1, it proceeds to step SP6 and changes the values of the “love level” and “excitation level” emotion parameters according to the first emotion parameter change processing procedure RT2 shown in FIG. 16.
  • In step SP6, the control unit 10 starts the first emotion parameter change processing procedure RT2 at step SP20 and, in the following step SP21, determines whether the value of the “love level” or “excitation level” emotion parameter is the maximum value (“8” in this embodiment). If the control unit 10 obtains a negative result in step SP21, it proceeds to step SP22 and increments the values of the “love level” and “excitation level” emotion parameters by 1 each; it then proceeds to step SP25 to end the first emotion parameter change processing procedure RT2 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15). If the control unit 10 determines in step SP21 that the value of the “love level” emotion parameter is the maximum value, it proceeds to step SP23 and increases only the value of the “excitation level” emotion parameter by 1; it then proceeds to step SP25 to end the first emotion parameter change processing procedure RT2 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1. If, on the other hand, the control unit 10 determines in step SP21 that the value of the “excitation level” emotion parameter is the maximum value, it proceeds to step SP24 and increases only the value of the “love level” emotion parameter by 1; it then proceeds to step SP25 to end the first emotion parameter change processing procedure RT2 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
  • When the control unit 10 obtains an affirmative result in step SP3 of the emotion generation and expression processing procedure RT1 because the user has pressed the nose switch 24 of the robot toy 1, it proceeds to step SP7 and changes the values of the “love level” and “excitation level” emotion parameters according to the second emotion parameter change processing procedure RT3 shown in FIG. 17.
  • In step SP7, the control unit 10 starts the second emotion parameter change processing procedure RT3 at step SP30 and, in the subsequent step SP31, determines whether the value of the “love level” emotion parameter is the maximum value or the value of the “excitation level” emotion parameter is the minimum value (“-8” in this embodiment). If the control unit 10 obtains a negative result in step SP31, it proceeds to step SP32, increases the value of the “love level” emotion parameter by 1 and decreases the value of the “excitation level” emotion parameter by 1; it then proceeds to step SP35 to end the second emotion parameter change processing procedure RT3 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15). If the control unit 10 determines in step SP31 that the value of the “love level” emotion parameter is the maximum value, it proceeds to step SP33 and decreases only the value of the “excitation level” emotion parameter by 1; it then proceeds to step SP35 to end the second emotion parameter change processing procedure RT3 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1. If the control unit 10 determines in step SP31 that the value of the “excitation level” emotion parameter is the minimum value, it proceeds to step SP34 and increases only the value of the “love level” emotion parameter by 1; it then proceeds to step SP35 to end the second emotion parameter change processing procedure RT3 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
  • When the control unit 10 obtains an affirmative result in step SP3 of the emotion generation and expression processing procedure RT1 because the user has swung the tail switch 7 of the robot toy 1, it proceeds to step SP8 and changes the values of the “love level” and “excitation level” emotion parameters according to the third emotion parameter change processing procedure RT4 shown in FIG. 18.
  • In step SP8, the control unit 10 starts the third emotion parameter change processing procedure RT4 at step SP40 and, in the subsequent step SP41, determines whether the value of the “love level” emotion parameter is the minimum value (“-8” in this embodiment) or the value of the “excitation level” emotion parameter is the maximum value. If the control unit 10 obtains a negative result in step SP41, it proceeds to step SP42, decreases the value of the “love level” emotion parameter by 1 and increases the value of the “excitation level” emotion parameter by 1; it then proceeds to step SP45 to end the third emotion parameter change processing procedure RT4 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1 (FIG. 15). If the control unit 10 determines in step SP41 that the value of the “love level” emotion parameter is the minimum value, it proceeds to step SP43 and increases only the value of the “excitation level” emotion parameter by 1; it then proceeds to step SP45 to end the third emotion parameter change processing procedure RT4 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1. If the control unit 10 determines in step SP41 that the value of the “excitation level” emotion parameter is the maximum value, it proceeds to step SP44 and decreases only the value of the “love level” emotion parameter by 1; it then proceeds to step SP45 to end the third emotion parameter change processing procedure RT4 and thereafter proceeds to step SP13 of the emotion generation and expression processing procedure RT1.
  • On the other hand, if the control unit 10 obtains a positive result in step SP5 of the emotion generation and expression processing procedure RT1 because the predetermined time has elapsed since the values of the “love level” and “excitation level” emotion parameters were set to their initial values, or since the head switch 23, the nose switch 24 or the tail switch 7 was last pressed or swung, it proceeds to step SP9 and determines whether the value of the “love level” or “excitation level” emotion parameter is the minimum value.
  • If the control unit 10 obtains a negative result in step SP9, it proceeds to step SP10, decreases the values of the “love level” and “excitation level” emotion parameters by 1 each, and then proceeds to step SP13.
  • If the control unit 10 determines in step SP9 that the value of the “love level” emotion parameter is the minimum value, it proceeds to step SP11, decreases only the value of the “excitation level” emotion parameter by 1, and then proceeds to step SP13. If the control unit 10 determines in step SP9 that the value of the “excitation level” emotion parameter is the minimum value, it proceeds to step SP12, decreases only the value of the “love level” emotion parameter by 1, and then proceeds to step SP13.
  • Based on the values of the “excitation level” and “love level” emotion parameters updated in steps SP3 to SP12 according to the presence or absence of the user's action as described above, the control unit 10 expresses the emotion of the robot toy 1 at that time and its degree with the light emission patterns of the LEDs 21A to 21G in the LED display unit 21 in steps SP13 to SP19.
  • Specifically, in step SP13 the control unit 10 reads the changed values of the “excitation level” and “love level” emotion parameters from the memory 10B (FIG. 2) and judges these values. When the “excitation level” and “love level” emotion parameters are both 0 or positive, the control unit 10 proceeds to step SP14 and, according to the first emotion expression processing procedure RT5 shown in FIG. 19, expresses the emotion (“joy” and “love”) of the robot toy 1 at that time and its degree with the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP14, the control unit 10 starts the first emotion expression processing procedure RT5 at step SP50 and, in the subsequent step SP51, determines the values of the “excitation level” and “love level” emotion parameters. When the values of the “excitation level” and “love level” emotion parameters are both within the range of 0 to 4, the control unit 10 proceeds to step SP52 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in orange so that the orange light makes one rotation per second; otherwise it proceeds to step SP53 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in orange so that the orange light makes one rotation in 0.5 seconds. When the control unit 10 ends the processing of step SP52 or step SP53, it proceeds to step SP54 to end this first emotion expression processing procedure RT5 and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • If in step SP13 of the emotion generation and expression processing procedure RT1 the control unit 10 determines that the value of the “excitation level” emotion parameter is 0 or positive and the value of the “love level” emotion parameter is negative, it proceeds to step SP15 and, according to the second emotion expression processing procedure RT6 shown in FIG. 20, expresses the emotion (“anger” and “disgust”) of the robot toy 1 at that time and its degree with the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP15, the control unit 10 starts this second emotion expression processing procedure RT6 at step SP60 and, in the subsequent step SP61, determines the values of the “excitation level” and “love level” emotion parameters. When the degree of the emotion is small (for example, the value of the “excitation level” emotion parameter is within the range of 0 to 4), the control unit 10 proceeds to step SP62 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in red so that the red light makes one rotation per second; otherwise it proceeds to step SP63 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in red so that the red light makes one rotation in 0.5 seconds. When the control unit 10 ends the processing of step SP62 or step SP63, it proceeds to step SP64 to end the second emotion expression processing procedure RT6 and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • If the control unit 10 determines in step SP13 that the value of the “excitation level” emotion parameter is negative and the value of the “love level” emotion parameter is 0 or positive, it proceeds to step SP16 and, according to the third emotion expression processing procedure RT7 shown in FIG. 21, expresses the emotion (“normal” and “calm”) of the robot toy 1 at that time and its degree with the light emission pattern of the LEDs 21A to 21G in the LED display unit 21.
  • In step SP16, the control unit 10 starts the third emotion expression processing procedure RT7 at step SP70 and, in the subsequent step SP71, determines the values of the “excitation level” and “love level” emotion parameters. When the degree of the emotion is small (the value of the “love level” emotion parameter within the range of 0 to 4 and the value of the “excitation level” emotion parameter within the range of -1 to -4), the control unit 10 proceeds to step SP72 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in green so that the green light makes one rotation per second; otherwise it proceeds to step SP73 and blinks the surrounding LEDs 21B to 21G in the LED display unit 21 sequentially in green so that the green light makes one rotation in 0.5 seconds. When the control unit 10 ends the processing of step SP72 or step SP73, it proceeds to step SP74 to end the third emotion expression processing procedure RT7 and thereafter returns to step SP3 of the emotion generation and expression processing procedure RT1.
  • Finally, if the control unit 10 determines in step SP13 that the values of the “excitation level” and “love level” emotion parameters are both negative, it proceeds to step SP17 and determines the values of the “excitation level” and “love level” emotion parameters. When the degree of the emotion is small, it proceeds to step SP18 and blinks only the central LED 21A in the LED display unit 21 in blue once per second; when the values of the “excitation level” and “love level” emotion parameters are other than this, it proceeds to step SP19 and blinks only the central LED 21A in the LED display unit 21 in blue once every 0.5 seconds.
  • In this way, the control unit 10 blinks the LEDs 21A to 21G of the LED display unit 21 with the light emission pattern corresponding to the emotion of the robot toy 1 at that time and its degree.
  • As described above, the robot toy 1 expresses the emotions of “joy” and “love”, the emotions of “normal” and “calm”, and the emotions of “anger” and “disgust” by blinking the surrounding LEDs 21B to 21G among the LEDs 21A to 21G in the LED display unit 21 one by one in the clockwise direction, and expresses the emotions of “dejection” and “loneliness” by blinking only the central LED 21A in blue.
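The rotating patterns themselves reduce to a round-robin over the six ring LEDs, each lit for one sixth of the rotation period. A sketch of such a driver; `set_led` is a hypothetical stand-in for the drive signal S6 to the LED display unit 21:

```python
import time

RING = ["21B", "21C", "21D", "21E", "21F", "21G"]  # clockwise order

def set_led(led, color):
    """Hypothetical stand-in for driving one LED (color=None turns it off)."""
    print(f"{led}: {color or 'off'}")

def rotate_ring(color: str, rotation_period_s: float, rotations: int = 1) -> None:
    """Basic light emission pattern: light each ring LED in turn so the
    light appears to make one rotation per rotation_period_s."""
    step = rotation_period_s / len(RING)
    for _ in range(rotations):
        for led in RING:
            set_led(led, color)
            time.sleep(step)
            set_led(led, None)

# e.g. "joy"/"love" at a small degree: orange, one rotation per second
rotate_ring("orange", rotation_period_s=1.0)
```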
  • Therefore, in the robot toy 1, the emotion can be expressed with more varied light emission patterns than when LEDs of fixed shape are simply blinked, and since the LED display unit 21 can be constructed with fewer LEDs than when a plurality of LEDs are arranged in a matrix, emotion can be expressed effectively while the robot toy 1 is constructed inexpensively.
  • In the embodiment described above, an LED capable of emitting three colors (green, red and blue) simultaneously or separately is used as the central LED 21A (first light source) of the LED display unit 21, and LEDs capable of emitting two colors (green and red) simultaneously or separately are used as the surrounding LEDs 21B to 21G (second light sources); however, the present invention is not limited to this, and the types and numbers of emission colors of the central LED 21A and of the surrounding LEDs 21B to 21G may be other than these.
  • Further, in the embodiment described above, the remaining six LEDs 21B to 21G are arranged so as to be equidistant from the central LED 21A and equally spaced from each other; however, other arrangements of the LEDs 21A to 21G can also be applied.
  • For example, as shown in FIGS. 22(A-1) and (A-2), three LEDs 70B to 70D may be arranged around the central LED 70; as shown in FIGS. 22(C-1) and (C-2), five LEDs 72B to 72F may be arranged around the central LED 72A; eight LEDs 73B to 73I may be arranged around the central LED 73A; or, as in FIGS. 22(B-1) and (B-2), some other number of LEDs may be arranged around the central LED, including still more LEDs. In these cases as well, the same effects as in the embodiment described above can be obtained.
  • Further, in the embodiment described above, the emotion parameters are of the two types “excitation level” and “love level”; however, the number and types of emotion parameters may be other than these.
  • For example, the same number of emotion types as there are emission colors of the LEDs 21A to 21G may be prepared, and the LEDs 21A to 21G may be blinked with light emission patterns in which each emotion corresponds to one of the emission colors of the LEDs 21A to 21G.
  • Further, in the embodiment described above, the light emission patterns used are those that blink only the central LED 21A of the LED display unit 21 and those that blink only the peripheral LEDs 21B to 21G one by one clockwise around the central LED 21A; however, other light emission patterns may be used. For example, only the peripheral LEDs 21B to 21G may be blinked one by one counterclockwise around the central LED 21A, or the peripheral LEDs 21B to 21G may be controlled so that several of them blink at a time, turning on and off sequentially from one adjacent LED to the next clockwise or counterclockwise around the central LED 21A.
  • Further, in the embodiment described above, the value of each of the “excitation level” and “love level” emotion parameters is changed within the range of “-8” to “8” to facilitate understanding; however, various ranges other than this can also be applied.
  • Furthermore, the contents of the emotion generation and expression processing procedure RT1 can be changed as appropriate. For example, in an actual computer process each emotion parameter may be handled not as a number in the range of -8 to 8 but as a number in the range of 0 to 15 (0 to F in hexadecimal notation); in that case, instead of judging in step SP13 whether the emotion parameters are positive or negative and branching to step SP14 or steps SP15 to SP17, the control unit may judge which range (for example, 0 to 3) the value of each emotion parameter falls in, and cause the LEDs 21A to 21G to emit light with the color and light emission pattern associated with that range. In short, various methods can be applied as long as the LEDs 21A to 21G are made to emit light with a light emission pattern corresponding to the combination of the emotion parameter values.

Industrial Applicability
  • the present invention can be widely applied to various robot toys.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Robotics (AREA)
  • Toys (AREA)
  • Manipulator (AREA)
PCT/JP2006/300617 2005-01-18 2006-01-18 Robot toy Ceased WO2006077868A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/775,133 US20070270074A1 (en) 2005-01-18 2007-07-09 Robot Toy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-010771 2005-01-18
JP2005010771A JP2005-010771 Robot toy

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/775,133 Continuation US20070270074A1 (en) 2005-01-18 2007-07-09 Robot Toy

Publications (1)

Publication Number Publication Date
WO2006077868A1 (ja) 2006-07-27

Family

ID=36692257

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/300617 Ceased WO2006077868A1 (ja) Robot toy

Country Status (3)

Country Link
US (1) US20070270074A1 (en)
JP (1) JP2006198017A (ja)
WO (1) WO2006077868A1 (ja)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515092B2 (en) 2009-12-18 2013-08-20 Mattel, Inc. Interactive toy for audio output
CN105676740A (zh) * 2016-03-01 2016-06-15 深圳前海勇艺达机器人有限公司 Method and device for providing a robot with illumination

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWM285388U (en) * 2005-10-05 2006-01-11 Wen-Bin Shiu Pet toy combining with MP3 player
US8292689B2 (en) 2006-10-02 2012-10-23 Mattel, Inc. Electronic playset
US8062089B2 (en) 2006-10-02 2011-11-22 Mattel, Inc. Electronic playset
CN101406756A (zh) * 2007-10-12 2009-04-15 鹏智科技(深圳)有限公司 Electronic toy capable of expressing emotion, emotion expression method therefor, and light-emitting unit control device
CN101411946B (zh) * 2007-10-19 2012-03-28 鸿富锦精密工业(深圳)有限公司 Toy dinosaur
US20100305448A1 (en) * 2009-05-26 2010-12-02 Anne Cecile Dagonneau Apparatus and method for indicating ultrasound probe orientation and activation status
US8438233B2 (en) 2011-03-23 2013-05-07 Color Labs, Inc. Storage and distribution of content for a user device group
US20120320077A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Communicating status and expression
US9656392B2 (en) * 2011-09-20 2017-05-23 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US8621019B2 (en) 2011-09-21 2013-12-31 Color Labs, Inc. Live content sharing within a social networking environment
CN104010614B (zh) * 2011-12-21 2016-06-29 皇家飞利浦有限公司 Peel-and-stick CPR assistance device
CN102980103A (zh) * 2012-11-27 2013-03-20 华南理工大学 LED illumination light source for machine vision
CA2956790A1 (en) * 2014-07-28 2016-02-04 Sergei GONCHAR Sparkly childrens products
US10068424B2 (en) 2016-05-13 2018-09-04 Universal Entertainment Corporation Attendant device and gaming machine
JP6572943B2 2017-06-23 2019-09-11 カシオ計算機株式会社 Robot, robot control method, and program
JP6936081B2 (ja) * 2017-08-30 2021-09-15 パナソニック株式会社 Robot
JP1622873S (ja) * 2017-12-29 2019-01-28 Robot
WO2020009098A1 (ja) * 2018-07-02 2020-01-09 Groove X株式会社 Robot
KR102012968B1 (ko) * 2018-08-07 2019-08-27 주식회사 서큘러스 Method for controlling an interaction robot, and control server
KR102415997B1 (ko) * 2020-11-05 2022-07-05 (주)로보티즈 Companion robot that stimulates the user's caring feelings
JP2023092204A (ja) * 2021-12-21 2023-07-03 カシオ計算機株式会社 Robot
JP7677299B2 (ja) * 2022-09-28 2025-05-15 カシオ計算機株式会社 Device control apparatus, device control method, and program
WO2024107680A1 (en) * 2022-11-14 2024-05-23 4Tekies Llc Interactive comfort pet
USD1038267S1 (en) * 2023-09-19 2024-08-06 Shenzhen Leqisi electronic Technology Co., LTD Toy


Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337552B1 (en) * 1999-01-20 2002-01-08 Sony Corporation Robot apparatus
US4556932A (en) * 1983-03-28 1985-12-03 Lehrer Bradley D Lighted novelty item
JPS60128699U (ja) * 1984-02-07 1985-08-29 株式会社トミー Radio-controlled toy
US5141464A (en) * 1991-01-23 1992-08-25 Mattel, Inc. Touch responsive animated toy figure
US5402702A (en) * 1992-07-14 1995-04-04 Jalco Co., Ltd. Trigger circuit unit for operating light emitting members such as leds or motors for use in personal ornament or toy in synchronization with music
US5461188A (en) * 1994-03-07 1995-10-24 Drago; Marcello S. Synthesized music, sound and light system
US5668333A (en) * 1996-06-05 1997-09-16 Hasbro, Inc. Musical rainbow toy
KR20010071833A (ko) * 1999-05-10 2001-07-31 이데이 노부유끼 Robot apparatus, control method therefor, and recording medium
JP2001191284A (ja) * 1999-10-25 2001-07-17 Sony Corp Robot apparatus and learning method for robot apparatus
USD457203S1 (en) * 1999-11-02 2002-05-14 Sega Toys, Ltd. Robotic dog
USD448433S1 (en) * 1999-11-02 2001-09-25 Sega Toys Ltd. Robotic dog
JP2001191281A (ja) * 1999-12-29 2001-07-17 Sony Corp Editing apparatus, editing method, and recording medium
US6672934B2 (en) * 2000-02-04 2004-01-06 Trendmasters, Inc. Amusement device
US6462498B1 (en) * 2000-05-09 2002-10-08 Andrew J. Filo Self-stabilizing walking apparatus that is capable of being reprogrammed or puppeteered
JP2002018146A (ja) * 2000-07-04 2002-01-22 Tomy Co Ltd Interactive toy, reactive behavior pattern generation device, and reactive behavior pattern generation method
JP2002127059A (ja) * 2000-10-20 2002-05-08 Sony Corp Behavior control device and method, pet robot and control method, robot control system, and recording medium
US6507773B2 (en) * 2001-06-14 2003-01-14 Sharper Image Corporation Multi-functional robot with remote and video system
JP2003060745A (ja) * 2001-08-22 2003-02-28 Sony Corp Information transmission device, information transmission method, and monitor device
JP2003071765A (ja) * 2001-09-04 2003-03-12 Sony Corp Robot apparatus and input method for robot apparatus
JP2004237392A (ja) * 2003-02-05 2004-08-26 Sony Corp Robot apparatus and expression method for robot apparatus
US7374482B2 (en) * 2003-08-12 2008-05-20 Ghaly Nabil N Interactive slot machine

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3277500B2 (ja) * 1999-05-10 2002-04-22 ソニー株式会社 Robot apparatus
JP2002154081A (ja) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, expression method therefor, and method of detecting a step and a lifted state

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"<Hodo Kankei Shiryo> idog Shinshohin & Zensekai Hatsubai Kettei!, Kabushiki Kaisha Sega Toys", 19 January 2005 (2005-01-19), XP003000534, Retrieved from the Internet <URL:http://www.segatoys.co.jp/company_information/press_release/pdf/20050119.pdf> *
KANO M. ET AL.: "Kansei Kaiwagata Robot 'ifbot' no Hyojo Seigyo no Kanjo Kukan eno Mapping" [Mapping the facial-expression control of the sensibility conversation robot "ifbot" onto an emotion space], DAI 66 KAI (HEISEI 16 NEN) ZENKOKU TAIKAI KOEN RONBUNSHU (4), INTERFACE COMPUTER TO NINGEN SHAKAI, INFORMATION PROCESSING SOCIETY OF JAPAN, 9 March 2004 (2004-03-09), pages 4-77 - 4-78, XP003000533 *


Also Published As

Publication number Publication date
JP2006198017A (ja) 2006-08-03
US20070270074A1 (en) 2007-11-22

Similar Documents

Publication Publication Date Title
WO2006077868A1 (ja) Robot toy
US10732745B2 (en) Light emitting diode switch device and array
JP2006198017A5 (ja)
EP2359919A1 (en) A toy
US20070190894A1 (en) Holiday displays having active figurines
CN120303042A (zh) Interactive motorized rotating toy
JP2001327765A (ja) Electronic toy, control method therefor, and storage medium
WO2016184430A1 (zh) Luminous smart jewelry that automatically senses changes in the external environment and changes shape and color
CN111514590A (zh) A toy
KR200385518Y1 (ko) Light-emitting ornament
JP2020113466A (ja) Lighting device
CN201257300Y (zh) Musical hanging ornament
CN206597314U (zh) Improved structure of a touch-control sound-and-light toy
CN119925905A (zh) Intelligent educational game device
CN210786226U (zh) A toy
CN204218264U (zh) Smart toothbrush for children
CN2845876Y (zh) Drive device for a toy
TW200922671A (en) A toy with emotion expression function and method therefor, and a means of mixing light
JPH02136160A (ja) Actuating device
HK1232008B (zh) Light emitting diode switch device and array
TWM268514U (en) Apparatus of splendor lamp
HK1232008A1 (en) Light emitting diode switch device and array
HK1195660A (en) Light emitting diode switch device and array
HK1195660B (en) Light emitting diode switch device and array
TW201033993A (en) Display apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 11775133

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWP Wipo information: published in national office

Ref document number: 11775133

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 06711886

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 6711886

Country of ref document: EP