US20020061708A1 - Interactive talking dolls - Google Patents
Interactive talking dolls
- Publication number
- US20020061708A1 (application Ser. No. 09/880,425)
- Authority
- US
- United States
- Prior art keywords
- signal
- toy
- action
- interactive
- performing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the present invention relates to interactive toys, one toy, once activated by a user, activating another toy. More particularly, the present invention relates to a pair of toys which perform responsive actions or functions in continuous sequence.
- a set of talking dolls are provided. The user activates one of the dolls to say a sentence. At the end of the sentence, the user-activated doll activates another doll to respond to the first sentence. Each doll may respond to the sentence of another doll until a conversation is complete.
- Toys that are activated by a user to perform a desired function are known in the art.
- a variety of dolls exist that perform a desired action, such as speaking or moving, when activated by a user; however, the doll typically only performs a single action (e.g., the doll says a single word or phrase, or moves in a desired manner) without saying anything more until the activation switch is pressed again.
- although several activation switches may be provided, each switch causing the doll to perform a desired action (e.g., say a specific word or phrase or move in a desired manner) associated with that switch, once the action is completed the doll is idle. Only when the desired activation switch is pressed does the doll perform again.
- Such dolls need not be activated by a mechanically activated switch.
- Light-sensitive switches may be used instead of, or in addition to, a mechanical switch, such as shown in U.S. Pat. No. 5,281,180 to Lam et al.
- the desired action need not be the enunciation of a speech pattern.
- Other toys are known that perform another action, such as moving or flashing lights, upon activation by the user.
- the above-described toys merely perform the single desired action or function in response to activation by a user. These toys do not then activate another device without further intervention from a user.
- the device used to activate another device has comprised a signal generator alone, such as a remote control unit, that does not perform an action (such as enunciation of a speech pattern) other than transmitting a signal.
- the only “toy” that is activated to perform a desired function is the toy controlled by the remote control device, the remote control device not performing an independent action.
- the toy which performs the desired action is not activated by another device that has performed a desired action.
- a set of interactive toys which each perform a desired action in addition to transmitting a signal to another toy has not yet been provided with the capability of being programmed by an external, wireless control device such as a common household remote control unit which merely signals one of the toys to perform a desired action, that action then triggering a cascade of mutual activation and response.
- Each toy performs an action, the action of at least one of the toys being accompanied by a signal that is sent to the other toy to cause the other toy to perform a responsive action.
- the other toy's action is also accompanied by a signal that is sent to the first toy (or, yet another toy) to cause that toy to perform yet another (the same or different) responsive action.
- the set of toys performs one of a variety of different interactive responsive action sequences.
- the user may either select the action sequence to be performed, or the action may be selected randomly or in a given sequence by the control system of the toy, for example, upon activation of one of the toys.
- Each toy may respond with a single set response. However, most preferably, each toy may respond in one of several manners, randomly, sequentially, or user-selected, to the action of the other toy.
- the user-activated toy typically sends a signal to the other (receiving) toy that is coded.
- the code is received by the receiving toy to cause the receiving toy to perform an appropriate action in response to the action previously performed by the first signal-emitting toy in the sequence.
- This interaction may continue until the logical conclusion of the interaction or indefinitely.
- the toys are dolls and the interaction is in the form of a conversation comprising responsive speech patterns enunciated by the dolls.
- the toys may comprise animals, or a doll interacting with another object, such as a car.
- the toys can be controlled by a household remote control device.
- the toys may be initially activated wirelessly such that a hard-wired switch on the toy is not necessary.
- each toy preferably is also programmable to respond to signals of the remote control device in a desired manner. Specifically, if several interactive action sequences may be performed, then each interactive action sequence and/or each individual response may be associated with a button on the remote control device. Additionally, another button on the remote control device is preferably dedicated to remote random selection of an interactive sequence/response.
- FIG. 1 is a perspective view of a set of exemplary toys that may be used to perform a sequence of interactive actions in accordance with the principles of the present invention
- FIG. 2 is a high level block diagram of the interactive mechanism of a set of toys in accordance with the principles of the present invention
- FIG. 3 is a detailed circuit diagram of the circuitry of FIG. 2 for implementing an interactive sequence according to the present invention
- FIG. 4 is a table showing jumper connections for setting the options setting of the interactive mechanism of the present invention.
- FIGS. 5A-5F are a flow chart showing the sequence of actions performed by toys in the play mode in accordance with the principles of the present invention.
- FIG. 6 is a flow chart showing the sequence of actions performed by toys in the learn mode in accordance with the principles of the present invention.
- a set of toys are provided for interacting with one another independently of user input other than an initial activation of one member of the set to commence interaction.
- a first toy is actuated to perform a first desired action. Actuation may either be caused by actuation of a hard-wired activation switch or by transmission of a wireless signal, such as a signal from a remote control unit.
- the first toy activates a second toy to perform a second desired action, typically in response to the first desired action.
- the action sequence is complete, and the toys remain inactive.
- the second toy may perform a third desired action, such as a reaction-inducing action, after completing the second desired action.
- Upon completion of the third (reaction-inducing) action, the second toy activates either the first toy or yet another toy to react to the reaction-inducing action.
- the first (or the yet other toy) then responds to the third (reaction-inducing) action with a fourth desired action.
- Such interaction between the toys may continue for a set number of rounds, or indefinitely, as desired.
- interactive toys 10 are in the form of a first doll 12 and a second doll 14 , as shown in FIG. 1.
- the interactive toys need not be dolls and one toy need not be the same as the other.
- a combination of a doll and an animal (such as a dog that barks in response to a question asked by the doll), or a doll and an inanimate object (such as a car that opens its doors or turns on its headlights or starts its engine), two animals, or two inanimate objects (such as two musical instruments each playing a musical piece), or a variety of desired objects that may interact with each other in an amusing manner are all within the scope of this invention.
- One such example of interactive toys is a sound producing element that emits a sound sequence (such as a musical piece) and a keyboard (or other such device with activation keys) that actuates the sound producing element.
- the keyboard emits a tone (or a sound or a message indicating the action to be performed by the sound producing element) before actuating the sound producing element to play the desired sound sequence.
- the sound producing element signals the keyboard to activate the same or a different sound producing element (or another type of toy), which element or toy then performs another desired action.
- each doll has a body 16 in which the mechanism that controls the interactive action sequence is housed.
- body 16 preferably is soft, body 16 may be formed from any desired material that permits transmission of wireless signals, such as infrared signals, therethrough. The same is true of the housings or bodies of the other toy forms that may be used instead of dolls 12 , 14 .
- Each set of toys provided in accordance with the principles of the present invention has a mechanism 20 that permits and implements performance of the interactive action sequence (hereinafter “the interactive mechanism”) as shown in FIG. 2.
- Interactive mechanism 20 of each toy comprises a number of functional blocks that permit each toy to receive an activation signal, and, in response, to cause that toy to perform a desired action.
- Upon completion of that action, the appropriate functional blocks of interactive mechanism 20 cause another toy to perform a desired responsive action (if a response is called for).
- the other toy is also capable of activating either the first-activated toy, or yet another toy, to perform yet another responsive action.
- interactive mechanism 20 causes the toys to perform a sequence of interactive actions.
- the components of interactive mechanism 20 include a program control box 22 containing the necessary components for controlling the interactive sequence of events.
- the components of program control box 22 are contained within a housing within the toy.
- Program control box 22 includes a microcontroller unit (“MCU”) 24 that receives and processes information to control the functioning of interactive mechanism 20 .
- MCU 24 initially reads the option set by options setting 26 to determine the duration of the interaction to be performed by the interactive toys and whether actuation of the toy is to cause random selection of an action to be performed or sequential selection of an action, the possible actions thus being performed in a preset, predetermined linear order.
- each toy may only perform a single action, or, the second toy may cause another toy (or the first acting toy) to perform another responsive action (such that three actions are performed).
- the interactive sequence may continue between two or more toys for a predetermined finite number of interactions or indefinitely.
- the MCU also must read the mode selected by mode selection 28 .
- Mode selection 28 determines whether interactive mechanism 20 is in a play mode, in which the toys are enabled to perform the interactive actions, or in a learn mode, in which the toys may be programmed, as will be described in further detail below.
- MCU 24 remains in a sleep mode, which reduces power consumption, until it receives an activation signal from mode selection 28 , or from external hard-wired activation switch 30 via switch connections 32 , or from infrared (“IR”) detector/receiver 34 (or another receiver for a wireless activation signal) to commence operation.
- External activation switch 30 may take on any desired form known in the art, activated by any of a variety of external stimuli such as touch, light, sound (e.g., a voice recognition switch), motion (either motion of the switch itself or detection of an external motion), magnetic forces, etc.
- a separate activation switch may be provided for each of the possible actions to be performed (or at least for the initial action) so that the user may select the interactive sequence of actions to be performed.
- a single activation switch may be provided, causing MCU 24 to select (either randomly or sequentially, depending on the setting of options setting 26 ) the interactive sequence of actions to be performed.
- any other type of receiver for receiving a wireless signal from another toy of the set may be used instead of an IR receiver, depending on the type of wireless signals transmitted between the toys of the present invention.
- Although IR detector/receiver 34 is shown as part of program control box 22 , it will be understood that IR detector/receiver 34 may, instead, be externally coupled to program control box 22 .
- If an activation signal is received from mode selection 28 , then the learning subroutine, which permits programming of the toys with a remote control unit, is commenced, as described in further detail below. If, instead, an activation signal is received via switch connections 32 from external activation switch 30 , or via IR detector 34 , then MCU 24 will begin the desired program encoded therein to commence the desired interactive operation. Thus, an action performing device must be provided to carry out the desired action of the interactive sequence of actions.
- One such action performing device may be a voice chip 36 , such as those known in the art, that has at least one and preferably several speech patterns stored therein which are enunciated upon activation of the voice chip by MCU 24 as the desired action to be performed.
- the voice chip not only contains a series of recorded phrases (“speech patterns”) stored in a memory (preferably a ROM provided therein), but also has recording capability such that the user may record desired speech patterns thereon. If another action is to be performed instead, then the necessary component for performing that desired action is provided in addition to or instead of voice chip 36 .
- the exact form of the action performing device depends on the design choices in implementing the principles of the present invention, the present invention thus not being limited to the use of a voice chip.
- a motor that moves a part of the interactive toy (e.g., for activating an arm to wave, or for moving the lips of the doll), lights that selectively flash, or other desired devices that can perform an action that is responsive to an action performed by another toy (such other action performing devices also being well known in the art) may be provided instead of or in addition to a voice chip.
- If the toys are not dolls, but instead are inanimate objects, then the necessary mechanism that must be provided for causing the toy to perform a desired action would not be a voice chip.
- the set of toys may be an activation keyboard that emits a tone (or other sound or message) and a sound producing element that plays music (e.g., a musical instrument, such as a piano or a flute).
- the action performing device thus is not necessarily a voice chip but may be any electronic or mechanical component known in the art for causing the production of such non-vocal sounds.
- If the toys are a doll and a car, then the action producing devices would include not only a voice chip for the doll, but also a device that can control elements of the car (such as a motor or a headlight) that are to be actuated by the doll.
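- As a concrete illustration of this interchangeable action performing device, the following C sketch (not taken from this disclosure; all names are hypothetical) shows one way firmware could route the same action code to a voice chip, a motor, or lights:

```c
/* Hypothetical sketch: the "action performing device" modeled as a function
 * pointer, so the same control logic can drive a voice chip, a motor, or
 * lights, depending on the kind of toy. */
#include <stdint.h>
#include <stdio.h>

typedef void (*action_fn)(uint8_t action_code);

static void speak(uint8_t code)    { printf("voice chip: play phrase %d\n", code); }
static void move_arm(uint8_t code) { printf("motor: run movement %d\n", code); }
static void flash(uint8_t code)    { printf("lights: pattern %d\n", code); }

int main(void) {
    action_fn perform = speak;   /* a doll enunciates a speech pattern   */
    perform(3);
    perform = move_arm;          /* an animal or doll might move instead */
    perform(1);
    perform = flash;             /* a car might flash its headlights     */
    perform(2);
    return 0;
}
```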
- Where the action performing device is a voice chip 36 , a speaker 38 is included as part of interactive mechanism 20 , electrically coupled to the components of program control box 22 (preferably electrically coupled to the voice chip) as will be described in greater detail below.
- a microphone 40 is also included in interactive mechanism 20 , electrically coupled to the components of program control box 22 .
- any other element that performs the desired action and which is associated with the device that causes the action to be performed is coupled to program control box 22 .
- Although the interactive toys used in the present invention may be electrically coupled together to transmit signals to each other, preferably the interactive toys are provided with transmitters and receivers for wirelessly transferring signals between each other.
- Various means for wirelessly communicating information between inanimate objects, such as electrical equipment, are known in the art.
- information is transferred via audible sound, ultrasound, radio frequency, and infrared wave signals.
- infrared signals are transmitted between the toys.
- FCC approval, which would be needed for other transmission media such as radio frequency, is not necessary. It will be understood that any other desired signal transmitting and detecting/receiving components which wirelessly exchange information may be used instead.
- an infrared (“IR”) emitting driver 42 (such as an infrared light emitting diode), or other such infrared signal emitter, is coupled to the other components of program control box 22 .
- If the IR detectors used in the interactive toys are of the type that can only receive an oscillating signal, such as is common in the art, IR emitting driver 42 must be driven to emit an oscillating signal.
- frequency oscillator 44 is coupled to IR emitting driver 42 through an output disable/enable control 46 .
- Output control 46 is normally set so that oscillating signals are not sent from oscillator 44 to IR emitting driver 42 .
- output control 46 enables oscillator 44 to send the desired signal to IR emitting driver 42 .
- a signal thus is emitted from IR emitting driver 42 which may be received by an IR detector of a corresponding interactive toy having a control mechanism substantially identical to interactive control mechanism 20 .
- a power and control box 48 provides program control box 22 , as well as the other devices comprising interactive mechanism 20 , with power.
- power and control box 48 comprises a battery pack within a housing 50 and the requisite wiring 52 coupling the battery pack to at least program control box 22 .
- Program control box 22 then supplies the remaining components of interactive mechanism 20 with power.
- power and control box 48 may be separately coupled to each of the remaining components of interactive mechanism 20 , instead. Access to power and control box 48 is generally provided so that the batteries therein can be replaced as necessary.
- control switches 54 may include an on/off switch 55 for turning the toy on so that power is not expended when the toy is not in use. Additionally, control switches 54 may include a mode selection switch (coupled to and enabling mode selection 28 ) for selecting whether the toy is in “play” mode or in “learn” mode, as will be described in further detail below.
- A detailed circuit diagram showing a preferred circuit 100 containing the components making up the above-described functional blocks is shown in FIG. 3. Blocked sections of the diagram of FIG. 3 representing a functional block of FIG. 2 are represented by the same reference numeral. It will be understood that power switch 102 (of power control block 55 ) must be closed in order for circuit 100 to function. Furthermore, the function performed by circuit 100 is determined by mode selection block 28 comprising mode selection switch 104 positionable between a learn position 106 and a play position 108 . The function of circuit 100 will first be described for the mode in which mode selection switch 104 is in the play position 108 .
- Circuit 100 is controlled by MCU 24 comprising microcontroller 110 .
- Microcontroller 110 preferably is a 4-bit high performance single-chip microcontroller having a sufficient number of input/output ports to correspond to the number of desired actions that the toy is to perform, a timer (preferably an 8-bit basic timer) for measuring the time interval of an incoming signal (preferably an IR signal), and sufficient memory (RAM and ROM) to store the required software for causing circuit 100 to implement the desired interactive sequence of actions as well as to store the desired number of remote control codes for circuit programming with a remote control unit, as will be described below.
- a more powerful microprocessor such as an 8-bit microprocessor, may be used instead, depending on design choices.
- the microcontroller must be selected to have sufficient speed to generate a signal that can activate an infrared transmitter, as well as to recognize a received infrared signal.
- the size of the ROM/RAM, the power requirements, and the number of input and output pins are determined by the particular design requirements of the toys.
- a preferred microcontroller unit is the KS57C0302 CMOS microcontroller sold by Samsung Electronics of Korea.
- each microcontroller 110 preferably has six (6) pairs of input/output pins, five (5) of which are dedicated to codes corresponding to actions to be performed, the sixth pair being dedicated to random/sequential selection of an action (i.e., non-user determined selection of an action to be performed, the MCU 24 determining which action is to be performed based on the setting of options setting 26 ).
- Where the action sequence ends upon completion of the responsive action, only a single input/output port is necessary.
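- For illustration only (the actual pin assignment depends on the chosen MCU), the six input/output pairs described above could be labeled as follows:

```c
/* Hypothetical pin map for the six input/output pairs: five carry fixed
 * action codes, the sixth requests MCU-selected (random or sequential)
 * play per options setting 26. */
enum action_pin {
    PIN_ACTION_1 = 0,
    PIN_ACTION_2,
    PIN_ACTION_3,
    PIN_ACTION_4,
    PIN_ACTION_5,
    PIN_AUTO_SELECT     /* random/sequential selection left to the MCU */
};
```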
- With circuit 100 supplied with power via power switch 102 , microcontroller 110 preferably remains in a sleep mode until one of three activation signals is received: a signal from hard-wired switch connections 32 (from an external activation switch); a wireless signal, such as from infrared detector/receiver 34 ; or a signal from mode selection block 28 .
- the first two mentioned signals activate circuit 100 when mode selection switch 104 is in the play position 108 .
- the third-mentioned signal activates circuit 100 when mode selection switch 104 is in the learn position 106 for programming purposes, and thus will be described in further detail below.
- Switch connections 32 may be coupled to a switch 30 located on or near the toy (such as in body 18 of doll 12 , 14 ) or a key 114 of a keyboard coupled to circuit 100 .
- Infrared detector/receiver 34 receives a signal either from an infrared emitting diode, similar to IR emitting driver 42 of circuit 100 , of a circuit (substantially identical to circuit 100 ) in an associated toy or from a remote control device (such as a household television remote controller) which can generate infrared signals.
- a remote control device for activating the toy of the present invention will be described in greater detail below.
- Receipt by MCU 24 of an activation signal from switch connections 32 causes MCU 24 to select a desired action to be performed.
- the desired action may be selected by a user (e.g., by pressing a desired activation switch associated with the desired action to be performed if a switch corresponding to each action is provided), or, by the MCU. If an activation switch is provided for MCU selection of the interactive sequence of actions to be performed, performance of the action may be in a preset linear order (i.e., in a set sequence), or at random, depending on the setting of options setting 26 .
- Options setting 26 is set through the use of jumpers J1-J5 and diodes D5-D9 to close the jumpers.
- the jumper settings may either be hard-wired, or user selected via a dip switch having the required number of setting levers.
- a table showing various jumper connections, providing various settings 120 - 140 , and their associated functions is shown in FIG. 4. As can be seen, each function may be performed in either a linear sequence (“in sequence”), in which the actions that are performed follow a set order, or in a random order (“in random”), in which the actions are performed in a random order.
- Setting 120 causes MCU 24 to perform option 1 , representing the performance of one of a variety of desired actions by a toy, in a linear sequence.
- Setting 122 causes MCU 24 to perform option 1 in a random order.
- Setting 124 causes MCU 24 to perform option 1 as controlled by a preferably musical toy such as a piano or a flute.
- Setting 126 causes MCU 24 to perform option 2 , in which the first toy performs a response-inducing action and the second toy performs a responsive action, in sequence, whereas setting 128 causes option 2 to be performed in random order.
- Option 3 , in which each toy performs a response-inducing action as well as a responsive action (i.e., the first toy performs a first action, the second toy responds to that action and then performs another action to which the first toy, or another toy, responds), is performed in sequence by setting 130 and in random by setting 132 .
- Option 4 , in which each toy performs greater than two (preferably ten) response-inducing actions as well as greater than two (preferably ten) responsive actions, is performed in sequence by setting 134 and in random by setting 136 .
- endless interactive actions are performed in option 5 , either in sequence by setting 138 , or in random by setting 140 .
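- A minimal sketch of how firmware might read such an options setting follows; the bit layout is purely illustrative (the actual jumper-to-option mapping is given by the table of FIG. 4), and read_jumpers() is a hypothetical helper:

```c
#include <stdbool.h>
#include <stdint.h>

/* Options decoded from jumpers J1-J5 (illustrative encoding only). */
struct options {
    uint8_t option;   /* 1..5: which interactive sequence shape is enabled */
    bool    random;   /* true = "in random", false = "in sequence"         */
};

extern uint8_t read_jumpers(void);   /* hypothetical: bit n set if Jn is closed */

struct options read_options(void)
{
    uint8_t j = read_jumpers();
    struct options o;
    o.random = (j & 0x01) != 0;                   /* one jumper: random vs. sequence  */
    o.option = (uint8_t)(((j >> 1) & 0x07) + 1);  /* remaining jumpers pick the option */
    return o;                                     /* real mapping follows FIG. 4       */
}
```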
- MCU 24 is actuated by an activation signal to perform the appropriate subroutine for performing the desired interactive sequence of actions, as described in greater detail below.
- Each action is associated with a corresponding code by the software subroutine initialized by the actuation of the toy, the subroutine sending the appropriate signal to the appropriate device to perform the desired action corresponding to the signal.
- the requisite code for initiating the action is preferably contained in a look up table (which is part of the software program) containing a list of the codes corresponding to the desired actions that may be performed.
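- Such a look-up table might be as simple as the following sketch (the values are placeholders, not codes from the patent):

```c
#include <stdint.h>

#define NUM_ACTIONS 5

/* activation code -> microcontroller output pin (and thus speech pattern) */
static const uint8_t action_to_pin[NUM_ACTIONS] = { 0, 1, 2, 3, 4 };

/* code of a reaction-inducing action -> code of an appropriate response,
 * so the receiving toy answers the question it was actually asked */
static const uint8_t response_for[NUM_ACTIONS] = { 2, 0, 4, 1, 3 };
```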
- the desired action is the enunciation of a speech pattern.
- data output bus 144 couples MCU 24 with voice chip block 36 containing voice chip 146 .
- Voice chip 146 is capable of storing and retrieving voice patterns.
- the voice chip has a read only memory (ROM) in which the voice patterns are stored.
- the stored patterns may be any desired length, such as 6, 10, 20, or 32 seconds long.
- Enough pins must be provided to correspond to the output pins of the microcontroller 110 .
- the pins are capable of being edge triggered to enunciate a desired speech pattern.
- the voice chip that is used may be any of the commercially available voice chips that provide the above features, such as the MSS2101/3201 manufactured by Mosel of Taiwan. If the toy permits a user to record his or her own message for later playback by the toy, then a voice recording chip, such as the UM5506 manufactured by United Microelectronic Corp. of Taiwan, or the ISD1110X or ISD1420X both manufactured by Information Storage Devices, Inc. of San Jose, Calif., is provided. It will be understood that any other circuit component may additionally or alternatively be contained in voice chip block 36 , this block generally representing the action performing block containing the necessary component or device that causes the performance of the desired action. Such other component or device may actuate a motor, external lights that selectively flash, or other desired action performing devices, such as described above.
- Voice chip 146 preferably has a ROM with a preloaded series of preferably digitized phrases. However, it will be appreciated that the memory in which the phrases to be played are stored may be located elsewhere. Preferably the phrases are prerecorded audio signals mask programmed onto voice chip 146 . Voice chip 146 contains the necessary circuitry to interpret the signal from microcontroller 110 via data bus 144 and to access the appropriate phrase stored within voice chip 146 (or at another memory location) and associated with the signal from microcontroller 110 . Furthermore, voice chip 146 preferably also contains the necessary circuitry to convert the recorded phrase into proper audio format for output to speaker 38 (which may or may not be considered a part of voice chip block 36 ). As known to one of ordinary skill in the art, the signal from voice chip 146 may be amplified as necessary for speaker 38 .
- During enunciation of the selected speech pattern, voice chip 146 generates a busy signal at busy output pin 148 , which signals MCU 24 to enter an idle state in which no further signals are generated by microcontroller 110 .
- the busy signal is turned off at the end of the enunciation, thereby enabling MCU 24 to generate a coded signal that may be transmitted to the corresponding toy to actuate the corresponding toy to perform a corresponding interactive response.
- MCU 24 remains in a ready state, waiting for the termination of the busy signal. Once the busy signal ends, MCU 24 may continue its subroutine, the next step of which is to transmit a coded signal to another toy, as described in greater detail below.
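- The busy-signal handshake amounts to a simple polling loop; a hedged sketch follows, with the helper names standing in for reading the busy pin and idling:

```c
#include <stdbool.h>

extern bool voice_chip_busy(void);   /* e.g., the MCU input tied to busy pin 148 */
extern void idle(void);              /* wait with no further output              */

void wait_for_speech_done(void)
{
    while (voice_chip_busy())
        idle();                      /* holding loop while the speech plays      */
    /* busy signal ended: continue the subroutine, e.g. send the coded IR signal */
}
```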
- Once microcontroller 110 has generated the signal to transmit to the other toy, microcontroller 110 must transmit the signal to infrared emitting diode 42 .
- the infrared detector/receiver 34 used in each of the control circuits 100 of the interactive toys of the present invention generally can only receive an infrared signal with a predetermined carrier frequency (preferably 38 kHz). Thus, infrared emitting diode 42 must emit a signal at that predetermined frequency as well. Accordingly, circuit 100 is provided with an oscillator 44 which generates a signal at the necessary frequency for detection by another infrared detector/receiver 34 .
- the diodes of oscillator 44 are not necessary when the circuit is oscillating. They are nonetheless included to prevent the circuit from hanging up and also to allow the circuit to self-start on power-up. Without the diodes, R2 and R3 are returned to VCC (power), and except for the removal of R1 and R4 from the timing equations, the circuit functions in the same manner. However, if both transistors ever go into conduction at the same time long enough so that both capacitors are discharged, the circuit will stay in that state, with base currents being supplied through R2 and R3.
- the transistors cannot both be turned on at the same time, since to do so would be to force both collector voltages to zero and there would be no source of base current. Both capacitors will try to charge through the bases, and when one begins to conduct, positive feedback will force the other off, so that the first gains control. The cycle will then proceed normally. It is noted that the value of R2 and R3 must be larger than that of R1 and R4 to prevent the recharge time constant from being unduly long and the rising edges of the output waveforms from being rounded off or otherwise distorted.
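- For orientation only, the standard textbook relation for this two-transistor astable topology (a general result, not a figure taken from this disclosure) is f ≈ 1 / (0.693 · (R2·C1 + R3·C2)), where C1 and C2 are the coupling capacitors; for a 38 kHz carrier the two R·C products must therefore sum to roughly 38 µs, which is consistent with R2 and R3 (with their capacitors) setting the frequency while R1 and R4 drop out of the timing equations.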
- Circuit 100 is also provided with an enable/disable control 46 .
- MCU 24 controls enable/disable control 46 to control whether or not the oscillating signal of oscillator 44 may be passed to infrared emitting diode 42 .
- the oscillating signal is passed through interconnected transistors as shown.
- MCU 24 emits a serial data stream representing the signal to be transmitted.
- This signal turns on enable/disable control 46 in the coded sequence to permit oscillator 44 to drive infrared emitting diode 42 in accordance with the serial data stream.
- the signal from oscillator 44 typically must be amplified, such as by output signal block 150 .
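- A hedged sketch of this transmit path in C follows; the bit timing, framing, and helper names are illustrative assumptions, not values from the patent:

```c
#include <stdint.h>
#include <stdbool.h>

extern void carrier_enable(bool on);   /* drives enable/disable control 46 */
extern void delay_us(unsigned int us); /* hypothetical busy-wait helper    */

/* Shift the coded signal out one bit at a time; the 38 kHz carrier (and so
 * the IR diode) is only enabled while a "1" bit is being sent. */
void ir_send_code(uint8_t code)
{
    for (int bit = 7; bit >= 0; bit--) {
        carrier_enable(((code >> bit) & 1) != 0);
        delay_us(600);                 /* illustrative bit period          */
        carrier_enable(false);
        delay_us(600);                 /* gap before the next bit          */
    }
}
```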
- the signal from infrared emitting diode 42 is received by an infrared detector/receiver 34 in a corresponding circuit 100 in a corresponding toy provided to interact with the first toy having the above-described circuit.
- the infrared detector/receiver 34 of the corresponding toy receives and filters the signal from the first actuated toy and sends the signal to the corresponding MCU 24 .
- Such a signal comprises the wireless second signal of the above-mentioned signals that may be received by MCU 24 .
- microcontroller 110 can differentiate between the signals to determine whether the signal is to cause a reaction-inducing action or a responsive action to be performed. For example, if the signal is from a hard-wired activation signal or from a remote control device, microcontroller 110 must recognize the signal as an initiating signal (i.e., a signal which causes a reaction-inducing action to be performed) to begin an interactive sequence of actions, and thus start the appropriate subroutine.
- If, however, the signal is from another toy, microcontroller 110 must recognize the signal as a response-inducing signal (i.e., a signal which causes a responsive action to be performed) so that the subroutine for the interactive sequence of actions may be commenced at the appropriate place (rather than at the beginning of the subroutine described below, which would cause a reaction-inducing action to be performed instead).
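- One plausible (purely illustrative) way to express that three-way distinction in firmware, assuming the learned remote-control codes are kept in their own table:

```c
#include <stdint.h>

enum signal_kind { FROM_SWITCH, FROM_REMOTE, FROM_OTHER_TOY };

/* Classify an activation: hard-wired switch and remote-control codes start
 * a new sequence; a code from the other toy asks for a responsive action. */
enum signal_kind classify(uint8_t code, int came_over_ir,
                          const uint8_t learned_remote[], int n_learned)
{
    if (!came_over_ir)
        return FROM_SWITCH;
    for (int i = 0; i < n_learned; i++)
        if (code == learned_remote[i])
            return FROM_REMOTE;
    return FROM_OTHER_TOY;
}
```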
- A flow chart of the subroutine for performing an interactive sequence of actions between at least two toys when in play mode (when switch 104 is in play position 108 ) is shown in FIGS. 5A-5F, beginning with step 200 .
- Dolls A and B are sleeping in step 202 .
- the actuation of the MCU by a hard-wired activation switch in step 204 causes the MCU of doll A (“MCU A”) to wake up in step 206 .
- MCU A then, in step 208 , performs Action 1 .
- Action 1 represents a response-inducing action and is represented separately in FIG. 5E because Action 1 represents a sub-subroutine that is performed at various points during the interactive play subroutine of FIGS. 5A-5D.
- Action 1 represents the asking of a response-inducing question by one of the dolls.
- the software may randomly select (in any desired manner, such as by randomly pointing at a memory location containing an action code or by performing a desired selection computation) one of a plurality of codes associated in the program with different actions to be performed (typically the codes are in a look up table, each code corresponding to a reaction-inducing action or a responsive action) if the set option is in random.
- the software sequentially selects an action to be performed, such as by incrementing a variable that causes linear progression through a set of actions that may be performed.
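- The two selection methods can be sketched as follows (rand() stands in for whatever pseudo-random source the MCU offers; this is not code from the patent):

```c
#include <stdint.h>
#include <stdlib.h>

#define NUM_ACTIONS 5

uint8_t select_action(int random_mode)
{
    static uint8_t next = 0;                      /* persists across activations */
    if (random_mode)
        return (uint8_t)(rand() % NUM_ACTIONS);   /* "in random"                 */
    uint8_t chosen = next;
    next = (uint8_t)((next + 1) % NUM_ACTIONS);   /* linear progression          */
    return chosen;
}
```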
- a separate switch may be provided corresponding to each question that may be asked.
- Action 1 activates the appropriate output pin of the microcontroller corresponding to the selected action code in step 300 (FIG. 5E).
- the microcontroller is coupled to the voice chip via an output bus.
- the pin of the voice chip corresponding to the activated microcontroller pin is activated, in step 302 , to cause the speech pattern associated therewith to be enunciated by the voice chip.
- holding loop 210 comprises the steps of reading pin P3.3 of the microcontroller of MCU A in step 212 and asking whether pin P3.3 is high in decision step 214 .
- Pin P3.3 is coupled to the busy signal output of the voice chip and is set low while a busy signal is emitted by the voice chip. Thus, so long as pin P3.3 is low, MCU A continues to read pin P3.3, in step 212 , to determine its status.
- the next step in the software program, or play subroutine, is for MCU A to generate a signal that causes the IR emitter to send a coded signal to the other doll (doll B) in step 218 .
- This signal is coded to represent the appropriate responsive action that is to be performed by doll B.
- Doll A thus emits a signal that is received by doll B in step 220 .
- the receipt of a signal wakes up doll B, whereas the completion of the performance of an action by doll A permits doll A to return to sleep.
- MCU B of doll B reads the coded signal emitted from doll A in step 222 .
- Doll B then, in step 224 , performs Action 2 , shown separately in FIG. 5F.
- Action 2 is shown separately because Action 2 represents a sub-subroutine that is performed at various points during the interactive play subroutine of FIGS. 5A-5D.
- Action 2 represents the answering of the question asked by doll A.
- a single response is set for each question asked by the first-actuated doll.
- the software randomly points at, or otherwise randomly selects, one of a plurality of codes (typically in a look up table, each code corresponding to a reaction-inducing action or a responsive action) set by the program if the set option is in random.
- the software sequentially causes linear progression (such as by incrementation of a variable) through a set of actions that may be performed.
- Another option is to permit user selection with either a hard-wired or a remote control unit.
- Action 2 activates the output pin corresponding to the selected action code in step 400 (FIG. 5F).
- the MCU is coupled to the voice chip via an output bus.
- the pin of the voice chip corresponding to the activated microcontroller pin is also activated, in step 402 , to cause the speech pattern associated therewith to be enunciated by the voice chip.
- holding loop 226 comprises the steps of reading pin P3.3 of the microcontroller in step 228 and asking whether pin P3.3 is high in decision step 230 .
- Pin P3.3 is coupled to the busy signal output of the voice chip and is set low while a busy signal is emitted by the voice chip. Thus, so long as pin P3.3 is low, MCU B continues to read pin P3.3, in step 228 , to determine its status.
- the option setting must be read in step 234 .
- In decision step 236 , if the option setting is set so that the speech pattern just enunciated is to be the last of the interactive sequence, then doll B goes to sleep again in step 238 .
- Otherwise, doll B performs Action 1 (as shown in FIG. 5E, as described above) to enunciate a question (or other response-inducing action) via the voice chip in step 240 .
- MCU B is placed in a holding loop 242 , continuously reading pin P3.3 in step 244 to determine, in decision block 246 , whether pin P3.3 is high.
- Once MCU B detects that pin P3.3 is high, MCU B determines, in step 248 , that the question being enunciated by the voice chip has been finished.
- Thus, the software program of MCU B remains on hold while pin P3.3 is low, only continuing once pin P3.3 is high so that step 248 may be reached.
- The subroutine then continues with step 250 , in which MCU B sends a coded signal to the IR emitter to thereby send a coded signal to doll A.
- Doll B then goes to sleep in step 252 .
- Doll A, upon receipt of the coded signal emitted by doll B, is woken up in step 254 .
- MCU A then reads, in step 256 , the coded signal to determine which answer should be enunciated in response to the question enunciated by doll B, and performs Action 2 in step 258 (represented in FIG. 5F), such as described above with respect to doll B and step 224 .
- MCU A is held in holding loop 260 in which MCU A continuously reads pin P3.3 in step 262 and asks, in decision block 264 , whether pin P3.3 is high yet. Once pin P3.3 is high, MCU A detects, in step 266 , that the voice chip is finished enunciating the answer. MCU A then reads the option setting in step 268 , to determine, in decision block 270 , whether another interactive sequence of actions is to be performed. If not, doll A goes to sleep in step 272 . If so, then the software program returns to point D in FIG. 5A. This process continues until the number of interactive sequences of actions required by the options setting has been performed.
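- Condensed into code, the play-mode flow of FIGS. 5A-5F for one doll might look like the sketch below; every helper is a hypothetical stand-in for hardware described above, and only the structure (speak, wait for the busy signal, send the coded IR signal, then sleep or continue) is taken from the flow chart:

```c
#include <stdint.h>
#include <stdbool.h>

extern void    speak(uint8_t code);         /* pulse the matching voice-chip pin */
extern void    wait_for_speech_done(void);  /* poll until pin P3.3 goes high     */
extern void    ir_send_code(uint8_t code);
extern uint8_t select_action(int random_mode);
extern uint8_t response_for_code(uint8_t code);
extern bool    more_rounds_allowed(void);   /* from options setting 26           */

/* Doll A, steps 204-218: ask a question, then hand off to doll B. */
void on_activation(int random_mode)
{
    uint8_t question = select_action(random_mode);   /* Action 1 */
    speak(question);
    wait_for_speech_done();
    ir_send_code(question);
    /* doll A goes back to sleep */
}

/* Doll B, steps 220-252: answer, and possibly ask a question back. */
void on_ir_code(uint8_t question, int random_mode)
{
    speak(response_for_code(question));               /* Action 2 */
    wait_for_speech_done();
    if (!more_rounds_allowed())
        return;                                       /* sequence complete  */
    uint8_t next_q = select_action(random_mode);      /* Action 1 again     */
    speak(next_q);
    wait_for_speech_done();
    ir_send_code(next_q);                             /* wake the other toy */
}
```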
- the MCUs must be capable of recognizing whether a signal is from a hard-wired activation switch, which would start the beginning of an interactive sequence of actions, or from a remote control device, which would also start the beginning of an interactive sequence of actions (but correlates the signal differently, as described below), or from another doll, which would cause the doll to perform at least a responsive action (if not another reaction-inducing action as well).
- the above-described software program related to the interaction between dolls is only exemplary. The program may be modified, as required, to correspond to other types of interactive sequences of actions performed in accordance with the broad principles of the present invention.
- the final of the above-mentioned three signals that activates MCU 24 is a signal from mode selection 28 indicating that mode selection switch 104 is in the learn position 106 .
- When the mode selection switch is moved to the learn position 106 , MCU 24 is placed in learn mode and voice chip 36 is turned off.
- a learn subroutine is commenced so that MCU 24 may be programmed to interpret an infrared signal generated from a common household remote control unit, such as a commercially available television remote control unit, and respond thereafter to such a signal by performing a desired action as described above.
- several programming buttons are used, each of the selected programming buttons on the remote control device being associated with a single speech pattern by the software program of MCU 24 .
- one of the buttons permits MCU selection (as opposed to user selection) of an action to be performed, depending on the setting of options setting 26 .
- a button is associated with a random number generator, or any other software provision that selects a random code such that a randomly selected action is performed if the setting is in random. If, instead, the setting is in linear, then the button is associated with an appropriate software provision for linear selection of an action from the sequence of actions that may be performed.
- MCU 24 is capable of emitting a signal, such as a beep via speaker 38 , in order to indicate whether or not the infrared signal of the selected button has been associated with the code that initiates the desired action of the interaction sequence.
- an infrared signal generated by the remote control device and received by the infrared detector/receiver 34 may be processed in substantially the same manner as a hard-wired activation signal, substantially as described above.
- the particular coded signals associated with the remote control used must be associated with the code set for the action (a set code) and stored in the program.
- In that case, the above-described Action 1 or 2 involves identifying the received signal through the use of a different look up table (or other form in which codes are stored and correlated) than that which is preprogrammed for hard-wired actuation.
- The learn subroutine, implemented when MCU 24 is in learn mode so that a received infrared (or other wireless) signal from a wireless control device may be associated with a code for a desired action to be performed, will now be described with reference to FIG. 6.
- the number of buttons on the remote control device preferably corresponds to the number of actions the toys can perform, plus an additional button that corresponds to the hard-wired activation signal.
- the additional button selects an action either randomly or in accordance with a preset sequence, depending on the doll's setting.
- Preferably six buttons are used for programming one doll and a different six buttons are used for programming the other doll.
- In step 400 of the learn subroutine shown in FIG. 6, the learn software subroutine is started.
- the user points a remote control first at one doll and then at the other doll and sequentially presses the number of remote control buttons necessary to correlate with each action to be performed so that the dolls can be programmed to respond differently to the pressing of each of the buttons.
- the buttons used for one doll are different from the buttons used for the other doll.
- the MCU of the doll being programmed reads the signal in step 402 .
- the MCU must determine, in decision step 404 , whether the received signal is valid (recognizable by the MCU). If not, the MCU learn subroutine returns to step 402 to read another signal.
- In step 406 , the read signal is saved in a predefined address (associated with one of the possible actions) in the program for later use.
- decision block 408 determines whether all coding buttons have been programmed. If not, the subroutine returns to step 402 to read another signal from the remote control. Once all of the buttons have been programmed, there are no more addresses to be assigned with a coded signal and the subroutine continues with step 410 , in which the MCU rests until activated by one of the above-described actuation signals. It will be appreciated that fewer or greater than six buttons may be programmed, depending on the number of actions that may be performed.
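- A minimal sketch of that learn loop follows; the helper names, the beep confirmation hook, and the six-slot size are assumptions layered on the flow of FIG. 6:

```c
#include <stdint.h>
#include <stdbool.h>

#define NUM_BUTTONS 6                       /* e.g., five actions + one MCU-select */

extern bool ir_read_code(uint8_t *code);    /* false if no valid signal was read   */
extern void beep(void);                     /* confirmation via speaker 38         */

static uint8_t learned_remote[NUM_BUTTONS]; /* one remote code per action slot     */

void learn_mode(void)
{
    for (int slot = 0; slot < NUM_BUTTONS; ) {
        uint8_t code;
        if (!ir_read_code(&code))           /* invalid: read another signal        */
            continue;
        learned_remote[slot++] = code;      /* save at the predefined address      */
        beep();                             /* indicate the button was learned     */
    }
    /* all buttons programmed: rest until a normal activation signal arrives       */
}
```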
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Toys (AREA)
Abstract
A set of interactive toys that perform a sequence of actions in response to one another without external activation other than an initial actuation to begin the sequence of actions. Preferably, each toy has an activation switch and/or a receiver for a wireless signal such as an infrared signal which activates the toy. Upon activation, the toy performs a desired action, such as the enunciation of a speech pattern, and signals another toy to perform a responsive action. Preferably, the toys are capable of performing several different action sequences, such as the enunciation of different conversations, the performance of different movements, etc. Additionally, the toys are programmable by a remote control device. The remote control device either functions as an activation switch, initiating a random or predetermined (yet not user determined) sequence of interactions, or as an interaction selector, such that a desired sequence of actions may be selected.
Description
- The present invention relates to interactive toys, one toy, once activated by a user, activating another toy. More particularly, the present invention relates to a pair of toys which perform responsive actions or functions in continuous sequence. In a preferred embodiment a set of talking dolls are provided. The user activates one of the dolls to say a sentence. At the end of the sentence, the user-activated doll activates another doll to respond to the first sentence. Each doll may respond to the sentence of another doll until a conversation is complete.
- Toys that are activated by a user to perform a desired function are known in the art. For example, a variety of dolls exist that perform a desired action, such as speaking or moving, when activated by a user. However, the doll typically only performs a single action (e.g., the doll says a single word or phrase, or moves in a desired manner) without saying anything more until the activation switch is pressed again. Thus, although several activation switches may be provided, each switch causing the doll to perform a desired action (e.g., say a specific word or phrase or move in a desired manner) associated with that switch, once the action is completed, the doll is idle. Only when the desired activation switch is pressed does the doll perform again. Such dolls need not be activated by a mechanically activated switch. Light-sensitive switches may be used instead of, or in addition to, a mechanical switch, such as shown in U.S. Pat. No. 5,281,180 to Lam et al.
- The desired action need not be the enunciation of a speech pattern. Other toys are known that perform another action, such as moving or flashing lights, upon activation by the user. However, the above-described toys merely perform the single desired action or function in response to activation by a user. These toys do not then activate another device without further intervention from a user.
- Despite the variety of known means for activating the toy to perform a desired action and the variety of actions that may be performed, none of the known toys causes another toy to respond with an action which may then cause the first activated toy (or yet another toy) to perform yet another, further-responsive, action (again, without further intervention by a user). Until now, the device used to activate another device has comprised a signal generator alone, such as a remote control unit, that does not perform an action (such as enunciation of a speech pattern) other than transmitting a signal. Thus, in effect, the only “toy” that is activated to perform a desired function is the toy controlled by the remote control device, the remote control device not performing an independent action. The toy which performs the desired action is not activated by another device that has performed a desired action. Moreover, a set of interactive toys which each perform a desired action in addition to transmitting a signal to another toy has not yet been provided with the capability of being programmed by an external, wireless control device such as a common household remote control unit which merely signals one of the toys to perform a desired action, that action then triggering a cascade of mutual activation and response.
- It is therefore an object of the present invention to provide a toy that performs a desired action upon user activation, the action accompanied by a signal to another toy to perform a responsive action without further intervention by the user.
- It is a related object of the present invention to provide a set of toys which interactively cause each other to perform a desired action, each action accompanied by a signal to the other toy to perform a responsive action.
- It is another object of the present invention to provide a set of responsive toys that are programmable and controllable by a household remote control device which generates a control signal to activate one of the toys.
- These and other objects of the present invention are accomplished in accordance with the principles of the present invention by providing a set of interactive toys. Each toy performs an action, the action of at least one of the toys being accompanied by a signal that is sent to the other toy to cause the other toy to perform a responsive action. Preferably, the other toy's action is also accompanied by a signal that is sent to the first toy (or, yet another toy) to cause that toy to perform yet another (the same or different) responsive action. Although only a single interactive responsive action sequence may be performed by the toys, preferably, the set of toys performs one of a variety of different interactive responsive action sequences. The user may either select the action sequence to be performed, or the action may be selected randomly or in a given sequence by the control system of the toy, for example, upon activation of one of the toys. Each toy may respond with a single set response. However, most preferably, each toy may respond in one of several manners, randomly, sequentially, or user-selected, to the action of the other toy.
- Because the response of the other toy should be consonant with the action of the user-activated toy, the user-activated toy typically sends a signal to the other (receiving) toy that is coded. The code is received by the receiving toy to cause the receiving toy to perform an appropriate action in response to the action previously performed by the first signal-emitting toy in the sequence. This interaction may continue until the logical conclusion of the interaction or indefinitely. For example, if the actions are the enunciation of a word or phrase, the interaction is a conversation which ends at the logical conclusion of the conversation. In a preferred embodiment, the toys are dolls and the interaction is in the form of a conversation comprising responsive speech patterns enunciated by the dolls. However, the toys may comprise animals, or a doll interacting with another object, such as a car.
- Also in accordance with the principles of the present invention, the toys can be controlled by a household remote control device. Thus, the toys may be initially activated wirelessly such that a hard-wired switch on the toy is not necessary. Additionally, each toy preferably is also programmable to respond to signals of the remote control device in a desired manner. Specifically, if several interactive action sequences may be performed, then each interactive action sequence and/or each individual response may be associated with a button on the remote control device. Additionally, another button on the remote control device is preferably dedicated to remote random selection of an interactive sequence/response.
- These and other features and advantages of the present invention will be readily apparent from the following detailed description of the invention, the scope of the invention being set out in the appended claims. The detailed description will be better understood in conjunction with the accompanying drawings, wherein like reference characters represent like elements, as follows:
- FIG. 1 is a perspective view of a set of exemplary toys that may be used to perform a sequence of interactive actions in accordance with the principles of the present invention;
- FIG. 2 is a high level block diagram of the interactive mechanism of a set of toys in accordance with the principles of the present invention;
- FIG. 3 is a detailed circuit diagram of the circuitry of FIG. 2 for implementing an interactive sequence according to the present invention;
- FIG. 4 is a table showing jumper connections for setting the options setting of the interactive mechanism of the present invention;
- FIGS. 5A-5F are a flow chart showing the sequence of actions performed by toys in the play mode in accordance with the principles of the present invention; and
- FIG. 6 is a flow chart showing the sequence of actions performed by toys in the learn mode in accordance with the principles of the present invention.
- In accordance with the principles of the present invention, a set of toys are provided for interacting with one another independently of user input other than an initial activation of one member of the set to commence interaction. A first toy is actuated to perform a first desired action. Actuation may either be caused by actuation of a hard-wired activation switch or by transmission of a wireless signal, such as a signal from a remote control unit. Upon completion of the desired action, the first toy activates a second toy to perform a second desired action, typically in response to the first desired action. In the simplest form of the invention, once the second toy completes the second desired responsive action, the action sequence is complete, and the toys remain inactive. However, if desired, the second toy may perform a third desired action, such as a reaction-inducing action, after completing the second desired action. Upon completion of the third (reaction-inducing) action, the second toy activates either the first toy or yet another toy to react to the reaction-inducing action. The first (or the yet other toy) then responds to the third (reaction-inducing) action with a fourth desired action. Such interaction between the toys may continue for a set number of rounds, or indefinitely, as desired.
- In a preferred embodiment,
interactive toys 10 are in the form of a first doll 12 and a second doll 14, as shown in FIG. 1. However, the interactive toys need not be dolls and one toy need not be the same as the other. For example, a combination of a doll and an animal (such as a dog that barks in response to a question asked by the doll), or a doll and an inanimate object (such as a car that opens its doors or turns on its headlights or starts its engine), two animals, or two inanimate objects (such as two musical instruments each playing a musical piece), or a variety of desired objects that may interact with each other in an amusing manner are all within the scope of this invention. One such example of interactive toys is a sound producing element that emits a sound sequence (such as a musical piece) and a keyboard (or other such device with activation keys) that actuates the sound producing element. The keyboard emits a tone (or a sound or a message indicating the action to be performed by the sound producing element) before actuating the sound producing element to play the desired sound sequence. Once the sound sequence has been performed, the sound producing element signals the keyboard to activate the same or a different sound producing element (or another type of toy), which element or toy then performs another desired action. - In the case of
dolls 12 and 14, each doll has a body 16 in which the mechanism that controls the interactive action sequence is housed. Although body 16 preferably is soft, body 16 may be formed from any desired material that permits transmission of wireless signals, such as infrared signals, therethrough. The same is true of the housings or bodies of the other toy forms that may be used instead of dolls 12 and 14. - Each set of toys provided in accordance with the principles of the present invention has a
mechanism 20 that permits and implements performance of the interactive action sequence (hereinafter “the interactive mechanism”) as shown in FIG. 2. Interactive mechanism 20 of each toy comprises a number of functional blocks that permit each toy to receive an activation signal, and, in response, to cause that toy to perform a desired action. Upon completion of that action, the appropriate functional blocks of interactive mechanism 20 cause another toy to perform a desired responsive action (if a response is called for). Preferably, the other toy is also capable of activating either the first-activated toy, or yet another toy, to perform yet another responsive action. Thus, interactive mechanism 20 causes the toys to perform a sequence of interactive actions. - The components of
interactive mechanism 20 include aprogram control box 22 containing the necessary components for controlling the interactive sequence of events. Preferably the components ofprogram control box 22 are contained within a housing within the toy.Program control box 22 includes a microcontroller unit (“MCU”) 24 that receives and processes information to control the functioning ofinteractive mechanism 20. Preferably,MCU 24 initially reads the option set by options setting 26 to determine the duration of the interaction to be performed by the interactive toys and whether actuation of the toy is to cause random selection of an action to be performed or sequential selection of an action, the possible actions thus being performed in a preset, predetermined linear order. For example, each toy may only perform a single action, or, the second toy may cause another toy (or the first acting toy) to perform another responsive action (such that three actions are performed). The interactive sequence may continue between two or more toys for a predetermined finite number of interactions or indefinitely. The MCU also must read the mode selected bymode selection 28.Mode selection 28 determines whetherinteractive mechanism 20 is in a play mode, in which the toys are enabled to perform the interactive actions, or in a learn mode, in which the toys may be programmed, as will be described in further detail below. -
MCU 24 remains in a sleep mode, which reduces power consumption, until it receives an activation signal frommode selection 28, or from external hard-wiredactivation switch 30 viaswitch connections 32, or from infrared (“IR”) detector/receiver 34 (or another receiver for a wireless activation signal) to commence operation.External activation switch 30 may take on any desired form known in the art, activated by any of a variety of external stimuli such as touch, light, sound (e.g., a voice recognition switch), motion (either motion of the switch itself or detection of an external motion), magnetic forces, etc. If desired, a separate activation switch may be provided for each of the possible actions to be performed (or at least for the initial action) so that the user may select the interactive sequence of actions to be performed. However, in order to reduce manufacturing costs, a single activation switch may be provided, causingMCU 24 to select (either randomly or sequentially, depending on the setting of options setting 26) the interactive sequence of actions to be performed. It will be understood that any other type of receiver for receiving a wireless signal from another toy of the set may be used instead of an IR receiver, depending on the type of wireless signals transmitted between the toys of the present invention. Although IR detector/receiver 34 is shown as part ofprogram control box 22, it will be understood that IR detector/receiver 34 may, instead, be externally coupled toprogram control box 22. - If an activation signal is received from
mode selection 28, then the learning subroutine, which permits programming of the toys with a remote control unit, is commenced, as described in further detail below. If, instead, an activation signal is received via switch connections 32 from external activation switch 30, or via IR detector 34, then MCU 24 will begin the desired program encoded therein to commence the desired interactive operation. Thus, an action performing device must be provided to carry out the desired action of the interactive sequence of actions. - In a preferred embodiment, as mentioned above, at least two
dolls 12 and 14 are provided, each having a voice chip 36, such as those known in the art, that has at least one and preferably several speech patterns stored therein which are enunciated upon activation of the voice chip by MCU 24 as the desired action to be performed. If desired, the voice chip not only contains a series of recorded phrases (“speech patterns”) stored in a memory (preferably a ROM provided therein), but also has recording capability such that the user may record desired speech patterns thereon. If another action is to be performed instead, then the necessary component for performing that desired action is provided in addition to or instead of voice chip 36. As will be understood, the exact form of the action performing device depends on the design choices in implementing the principles of the present invention, the present invention thus not being limited to the use of a voice chip. For example, a motor that moves a part of the interactive toy (e.g., for activating an arm to wave, or for moving the lips of the doll), lights that selectively flash, or other desired devices that can perform an action that is responsive to an action performed by another toy, such other action performing device also being well known in the art, may be provided instead of or in addition to a voice chip. Thus, if the toys are not dolls, but instead are inanimate objects, then the necessary mechanism that must be provided for causing the toy to perform a desired action would not be a voice chip. For instance, the set of toys may be an activation keyboard that emits a tone (or other sound or message) and a sound producing element that plays music (e.g., a musical instrument, such as a piano or a flute). The action performing device thus is not necessarily a voice chip but may be any electronic or mechanical component known in the art for causing the production of such non-vocal sounds. Likewise, if the toys are a doll and a car, then the action producing devices would include not only a voice chip for the doll, but also a device that can control elements of the car (such as a motor or a headlight) that are to be actuated by the doll. - If the action performing device is a
voice chip 36, then aspeaker 38 is included as part ofinteractive mechanism 10, electrically coupled to the components of program control box 22 (preferably electrically coupled to the voice chip) as will be described in greater detail below. If recording capability is desired, then amicrophone 40 is also included ininteractive mechanism 20, electrically coupled to the components ofprogram control box 22. Similarly, any other element that performs the desired action and which is associated with the device that causes the action to be performed is coupled toprogram control box 22. - Although the interactive toys used in the present invention may be electrically coupled together to transmit signals to each other, preferably, the interactive toys are provided with transmitters and receivers for wirelessly transferring signals between each other. Various means for wirelessly communicating information between inanimate objects, such as electrical equipment, are known in the art. Typically, information is transferred via audible sound, ultrasound, radio frequency, and infrared wave signals. In the preferred embodiment of the present invention, infrared signals are transmitted between the toys. Thus, FCC approval, which would be needed for other transmission media such as radio frequency, is not necessary. It will be understood that any other desired signal transmitting and detecting/receiving components which wirelessly exchange information may be used instead.
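As an illustration of the point that the action performing device need not be a voice chip, a hypothetical dispatch across several device types is sketched below; the enum, the function names, and the printed behavior are invented for the example and are not part of the disclosed circuit:

```c
#include <stdio.h>

/* Hypothetical device kinds; the specification's examples include a voice chip,
   a motor, and lights, any of which may serve as the action performing device. */
typedef enum { DEV_VOICE_CHIP, DEV_MOTOR, DEV_LIGHTS } device_kind;

/* Dispatch an action code to whatever device this particular toy contains. */
static void perform(device_kind dev, int action_code) {
    switch (dev) {
        case DEV_VOICE_CHIP:
            printf("voice chip: enunciate speech pattern %d\n", action_code);
            break;
        case DEV_MOTOR:
            printf("motor: run movement %d (e.g., wave an arm)\n", action_code);
            break;
        case DEV_LIGHTS:
            printf("lights: flash pattern %d\n", action_code);
            break;
    }
}

int main(void) {
    perform(DEV_VOICE_CHIP, 2);   /* a talking doll */
    perform(DEV_MOTOR, 1);        /* a car that opens its doors */
    return 0;
}
```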
- Preferably, an infrared (“IR”) emitting driver42 (such as an infrared light emitting diode), or other such infrared signal emitter, is coupled to the other components of
program control box 22. If the IR detectors used in the interactive toys are the type that only can receive an oscillating signal, such as is common in the art,IR emitting driver 42 must be driven to emit an oscillating signal. Thus,frequency oscillator 44 is coupled toIR emitting driver 42 through an output disable/enablecontrol 46.Output control 46 is normally set so that oscillating signals are not sent fromoscillator 44 toIR emitting driver 42. However, once an action has been performed andinteractive mechanism 20 is to activate anotherinteractive mechanism 20 of a corresponding interactive toy,output control 46 enablesoscillator 44 to send the desired signal toIR emitting driver 42. A signal thus is emitted fromIR emitting driver 42 which may be received by an IR detector of a corresponding interactive toy having a control mechanism substantially identical tointeractive control mechanism 20. - A power and
control box 48 providesprogram control box 22, as well as the other devices comprisinginteractive mechanism 20, with power. Typically, power andcontrol box 48 comprises a battery pack within ahousing 50 and therequisite wiring 52 coupling the battery pack to at leastprogram control box 22.Program control box 22 then supplies the remaining components ofinteractive mechanism 20 with power. However, if desired, power andcontrol box 48 may be separately coupled to each of the remaining components ofinteractive mechanism 20, instead. Access to power andcontrol box 48 is generally provided so that the batteries therein can be replaced as necessary. - Because power and
control box 48 is typically the only component ofinteractive mechanism 20 that is user-accessible, power andcontrol box 48 may be provided withcontrol switches 54 which provide overall control ofinteractive mechanism 20. Control switches 54 may include an on/off switch 55 for turning the toy on so that power is not expended when the toy is not in use. Additionally, control switches 54 may include a mode selection switch (coupled to and enabling mode selection 28) for selecting whether the toy is in “play” mode or in “learn” mode, as will be described in further detail below. - A detailed circuit diagram showing a
preferred circuit 100 containing the components making up the above-described functional blocks is shown in FIG. 3. Blocked sections of the diagram of FIG. 3 representing a functional block of FIG. 2 are represented by the same reference numeral. It will be understood that power switch 102 (of power control block 55) must be closed in order forcircuit 100 to function. Furthermore, the function performed bycircuit 100 is determined bymode selection block 28 comprisingmode selection switch 104 positionable between alearn position 106 and aplay position 108. The function ofcircuit 100 will first be described for the mode in whichmode selection switch 104 is in theplay position 108. -
Circuit 100 is controlled byMCU 24 comprisingmicrocontroller 110.Microcontroller 110 preferably is a 4-bit high performance single-chip microcontroller having a sufficient number of input/output ports to correspond to the number of desired actions that the toy is to perform, a timer (preferably an 8-bit basic timer) for measuring the time interval of an incoming signal (preferably an IR signal), and sufficient memory (RAM and ROM) to store the required software for causingcircuit 100 to implement the desired interactive sequence of actions as well as to store the desired number of remote control codes for circuit programming with a remote control unit, as will be described below. A more powerful microprocessor, such as an 8-bit microprocessor, may be used instead, depending on design choices. Because the signals between the toys are preferably wireless, and, most preferably infrared signals, the microcontroller must be selected to have sufficient speed to generate a signal that can activate an infrared transmitter, as well as to recognize a received infrared signal. The size of the ROM/RAM, the power requirements, and the number of input and output pins are determined by the particular design requirements of the toys. A preferred microcontroller unit is the KS57C0302 CMOS microcontroller sold by Samsung Electronics of Korea. - In a preferred embodiment, at least ten input/output ports are provided so that the toy can perform at least five initiating actions and five responsive actions. However, it will be understood that because the number of input/output ports corresponds to the number of actions which may be performed, fewer or greater than ten inlet/outlet ports may be provided depending on design choices. Thus, each
microcontroller 110 preferably has six (6) pairs of input/output pins, five (5) of which are dedicated to codes corresponding to actions to be performed, the sixth pair being dedicated to random/sequential selection of an action (i.e., non-user determined selection of an action to be performed, theMCU 24 determining which action is to be performed based on the setting of options setting 26). Of course, in the simplest form of the invention (in which a first toy performs an action and then activates a second toy to perform a responsive action, the action sequence ending upon completion of the responsive action) only a single input/output port is necessary. - With
circuit 100 supplied with power viapower switch 102,microcontroller 110 preferably remains in a sleep mode until one of three activation signals is received: a signal from hard-wired switch connections 32 (from an external activation switch); a wireless signal, such as from infrared detector/receiver 34; or a signal frommode selection block 28. The first two mentioned signals activatecircuit 100 whenmode selection switch 104 is in theplay position 108. The third-mentioned signal activatescircuit 100 whenmode selection switch 104 is in thelearn position 106 for programming purposes, and thus will be described in further detail below. -
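As a rough sketch of the dispatch just described (not the firmware of microcontroller 110; the enum names and print statements are invented for the illustration), the three wake-up sources might be distinguished as follows:

```c
#include <stdio.h>

/* The three wake-up sources named in the text; names here are illustrative only. */
typedef enum { SRC_HARDWIRED_SWITCH, SRC_IR_RECEIVER, SRC_MODE_SELECTION } wake_source;
typedef enum { MODE_PLAY, MODE_LEARN } toy_mode;   /* position of the mode selection switch */

static void handle_wakeup(wake_source src, toy_mode mode) {
    if (src == SRC_MODE_SELECTION && mode == MODE_LEARN)
        printf("enter learn subroutine (program remote control codes)\n");
    else if (mode == MODE_PLAY && (src == SRC_HARDWIRED_SWITCH || src == SRC_IR_RECEIVER))
        printf("enter play subroutine (begin or continue an interactive sequence)\n");
    else
        printf("no matching mode; remain in sleep mode\n");
}

int main(void) {
    handle_wakeup(SRC_HARDWIRED_SWITCH, MODE_PLAY);  /* user presses the switch on the toy */
    handle_wakeup(SRC_IR_RECEIVER, MODE_PLAY);       /* coded signal arrives from the other toy */
    handle_wakeup(SRC_MODE_SELECTION, MODE_LEARN);   /* switch moved to the learn position */
    return 0;
}
```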
Switch connections 32 may be coupled to aswitch 30 located on or near the toy (such as inbody 18 ofdoll 12, 14) or a key 114 of a keyboard coupled tocircuit 100. Infrared detector/receiver 34 receives a signal either from an infrared emitting diode, similar toIR emitting driver 42 ofcircuit 100, of a circuit (substantially identical to circuit 100) in an associated toy or from a remote control device (such as a household television remote controller) which can generate infrared signals. Use of a remote control device for activating the toy of the present invention will be described in greater detail below. - Receipt by
MCU 24 of an activation signal from switch connections 32 causes MCU 24 to select a desired action to be performed. The desired action may be selected by a user (e.g., by pressing a desired activation switch associated with the desired action to be performed if a switch corresponding to each action is provided), or by the MCU. If an activation switch is provided for MCU selection of the interactive sequence of actions to be performed, performance of the action may be in a preset linear order (i.e., in a set sequence), or at random, depending on the setting of options setting 26. - Options setting 26 is set through the use of jumpers J1-J5 and diodes D5-D9 to close the jumpers. The jumper settings may either be hard-wired, or user selected via a dip switch having the required number of setting levers. A table showing various jumper connections, providing various settings 120-140, and their associated functions is shown in FIG. 4. As can be seen, each function may be performed in either a linear sequence (“in sequence”), in which the actions that are performed follow a set order, or in a random order (“in random”), in which the actions are performed in a random order. Setting 120 causes
MCU 24 to performoption 1, representing the performance of one of a variety of desired actions by a toy, in a linear sequence. Setting 122, on the other hand, causesMCU 24 to performoption 1 in a random order. Setting 124 causesMCU 24 to performoption 1 as controlled by a preferably musical toy such as a piano or a flute. Setting 126 causesMCU 24 to performoption 2, in which the first toy performs a response-inducing action and the second toy performs a responsive action, in sequence, whereas setting 128 causesMCU 24 to performoption 2 to be performed in random order.Option 3, in which each toy performs a response-inducing action as well as a responsive action (i.e., the first toy performs a first action, the second toy responds to that action and then performs another action to which the first toy, or another toy, responds), is performed in sequence by setting 130 and in random by setting 132.Option 4, in which each toy performs greater than two (preferably ten) response-inducing actions as well as greater than two (preferably ten) responsive actions, is performed in sequence by setting 134 and in random by setting 136. Finally, endless interactive actions are performed inoption 5, either in sequence by setting 138, or in random by setting 140. - Whatever the desired action is,
MCU 24 is actuated by an activation signal to perform the appropriate subroutine for performing the desired interactive sequence of actions, as described in greater detail below. Each action is associated with a corresponding code by the software subroutine initialized by the actuation of the toy, the subroutine sending the appropriate signal to the appropriate device to perform the desired action corresponding to the signal. The requisite code for initiating the action is preferably contained in a look up table (which is part of the software program) containing a list of the codes corresponding to the desired actions that may be performed. Once the code for the desired action to be performed is determined, the appropriate one or more of input/output pins 142 ofmicroprocessor 110 is activated in a manner familiar to those skilled in the art. - In a preferred embodiment, the desired action is the enunciation of a speech pattern. Thus,
data output bus 144couples MCU 24 withvoice chip block 36 containingvoice chip 146.Voice chip 146 is capable of storing and retrieving voice patterns. Preferably, the voice chip has a read only memory (ROM) in which the voice patterns are stored. The stored patterns may be any desired length, such as 6, 10, 20, or 32 seconds long. Enough pins must be provided to correspond to the output pins of themicrocontroller 110. Preferably, the pins are capable of being edge triggered to enunciate a desired speech pattern. The voice chip that is used may be any of the commercially available voice chips that provide the above features, such as the MSS2101/3201 manufactured by Mosel of Taiwan. If the toy permits a user to record his or her own message for later playback by the toy, then a voice recording chip, such as the UM5506 manufactured by United Microelectronic Corp. of Taiwan, or the ISD1110X or ISD1420X both manufactured by Information Storage Devices, Inc. of San Jose, Calif., is provided. It will be understood that any other circuit component may additionally or alternatively be contained invoice chip block 36, this block generally representing the action performing block containing the necessary component or device that causes the performance of the desired action. Such other component or device may actuate a motor, external lights that selectively flash, or other desired action performing devices, such as described above. -
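Purely for illustration, the association between a selected action code and the voice chip pin that plays the corresponding phrase might be rendered as the table-driven sketch below; the table contents, pin numbers, and function names are assumptions for the example, not values taken from FIG. 3:

```c
#include <stdio.h>

#define NUM_ACTIONS 5   /* assumed: five speech patterns, matching the preferred embodiment */

/* Hypothetical lookup table: action code -> voice-chip trigger pin on the output bus. */
static const int trigger_pin[NUM_ACTIONS] = { 0, 1, 2, 3, 4 };

/* Stand-in for edge-triggering one pin of the voice chip. */
static void pulse_pin(int pin) {
    printf("pulse voice-chip pin %d -> chip plays the stored phrase wired to that pin\n", pin);
}

int main(void) {
    int action_code = 3;                      /* selected by the MCU's subroutine */
    if (action_code >= 0 && action_code < NUM_ACTIONS)
        pulse_pin(trigger_pin[action_code]);  /* one activation -> one speech pattern */
    return 0;
}
```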
Voice chip 146 preferably has a ROM with a preloaded series of preferably digitized phrases. However, it will be appreciated that the memory in which the phrases to be played may be located elsewhere. Preferably the phrases are prerecorded audio signals mask programmed ontovoice chip 146.Voice chip 146 contains the necessary circuitry to interpret the signal frommicrocontroller 110 viadata bus 144 and to access the appropriate phrase stored within voice chip 146 (or at another memory location) and associated with the signal frommicrocontroller 110. Furthermore,voice chip 146 preferably also contains the necessary circuitry to convert the recorded phrase into proper audio format for output to speaker 38 (which may or may not be considered a part of voice chip block 36). As known to one of ordinary skill in the art, the signal fromvoice chip 146 may be amplified as necessary forspeaker 38. - During enunciation of the selected speech pattern,
voice chip 146 generates a busy signal at busy output pin 148, which signals MCU 24 to enter an idle state in which no further signals are generated by microcontroller 110. The busy signal is turned off at the end of the enunciation, thereby enabling MCU 24 to generate a coded signal that may be transmitted to the corresponding toy to actuate the corresponding toy to perform a corresponding interactive response. Preferably, MCU 24 remains in a ready state, waiting for the termination of the busy signal. Once the busy signal ends, MCU 24 may continue its subroutine, the next step of which is to transmit a coded signal to another toy, as described in greater detail below. - Once
microcontroller 110 has generated the signal to transmit to the other toy, microcontroller 110 must transmit the signal to infrared emitting diode 42. The infrared detector/receiver 34 used in each of the control circuits 100 of the interactive toys of the present invention generally can only receive an infrared signal with a predetermined carrier frequency (preferably 38 kHz). Thus, infrared emitting diode 42 must emit a signal at that predetermined frequency as well. Accordingly, circuit 100 is provided with an oscillator 44 which generates a signal at the necessary frequency for detection by another infrared detector/receiver 34. - Theoretically, the diodes of
oscillator 44 are not necessary when the circuit is oscillating. They are nonetheless included to prevent the circuit from hanging up and also to allow the circuit to self-start on power-up. Without the diodes, R2 and R3 are returned to VCC (power), and except for the removal of R1 and R4 from the timing equations, the circuit functions in the same manner. However, if both transistors ever go into conduction at the same time long enough so that both capacitors are discharged, the circuit will stay in that state, with base currents being supplied through R2 and R3. With the diodes present, the transistors cannot both be turned on at the same time, since to do so would be to force both collector voltages to zero and there would be no source of base current. Both capacitors will try to charge through the bases, and when one begins to conduct, positive feedback will force the other off, so that the first gains control. The cycle will then proceed normally. It is noted that the value of R2 and R3 must be larger than that of R1 and R4 to prevent the recharge time constant from being unduly long and the rising edges of the output waveforms from being rounded off or otherwise distorted. -
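The carrier requirement can be made concrete with a little arithmetic. The sketch below only illustrates the idea of gating a 38 kHz carrier on and off with a serial code; the bit rate and the example byte are assumptions for the illustration, not figures given in the specification:

```c
#include <stdio.h>

/* Illustrative numbers only: the detectors expect a 38 kHz carrier; the bit
   period below is an assumption, not a value taken from the patent. */
#define CARRIER_HZ   38000
#define BIT_RATE_BPS 600      /* assumed serial rate for the coded action signal */

int main(void) {
    unsigned char code = 0x2A;                 /* example coded action byte */
    int cycles_per_bit = CARRIER_HZ / BIT_RATE_BPS;

    printf("each bit gates the 38 kHz oscillator for about %d carrier cycles\n", cycles_per_bit);
    for (int bit = 7; bit >= 0; bit--) {
        int on = (code >> bit) & 1;            /* 1 = oscillator enabled, 0 = disabled */
        printf("bit %d: oscillator %s for one bit period\n", bit, on ? "ON " : "OFF");
    }
    return 0;
}
```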
Circuit 100 is also provided with an enable/disablecontrol 46.MCU 24 controls enable/disablecontrol 46 to control whether or not the oscillating signal ofoscillator 44 may be passed to infrared emittingdiode 42. Preferably, the oscillating signal is passed through interconnected transistors as shown. Thus, whenMCU 24 is ready to transmit a signal to another toy,MCU 24 emits a serial data stream representing the signal to be transmitted. This signal turns on enable/disablecontrol 46 in the coded sequence to permitoscillator 44 to drive infrared emittingdiode 42 in accordance with the serial data stream. As one of ordinary skill in the art would know, the signal fromoscillator 44 typically must be amplified, such as byoutput signal block 150. - The signal from infrared emitting
diode 42 is received by an infrared detector/receiver 34 in acorresponding circuit 100 in a corresponding toy provided to interact with the first toy having the above-described circuit. The infrared detector/receiver 34 of the corresponding toy receives and filters the signal from the first actuated toy and sends the signal to the correspondingMCU 24. Such a signal comprises the wireless second signal of the above-mentioned signals that may be received byMCU 24. - Both the hard-wired activation signal from
switch connections 32 and the wireless signal received byIR detector 34 are input intomicrocontroller 110 via different pins, as may be seen in FIG. 3. Thus,microcontroller 110 can differentiate between the signals to determine whether the signal is to cause a reaction-inducing action or a responsive action to be performed. For example, if the signal is from a hard-wired activation signal or from a remote control device,microcontroller 110 must recognize the signal as an initiating signal (i.e., a signal which causes a reaction-inducing action to be performed) to begin an interactive sequence of actions, and thus start the appropriate subroutine. If, however, the signal is from another toy,microcontroller 110 must recognize the signal as a response-inducing signal (i.e., a signal which causes a responsive action to be performed) so that the subroutine for the interactive sequence of actions may be commenced at the appropriate place (rather than at the beginning of the subroutine described below, which would cause a reaction-inducing action to be performed instead). - A flow chart of the subroutine for performing an interactive sequence of actions between at least two toys when in play mode (when
switch 104 is in play position 108) is shown in FIGS. 5A-5F, beginning with step 200. Dolls A and B are sleeping in step 202. The actuation of the MCU, by either a hard-wired activation switch or a wireless activation signal, in step 204 causes the MCU of doll A (“MCU A”) to wake up in step 206. MCU A then, in step 208, performs Action 1. Action 1 represents a response-inducing action and is represented separately in FIG. 5E because Action 1 represents a sub-subroutine that is performed at various points during the interactive play subroutine of FIGS. 5A-5D. Preferably, Action 1 represents the asking of a response-inducing question by one of the dolls. The software may randomly select (in any desired manner, such as by randomly pointing at a memory location containing an action code or by performing a desired selection computation) one of a plurality of codes associated in the program with different actions to be performed (typically the codes are in a look up table, each code corresponding to a reaction-inducing action or a responsive action) if the set option is in random. Alternatively, if the set option is in sequence, the software sequentially selects an action to be performed, such as by incrementing a variable that causes linear progression through a set of actions that may be performed. Instead, or additionally, a separate switch may be provided corresponding to each question that may be asked. Any desired number of actions may be performed by the dolls. In a preferred embodiment, a total of ten actions may be performed by each doll, five being reaction-inducing actions and the other five being responsive actions. Upon selection, by the software program, of an action to be performed, Action 1 activates the appropriate output pin of the microcontroller corresponding to the selected action code in step 300 (FIG. 5E). As described above, the microcontroller is coupled to the voice chip via an output bus. Thus, the pin of the voice chip corresponding to the activated microcontroller pin is activated, in step 302, to cause the speech pattern associated therewith to be enunciated by the voice chip. - Returning to FIG. 5A, upon performance of
Action 1 instep 208, while the voice chip is enunciating the selected speech pattern, MCU A remains in a holdingloop 210 waiting for the selected action to be performed so that the next step in the software program may be performed. Specifically, holdingloop 210 comprises the steps of reading pin P3.3 of the microcontroller of MCU A instep 212 and asking whether pin P3.3 is high indecision step 214. Pin P3.3 is coupled to the busy signal output of the voice chip and is set low while a busy signal is emitted by the voice chip. Thus, so long as pin P3.3 is low, MCU A continues to read pin P3.3, instep 212, to determine its status. Once the voice chip is finished enunciating the selected speech pattern (as shown, the first action performed is a question, thus, the selected speech pattern is a question) pin P3.3 goes high and MCU A is permitted to continue to step 216, in which MCU A is signaled that the voice chip is finished so that the software program may continue. - The next step in the software program, or play subroutine, is for MCU A to generate a signal that causes the IR emitter to send a coded signal to the other doll (doll B) in
step 218. This signal is coded to represent the appropriate responsive action that is to be performed by doll B. Doll A thus emits a signal that is received by doll B in step 220. The receipt of a signal wakes up doll B, whereas the completion of the performance of an action by doll A permits doll A to return to sleep. MCU B of doll B reads the coded signal emitted from doll A in step 222. Doll B then, in step 224, performs Action 2, shown separately in FIG. 5F. As with Action 1, Action 2 is shown separately because Action 2 represents a sub-subroutine that is performed at various points during the interactive play subroutine of FIGS. 5A-5D. Preferably, Action 2 represents the answering of the question asked by doll A. Typically, a single response is set for each question asked by the first-actuated doll. However, it is within the scope of the present invention to provide several answers to each of the questions asked, each answer either being randomly selected, sequentially selected, or user selected. The software randomly points at, or otherwise randomly selects, one of a plurality of codes (typically in a look up table, each code corresponding to a reaction-inducing action or a responsive action) set by the program if the set option is in random. Alternatively, if the set option is in sequence, the software sequentially causes linear progression (such as by incrementation of a variable) through a set of actions that may be performed. Another option is to permit user selection with either a hard-wired or a remote control unit. Upon selection of the responsive action to be performed by the software program, Action 2 activates the output pin corresponding to the selected action code in step 400 (FIG. 5F). As described above, the MCU is coupled to the voice chip via an output bus. Thus, the pin of the voice chip corresponding to the activated microcontroller pin is also activated, in step 402, to cause the speech pattern associated therewith to be enunciated by the voice chip. - Returning to FIG. 5B, upon performance of
Action 2 instep 224, while the voice chip is enunciating the selected speech pattern, MCU B remains in a holdingloop 226 waiting for the selected action to be performed so that the next step in the software program may be performed. Specifically, holdingloop 226 comprises the steps of reading pin P3.3 of the microcontroller instep 228 and asking whether pin P3.3 is high indecision step 230. Pin P3.3 is coupled to the busy signal output of the voice chip and is set low while a busy signal is emitted by the voice chip. Thus, so long as pin P3.3 is low, MCU B continues to read pin P3.3, instep 228, to determine its status. Once the voice chip is finished enunciating the selected speech pattern (as shown, the first action performed is a question, thus, the selected speech pattern is a question) pin P3.3 goes high and MCU B is permitted to continue to step 232, in which MCU B is signaled that the voice chip is finished so that the software program may continue. - Because, based on the option set, the answer just enunciated by the voice chip of doll B may or may not be the last action to be performed, the option setting must be read in
step 234. Indecision step 236, if the option setting is set so that the speech pattern just enunciated is to be the last of the interactive sequence, then doll B goes to sleep again instep 238. However, if greater than one interactive sequence is to be performed by dolls A and B, then doll B performs Action 1 (as shown in FIG. 5E, as described above) to enunciate a question (or other response-inducint action) via the voice chip instep 240. As above, during the enunciation of a speech pattern, MCU B is placed in a holdingloop 242, continuously reading pin P3.3 instep 244 to determine, indecision block 246, whether pin P3.3. is high. When MCU B detects that pin P3.3 is high, MCU B determines, instep 248 that the question being enunciated by the voice chip has been finished. As above, the software program of MCU B remains on hold, which pin P3.3 is low, only continuing once pin P3.3 in high so thatstep 248 may be reached. The software program of MCU B continues withstep 250, in which MCU B sends a coded signal to the IR emitter to thereby send a coded signal to doll A. Doll B then goes to sleep instep 252. Doll A, upon receipt of the coded signal emitted by doll B, is woken up instep 254. MCU A then reads, instep 256, the coded signal to determine which answer should be enunciated in response to the question enunciated by doll B, and performsAction 2 in step 258 (represented in FIG. 5F), such as described above with respect to doll B and step 224. Also as described above, while the voice chip is enunciating the selected answer, MCU A is held in holdingloop 260 in which MCU A continuously reads pin P3.3 instep 262 and asks, indecision block 264, whether pin P3.3 is high yet. Once pin P3.3 is high, MCU A detects, instep 266, that the voice chip is finished enunciating the answer. MCU A then reads the option setting instep 268, to determine, indecision block 270, whether another interactive sequence of actions is to be performed. If not, doll A goes to sleep instep 272. If so, then the software program returns to point D in FIG. 5A. This process continues until the number of interactive sequences of actions required by the options setting has been performed. - It will be understood that the MCUs must be capable of recognizing whether a signal is from a hard-wired activation switch, which would start the beginning of an interactive sequence of actions, or from a remote control device, which would also start the beginning of an interactive sequence of actions (but correlates the signal differently, as described below), or from another doll, which would cause the doll to perform at least a responsive action (if not another reaction-inducing action as well). It will further be understood that the above-described software program related to the interaction between dolls is only exemplary. The program may be modified, as required, to correspond to other types of interactive sequences of actions performed in accordance with the broad principles of the present invention.
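The flow of FIGS. 5A-5F is described step by step above; purely as an illustration, the sketch below compresses that question-and-answer loop into ordinary C. Every name in it (enunciate, send_ir, rounds) is invented for the example, and where the real design polls the voice chip's busy pin (P3.3) the sketch simply prints a line:

```c
#include <stdio.h>

#define NUM_QUESTIONS 5            /* preferred embodiment: five questions and five answers per doll */

/* Stand-ins for hardware: the actual circuit waits on the voice chip's busy output. */
static void enunciate(const char *doll, const char *what, int code) {
    printf("%s voice chip: %s #%d ... (busy) ... done\n", doll, what, code);
}
static void send_ir(const char *from, const char *to, int code) {
    printf("%s -> %s: coded IR signal %d\n", from, to, code);
}

int main(void) {
    int rounds = 2;          /* how many question/answer exchanges the option setting allows */
    int q = 0;               /* sequential selection; random selection is the other option */

    /* Activation wakes doll A, which asks a question (Action 1) and signals doll B. */
    enunciate("Doll A", "question", q);
    send_ir("Doll A", "Doll B", q);

    for (int r = 0; r < rounds; r++) {
        /* Doll B wakes, decodes the signal, and answers (Action 2). */
        enunciate("Doll B", "answer", q);
        if (r == rounds - 1) break;            /* option setting says this was the last response */

        /* Otherwise doll B asks the next question and hands control back. */
        q = (q + 1) % NUM_QUESTIONS;
        enunciate("Doll B", "question", q);
        send_ir("Doll B", "Doll A", q);

        /* Doll A answers, then asks again, keeping the conversation going. */
        enunciate("Doll A", "answer", q);
        q = (q + 1) % NUM_QUESTIONS;
        enunciate("Doll A", "question", q);
        send_ir("Doll A", "Doll B", q);
    }
    printf("conversation over; both dolls return to sleep\n");
    return 0;
}
```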
- The final of the above-mentioned three signals that activates
MCU 24 is a signal from mode selection 28 that mode selection switch 104 is in the learn position 106. When the mode selection switch is moved to the learn position 106, MCU 24 is placed in learn mode and voice chip 36 is turned off. When in learn mode, a learn subroutine is commenced so that MCU 24 may be programmed to interpret an infrared signal generated from a common household remote control unit, such as a commercially available television remote control unit, and respond thereafter to such a signal by performing a desired action as described above. Preferably, several programming buttons are used, each of the selected programming buttons on the remote control device being associated with a single speech pattern by the software program of MCU 24. Additionally, another button permits MCU selection (as opposed to user selection) of an action to be performed, depending on the setting of options setting 26. Thus, a button is associated with a random number generator, or any other software provision that selects a random code such that a randomly selected action is performed if the setting is in random. If, instead, the setting is in linear, then the button is associated with an appropriate software provision for linear selection of an action from the sequence of actions that may be performed. MCU 24 is capable of emitting a signal, such as a beep via speaker 38, in order to indicate whether or not the infrared signal of the selected button has been associated with the code that initiates the desired action of the interaction sequence. Once MCU 24 has been programmed, an infrared signal generated by the remote control device and received by the infrared detector/receiver 34 may be processed in substantially the same manner as a hard-wired activation signal, substantially as described above. However, it will be understood that because each remote control unit is different, each time the toys are programmed the particular coded signals associated with the remote control used must be associated with the code set for the action (a set code) and stored in the program. Thus, upon remote control actuation, the above-described Action 1 or Action 2 may be performed. - The learn subroutine, implemented when
MCU 24 is in learn mode so that a received infrared (or other wireless) signal from a wireless control device may be associated with a code for a desired action to be performed, will now be described with reference to FIG. 6. The number of buttons on the remote control device preferably corresponds to the number of actions the toys can perform, plus an additional button that corresponds to the hard-wired activation signal. Like the hard-wired activation signal, the additional button selects an action either randomly or in accordance with a preset sequence, depending on the doll's setting. Preferably six buttons are used for programming one doll and a different six buttons are used for programming the other doll. Instep 400 of the learn subroutine shown in FIG. 5, the learn software subroutine is started. The user points a remote control first at one doll and then at the other doll and sequentially presses the number of remote control buttons necessary to correlate with each action to be performed so that the dolls can be programmed to respond differently to the pressing of each of the buttons. Thus, the buttons used for one doll are different from the buttons used for the other doll. Each time a user presses a button of the remote control unit, the MCU of the doll being programmed reads the signal instep 402. Before continuing, the MCU must determine, indecision step 404, whether the received signal is valid (recognizable by the MCU). If not, the MCU learn subroutine returns to step 404 to read another signal. If the signal, however, is valid, then the subroutine continues withstep 406, in which the read signal is saved in a predefined address (associated with one of the possible actions) in the program for later use. After saving the signal,decision block 408 determines whether all coding buttons have been programmed. If not, the subroutine returns to step 402 to read another signal from the remote control. Once all of the buttons have been programmed, there are no more addresses to be assigned with a coded signal and the subroutine continues withstep 410, in which the MCU rests until activated by one of the above-described actuation signals. It will be appreciated that fewer or greater than six buttons may be programmed, depending on the number of actions that may be performed. - It will be understood that although such programming capability as described is provided in a preferred embodiment of the invention, such feature is not necessary to achieve the broad objects of the present invention. Such programming capability requires the above-described MCU. If such capability is not desired, and only one interactive action sequence is performed by the toys, then an MCU is unnecessary.
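A hypothetical sketch of that learn loop is shown below. The six-button count follows the preferred embodiment described above, but the storage scheme, function names, and example button codes are invented for the illustration and do not describe the actual stored program:

```c
#include <stdio.h>

#define NUM_BUTTONS 6     /* five action buttons plus one random/sequential button per doll */

/* Hypothetical storage: each learned remote-control code is saved at a predefined
   slot associated with one of the possible actions. */
static unsigned long learned_code[NUM_BUTTONS];

/* Stand-in for reading and validating one IR burst from the household remote. */
static int read_remote(unsigned long *code_out, unsigned long received_signal) {
    if (received_signal == 0) return 0;   /* unrecognizable signal: wait for another press */
    *code_out = received_signal;
    return 1;
}

int main(void) {
    /* Pretend the user presses six different buttons in turn (one value per press). */
    unsigned long presses[NUM_BUTTONS] = { 0x10EF, 0x20DF, 0x30CF, 0x40BF, 0x50AF, 0x609F };

    for (int slot = 0; slot < NUM_BUTTONS; slot++) {
        unsigned long code;
        while (!read_remote(&code, presses[slot])) { /* loop until a valid signal is read */ }
        learned_code[slot] = code;                   /* save at the slot's predefined address */
        printf("button code 0x%lX now triggers action slot %d (beep)\n", code, slot);
    }
    printf("all buttons programmed; MCU returns to rest\n");
    return 0;
}
```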
- While the foregoing description and drawings represent the preferred embodiments of the present invention, it will be understood that various additions, modifications and substitutions may be made therein without departing from the spirit and scope of the present invention as defined in the accompanying claims. In particular, it will be understood that although much of the above disclosure is dedicated to describing the principles of the present invention as applied to two interactive dolls, these principles may be equally applied to other interactive toys as well. It will be clear to those skilled in the art that the present invention may be embodied in other specific forms, structures, arrangements, proportions, and with other elements, materials, and components, without departing from the spirit or essential characteristics thereof. One skilled in the art will appreciate that the invention may be used with many modifications of structure, arrangement, proportions, materials, and components and otherwise, used in the practice of the invention, which are particularly adapted to specific environments and operative requirements without departing from the principles of the present invention. The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, and not limited to the foregoing description.
Claims (40)
1. An interactive set of toys comprising:
a first toy having a first action performing device;
a second toy having a second action performing device;
a first signal generator associated with said first action performing device, said first signal generator generating a first signal after an action is performed by said first action performing device;
a first signal transmitter associated with said first signal generator, said first signal transmitter transmitting said first signal; and
a first signal receiver associated with said second action performing device, said first signal receiver receiving said first signal from said first signal transmitter to activate said second action performing device.
2. An interactive toy as in claim 1 , further comprising:
a second signal generator associated with said second action performing device, said second signal generator generating a second signal after an action is performed by said second action performing device; and
a second signal transmitter associated with said second signal generator, said second signal transmitter transmitting said second signal.
3. An interactive toy as in claim 2 , further comprising a second signal receiver associated with said first action performing device, said second signal receiver receiving said second signal from said second signal transmitter to activate said first action performing device.
4. An interactive toy as in claim 3 , wherein said first and second signal generators generate infrared signals and said first and second signal receivers receive said infrared signals.
5. An interactive toy as in claim 3 , further comprising:
a first control unit controlling said first action performing device, said first signal generator, and said second signal receiver; and
a second control unit controlling said second action performing device, said second signal generator, and said first signal receiver.
6. An interactive toy as in claim 5 , wherein said first and second control units encode and decode said signals.
7. An interactive toy as in claim 5 , wherein:
said first action performing device is capable of performing a plurality of desired actions, a single action being performed in response to each activation of said first action performing device;
said first control unit is programmable by a remote control device having a plurality of control buttons such that each selected control button of the remote control device is associated with a different one of said plurality of desired actions; and
activation of a control button causes said action performing device to perform the associated one of a plurality of desired actions.
8. An interactive toy as in claim 1 , wherein said first and second devices for performing a desired action each comprise a voice chip, said voice chip enunciating a desired speech pattern comprising the desired action.
9. An interactive toy as in claim 8 , wherein said first and second toy each further comprise a speaker associated with said voice chip.
10. An interactive toy as in claim 8 , wherein at least one of said toys further comprises a recording mechanism associated with one of said voice chips, said recording mechanism permitting recording of a speech pattern and being coupled to an associated speaker.
11. An interactive toy as in claim 8 , wherein said first and second toys are dolls.
12. An interactive toy as in claim 11 , wherein at least one of said devices for performing a desired function further comprises a motor, said motor moving the doll associated therewith in response to a signal from the other doll.
13. An interactive toy as in claim 1 , further comprising a first activation switch on said first toy, said activation switch being accessible to a user to activate said first action performing device.
14. An interactive toy as in claim 13 , further including a second activation switch on said second toy.
15. An interactive toy as in claim 13 , wherein said first activation switch is touch sensitive.
16. An interactive toy as in claim 13 , wherein said first activation switch is light activated.
17. An interactive toy as in claim 13 , wherein said first activation switch is a receiver for a wireless signal.
18. An interactive toy as in claim 1 , wherein:
said first and second toys are dolls; and
at least one of said action performing devices comprises a motor, said motor moving the doll associated therewith in response to a signal from the other doll.
19. An interactive toy as in claim 1 , wherein said first signal generator generates an infrared signal and said first signal receiver receives said infrared signal.
20. An interactive toy as in claim 17 , further comprising:
a second signal generator associated with said second device for performing a desired action, said second signal generator generating an infrared signal after an action is performed by said second action performing device;
a second signal transmitter associated with said second signal generator; and
a second signal receiver associated with said first action performing device, said second signal receiver receiving said infrared signal from said second signal transmitter to activate said first action performing device.
21. An interactive toy as in claim 20 , wherein:
said first toy is an activation keyboard;
said first action performing device emits a sound signal;
said second toy is a sound producing element; and
said second action performing device plays a musical piece.
22. An interactive toy as in claim 21 , wherein said sound producing element is a musical instrument.
23. A method of causing a set of toys to perform an interactive sequence of actions, said method comprising the steps of:
activating a first toy to perform a first desired action;
generating a first signal identifying the first desired action performed;
transmitting said first signal to a second toy;
causing said first signal to activate said second toy to perform a second desired action responsive to said first desired action.
24. A method as in claim 23 , wherein said step of generating a first signal further comprises the step of encoding said first signal.
25. A method as in claim 23 , further comprising the step of generating a second signal identifying the second desired action performed.
26. A method as in claim 25 , wherein said steps of generating a first and second signal further comprise the steps of encoding said first and second signals.
27. A method as in claim 23 , wherein said first desired action comprises the enunciation of a speech pattern by said first toy.
28. A method as in claim 23 , wherein said first desired action comprises movement of said first toy.
29. A method as in claim 23 , wherein said second desired action comprises the enunciation of a speech pattern by said second toy.
30. A method as in claim 23 , wherein said second desired action comprises movement of said second toy.
31. A method as in claim 23 , further comprising the steps of:
transmitting said second signal to another toy; and
causing said second signal to activate said other toy to perform a third desired action responsive to said second desired action.
32. A method as in claim 31 , wherein said other toy is said first toy, said step of causing said second signal to activate said other toy comprising the step of causing said first toy to perform said third desired action responsive to said second desired action.
33. A method as in claim 31 , further comprising the steps of:
generating a third signal identifying the third desired action performed;
transmitting said third signal to another toy;
causing said third signal to activate said other toy to perform a fourth desired action responsive to said third desired action.
34. A method as in claim 33 , wherein an activation keyboard performs said first and third desired actions and a sound producing element performs said second and fourth desired actions.
35. A method as in claim 34 , wherein said second and fourth desired actions comprise the performance of a musical piece.
36. A method as in claim 23 , wherein said step of transmitting said first signal to said second toy comprises the step of wirelessly transmitting said first signal.
37. A method as in claim 36 , wherein said step of wirelessly transmitting said first signal comprises the step of transmitting an infrared signal to an infrared detector/receiver in said second toy.
38. A method as in claim 23 , further comprising the step of programming said first toy to respond to a signal from a remote control device.
39. A method as in claim 38 , further comprising the step of providing said first toy with a microcontroller unit, said microcontroller unit running a subroutine associating each of a plurality of codes with a different action to be performed, said step of generating a first signal identifying the first desired action performed comprising the step of associating said first signal with one of said plurality of codes.
40. A method as in claim 39 , wherein
said remote control device comprises a plurality of control buttons; and
said programming step further comprises the steps of associating the signals of each of said control buttons with one of said plurality of codes.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/880,425 US6454625B1 (en) | 1997-04-09 | 2001-06-13 | Interactive talking dolls |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US83163597A | 1997-04-09 | 1997-04-09 | |
US09/685,526 US6358111B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
US09/880,425 US6454625B1 (en) | 1997-04-09 | 2001-06-13 | Interactive talking dolls |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/685,526 Continuation US6358111B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
Publications (2)
Publication Number | Publication Date |
---|---|
US20020061708A1 true US20020061708A1 (en) | 2002-05-23 |
US6454625B1 US6454625B1 (en) | 2002-09-24 |
Family
ID=25259519
Family Applications (10)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/685,526 Expired - Lifetime US6358111B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
US09/685,527 Expired - Lifetime US6309275B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
US09/880,425 Expired - Lifetime US6454625B1 (en) | 1997-04-09 | 2001-06-13 | Interactive talking dolls |
US09/883,762 Expired - Lifetime US6497604B2 (en) | 1997-04-09 | 2001-06-18 | Interactive talking dolls |
US09/876,367 Expired - Lifetime US6375535B1 (en) | 1997-04-09 | 2001-07-24 | Interactive talking dolls |
US10/008,879 Expired - Lifetime US6497606B2 (en) | 1997-04-09 | 2001-11-08 | Interactive talking dolls |
US10/200,696 Expired - Lifetime US6641454B2 (en) | 1997-04-09 | 2002-07-22 | Interactive talking dolls |
US10/658,043 Expired - Fee Related US7068941B2 (en) | 1997-04-09 | 2003-09-09 | Interactive talking dolls |
US11/206,532 Abandoned US20060009113A1 (en) | 1997-04-09 | 2005-08-18 | Interactive talking dolls |
US14/179,222 Expired - Fee Related US9067148B2 (en) | 1997-04-09 | 2014-02-12 | Interactive talking dolls |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/685,526 Expired - Lifetime US6358111B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
US09/685,527 Expired - Lifetime US6309275B1 (en) | 1997-04-09 | 2000-10-10 | Interactive talking dolls |
Family Applications After (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/883,762 Expired - Lifetime US6497604B2 (en) | 1997-04-09 | 2001-06-18 | Interactive talking dolls |
US09/876,367 Expired - Lifetime US6375535B1 (en) | 1997-04-09 | 2001-07-24 | Interactive talking dolls |
US10/008,879 Expired - Lifetime US6497606B2 (en) | 1997-04-09 | 2001-11-08 | Interactive talking dolls |
US10/200,696 Expired - Lifetime US6641454B2 (en) | 1997-04-09 | 2002-07-22 | Interactive talking dolls |
US10/658,043 Expired - Fee Related US7068941B2 (en) | 1997-04-09 | 2003-09-09 | Interactive talking dolls |
US11/206,532 Abandoned US20060009113A1 (en) | 1997-04-09 | 2005-08-18 | Interactive talking dolls |
US14/179,222 Expired - Fee Related US9067148B2 (en) | 1997-04-09 | 2014-02-12 | Interactive talking dolls |
Country Status (2)
Country | Link |
---|---|
US (10) | US6358111B1 (en) |
CA (1) | CA2225060A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040010413A1 (en) * | 2002-07-11 | 2004-01-15 | Takei Taka Y. | Action voice recorder |
US20050153624A1 (en) * | 2004-01-14 | 2005-07-14 | Wieland Alexis P. | Computing environment that produces realistic motions for an animatronic figure |
US20050287913A1 (en) * | 2004-06-02 | 2005-12-29 | Steven Ellman | Expression mechanism for a toy, such as a doll, having fixed or movable eyes |
US20110237154A1 (en) * | 2010-03-26 | 2011-09-29 | Nelson Gutierrez | My Best Friend Doll |
Families Citing this family (168)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060206219A1 (en) | 1995-05-30 | 2006-09-14 | Brown David W | Motion control systems and methods |
GB9700854D0 (en) * | 1997-01-16 | 1997-03-05 | Scient Generics Ltd | Sub-audible acoustic data transmission mechanism |
CA2225060A1 (en) * | 1997-04-09 | 1998-10-09 | Peter Suilun Fong | Interactive talking dolls |
US20010032278A1 (en) | 1997-10-07 | 2001-10-18 | Brown Stephen J. | Remote generation and distribution of command programs for programmable devices |
IL125221A0 (en) | 1998-07-06 | 1999-03-12 | Toy Control Ltd | Motion activation using passive sound source |
GB2342813A (en) * | 1998-08-13 | 2000-04-19 | Nigel Shane Bray | Interactive educational toys with wireless communication devices |
IL127569A0 (en) | 1998-09-16 | 1999-10-28 | Comsense Technologies Ltd | Interactive toys |
US6607136B1 (en) | 1998-09-16 | 2003-08-19 | Beepcard Inc. | Physical presence digital authentication system |
JP2002527012A (en) | 1998-10-02 | 2002-08-20 | Comsense Technologies Ltd. | Card for interaction with computer |
US7260221B1 (en) | 1998-11-16 | 2007-08-21 | Beepcard Ltd. | Personal communicator authentication |
US6149490A (en) * | 1998-12-15 | 2000-11-21 | Tiger Electronics, Ltd. | Interactive toy |
CN1151858C (en) | 1999-02-04 | 2004-06-02 | Interlego AG | Programmable toy with communication means |
US6729934B1 (en) * | 1999-02-22 | 2004-05-04 | Disney Enterprises, Inc. | Interactive character system |
US6178923B1 (en) * | 1999-05-18 | 2001-01-30 | Robert A. Plotkin | System and method for making live animals appear to talk |
GB9917985D0 (en) | 1999-07-30 | 1999-09-29 | Scient Generics Ltd | Acoustic communication system |
WO2001009863A1 (en) * | 1999-07-31 | 2001-02-08 | Linden Craig L | Method and apparatus for powered interactive physical displays |
EP1390929A2 (en) * | 1999-09-14 | 2004-02-25 | Aisynth Entertainment Inc. | Smart toys |
US6631351B1 (en) | 1999-09-14 | 2003-10-07 | Aidentity Matrix | Smart toys |
US8019609B2 (en) | 1999-10-04 | 2011-09-13 | Dialware Inc. | Sonic/ultrasonic authentication method |
US6879862B2 (en) * | 2000-02-28 | 2005-04-12 | Roy-G-Biv Corporation | Selection and control of motion data |
US8032605B2 (en) | 1999-10-27 | 2011-10-04 | Roy-G-Biv Corporation | Generation and distribution of motion commands over a distributed network |
US6702644B1 (en) * | 1999-11-15 | 2004-03-09 | All Season Toys, Inc. | Amusement device |
US6620024B2 (en) | 2000-02-02 | 2003-09-16 | Silverlit Toys Manufactory, Ltd. | Computerized toy |
US6736694B2 (en) * | 2000-02-04 | 2004-05-18 | All Season Toys, Inc. | Amusement device |
US8157610B1 (en) * | 2000-04-11 | 2012-04-17 | Disney Enterprises, Inc. | Location-sensitive toy and method therefor |
US6585556B2 (en) * | 2000-05-13 | 2003-07-01 | Alexander V Smirnov | Talking toy |
ES2172462B1 (en) * | 2000-05-18 | 2003-12-16 | Onilco Innovacion Sa | Dolls that converse with one another |
ES2172463A1 (en) * | 2000-05-18 | 2002-09-16 | Onilco Innovacion Sa | Doll that looks for and reacts to a pet |
US6551165B2 (en) | 2000-07-01 | 2003-04-22 | Alexander V Smirnov | Interacting toys |
US6482064B1 (en) * | 2000-08-02 | 2002-11-19 | Interlego Ag | Electronic toy system and an electronic ball |
US7042366B1 (en) * | 2000-09-06 | 2006-05-09 | Zilog, Inc. | Use of remote controls for audio-video equipment to control other devices |
WO2002043024A1 (en) * | 2000-11-23 | 2002-05-30 | Koninklijke Philips Electronics N.V. | Arrangement including a remote control device and a first electronic device |
AU2211102A (en) * | 2000-11-30 | 2002-06-11 | Scient Generics Ltd | Acoustic communication system |
US6555979B2 (en) * | 2000-12-06 | 2003-04-29 | L. Taylor Arnold | System and method for controlling electrical current flow as a function of detected sound volume |
JP3855653B2 (en) * | 2000-12-15 | 2006-12-13 | Yamaha Corporation | Electronic toys |
US6682387B2 (en) * | 2000-12-15 | 2004-01-27 | Silverlit Toys Manufactory, Ltd. | Interactive toys |
CA2367183A1 (en) * | 2001-01-10 | 2002-07-10 | Gary Rottger | Interactive television |
US7904194B2 (en) | 2001-02-09 | 2011-03-08 | Roy-G-Biv Corporation | Event management systems and methods for motion control systems |
US9219708B2 (en) | 2001-03-22 | 2015-12-22 | Dialware Inc. | Method and system for remotely authenticating identification devices |
US6682392B2 (en) * | 2001-04-19 | 2004-01-27 | Thinking Technology, Inc. | Physically interactive electronic toys |
TW572767B (en) * | 2001-06-19 | 2004-01-21 | Winbond Electronics Corp | Interactive toy |
WO2003000370A1 (en) * | 2001-06-25 | 2003-01-03 | Peter Sui Lun Fong | Interactive talking dolls |
US6836807B2 (en) * | 2001-10-30 | 2004-12-28 | Topseed Technology Corp. | Wireless receiving device and method jointly used by computer peripherals |
US6810436B2 (en) * | 2001-10-30 | 2004-10-26 | Topseed Technology Corp. | Wireless receiving device and method jointly used by computer peripherals |
DK1464172T3 (en) * | 2001-12-24 | 2013-06-24 | Intrasonics Sarl | Subtitle system |
US7297044B2 (en) * | 2002-08-26 | 2007-11-20 | Shoot The Moon Products Ii, Llc | Method, apparatus, and system to synchronize processors in toys |
US7137861B2 (en) * | 2002-11-22 | 2006-11-21 | Carr Sandra L | Interactive three-dimensional multimedia I/O device for a computer |
CA2517229A1 (en) * | 2003-03-12 | 2004-09-23 | Mattel Inc. | Interactive dvd gaming system |
US20050003733A1 (en) * | 2003-05-01 | 2005-01-06 | Janice Ritter | Elastic sound-making toy with rotatable appendages |
US6822154B1 (en) * | 2003-08-20 | 2004-11-23 | Sunco Ltd. | Miniature musical system with individually controlled musical instruments |
US7706548B2 (en) * | 2003-08-29 | 2010-04-27 | International Business Machines Corporation | Method and apparatus for computer communication using audio signals |
US20060064503A1 (en) | 2003-09-25 | 2006-03-23 | Brown David W | Data routing systems and methods |
US8027349B2 (en) | 2003-09-25 | 2011-09-27 | Roy-G-Biv Corporation | Database event driven motion systems |
CN2668210Y (en) * | 2003-10-29 | 2005-01-05 | 晓兴有限公司 | Music toy |
WO2005060558A2 (en) * | 2003-12-12 | 2005-07-07 | Uts, L.L.C. | Urinary transfer system and associated method of use |
US20050154594A1 (en) * | 2004-01-09 | 2005-07-14 | Beck Stephen C. | Method and apparatus of simulating and stimulating human speech and teaching humans how to talk |
JP2005221677A (en) * | 2004-02-04 | 2005-08-18 | Canon Inc | Image forming apparatus |
JP4386262B2 (en) * | 2004-02-04 | 2009-12-16 | Canon Inc | Image forming apparatus |
JP2005221676A (en) * | 2004-02-04 | 2005-08-18 | Canon Inc | Image forming apparatus and its controlling method |
JP4418689B2 (en) * | 2004-02-04 | 2010-02-17 | Canon Inc | Image forming apparatus |
WO2005115577A2 (en) | 2004-05-17 | 2005-12-08 | Steven Ellman | Tearing mechanism for a toy, such as a doll, having fixed or movable eyes |
US20060239469A1 (en) * | 2004-06-09 | 2006-10-26 | Assaf Gil | Story-telling doll |
DE102004035970A1 (en) * | 2004-07-23 | 2006-02-16 | Sirona Dental Systems GmbH | Method and device for processing a digitized workpiece, in particular three-dimensional models of tooth replacement parts to be produced |
US20060111184A1 (en) * | 2004-11-03 | 2006-05-25 | Peter Maclver | Gaming system |
US8277297B2 (en) | 2004-11-03 | 2012-10-02 | Mattel, Inc. | Gaming system |
US8382567B2 (en) * | 2004-11-03 | 2013-02-26 | Mattel, Inc. | Interactive DVD gaming systems |
US20060111183A1 (en) * | 2004-11-03 | 2006-05-25 | Peter Maclver | Remote control |
US7331857B2 (en) * | 2004-11-03 | 2008-02-19 | Mattel, Inc. | Gaming system |
US20060111166A1 (en) * | 2004-11-03 | 2006-05-25 | Peter Maclver | Gaming system |
GB2419827A (en) * | 2004-11-04 | 2006-05-10 | Peter James Williams | Electronically interactive action figures |
US20060114120A1 (en) * | 2004-11-04 | 2006-06-01 | Goldstone Marc B | System and method for the unpredictable remote control of devices |
US20060175753A1 (en) * | 2004-11-23 | 2006-08-10 | Maciver Peter | Electronic game board |
TWI249350B (en) * | 2004-12-30 | 2006-02-11 | Tatung Co Ltd | Control circuit and method thereof for reducing power consumption of a display device |
US7247783B2 (en) * | 2005-01-22 | 2007-07-24 | Richard Grossman | Cooperative musical instrument |
GB2424510A (en) * | 2005-03-24 | 2006-09-27 | Nesta | Interactive blocks. |
GB2425490A (en) * | 2005-04-26 | 2006-11-01 | Steven Lipman | Wireless communication toy |
WO2006114625A2 (en) * | 2005-04-26 | 2006-11-02 | Steven Lipman | Toys |
US20060287028A1 (en) * | 2005-05-23 | 2006-12-21 | Maciver Peter | Remote game device for dvd gaming systems |
FR2889347B1 (en) * | 2005-09-20 | 2007-09-21 | Jean Daniel Pages | SOUND SYSTEM |
US20080305873A1 (en) * | 2005-10-21 | 2008-12-11 | Zheng Yu Brian | Universal Toy Controller System And Methods |
US8157611B2 (en) * | 2005-10-21 | 2012-04-17 | Patent Category Corp. | Interactive toy system |
US7808385B2 (en) * | 2005-10-21 | 2010-10-05 | Patent Category Corp. | Interactive clothing system |
US20080153594A1 (en) * | 2005-10-21 | 2008-06-26 | Zheng Yu Brian | Interactive Toy System and Methods |
US20080139080A1 (en) * | 2005-10-21 | 2008-06-12 | Zheng Yu Brian | Interactive Toy System and Methods |
EP1940389A2 (en) | 2005-10-21 | 2008-07-09 | Braincells, Inc. | Modulation of neurogenesis by pde inhibition |
US20080303787A1 (en) * | 2005-10-21 | 2008-12-11 | Zheng Yu Brian | Touch Screen Apparatus And Methods |
US20080300061A1 (en) * | 2005-10-21 | 2008-12-04 | Zheng Yu Brian | Online Interactive Game System And Methods |
US8469766B2 (en) * | 2005-10-21 | 2013-06-25 | Patent Category Corp. | Interactive toy system |
US20070178966A1 (en) * | 2005-11-03 | 2007-08-02 | Kip Pohlman | Video game controller with expansion panel |
US20070213111A1 (en) * | 2005-11-04 | 2007-09-13 | Peter Maclver | DVD games |
US20070158911A1 (en) * | 2005-11-07 | 2007-07-12 | Torre Gabriel D L | Interactive role-play toy apparatus |
US20080263164A1 (en) * | 2005-12-20 | 2008-10-23 | Koninklijke Philips Electronics, N.V. | Method of Sending Motion Control Content in a Message, Message Transmitting Device and Message Rendering Device |
US20080014830A1 (en) * | 2006-03-24 | 2008-01-17 | Vladimir Sosnovskiy | Doll system with resonant recognition |
US8324492B2 (en) * | 2006-04-21 | 2012-12-04 | Vergence Entertainment Llc | Musically interacting devices |
GB2437823A (en) * | 2006-05-04 | 2007-11-07 | Mattel Inc | Wrist cover for simulating riding a vehicle |
US20080032276A1 (en) * | 2006-07-21 | 2008-02-07 | Yu Zheng | Interactive system |
US20080032275A1 (en) * | 2006-07-21 | 2008-02-07 | Yu Zheng | Interactive system |
US20080053286A1 (en) * | 2006-09-06 | 2008-03-06 | Mordechai Teicher | Harmonious Music Players |
US8307295B2 (en) * | 2006-10-03 | 2012-11-06 | Interbots Llc | Method for controlling a computer generated or physical character based on visual focus |
US20080082214A1 (en) * | 2006-10-03 | 2008-04-03 | Sabrina Haskell | Method for animating a robot |
US20080082301A1 (en) * | 2006-10-03 | 2008-04-03 | Sabrina Haskell | Method for designing and fabricating a robot |
US8177601B2 (en) * | 2006-11-01 | 2012-05-15 | Penny Ekstein-Lieberman | Peek-a-boo doll with dual activation |
TW200829319A (en) * | 2007-01-05 | 2008-07-16 | Allgates Semiconductor Inc | Control system of interactive toy set with online instant messaging |
US7909697B2 (en) * | 2007-04-17 | 2011-03-22 | Patent Category Corp. | Hand-held interactive game |
GB2448883A (en) * | 2007-04-30 | 2008-11-05 | Sony Comp Entertainment Europe | Interactive toy and entertainment device |
US20080288989A1 (en) * | 2007-05-14 | 2008-11-20 | Zheng Yu Brian | System, Methods and Apparatus for Video Communications |
US20080288870A1 (en) * | 2007-05-14 | 2008-11-20 | Yu Brian Zheng | System, methods, and apparatus for multi-user video communications |
IL184052A (en) * | 2007-06-19 | 2014-08-31 | E N T T Ltd | System and method for audio animation |
US8128500B1 (en) * | 2007-07-13 | 2012-03-06 | Ganz | System and method for generating a virtual environment for land-based and underwater virtual characters |
GB0714148D0 (en) | 2007-07-19 | 2007-08-29 | Lipman Steven | interacting toys |
US20090030808A1 (en) * | 2007-07-26 | 2009-01-29 | Shinyoung Park | Customized toy pet |
TW200907705A (en) * | 2007-08-13 | 2009-02-16 | Chu-Hsin Peng | Modeling multimedia storage with player function and multimedia player with modeling-looking |
US20090117819A1 (en) * | 2007-11-07 | 2009-05-07 | Nakamura Michael L | Interactive toy |
US8926395B2 (en) * | 2007-11-28 | 2015-01-06 | Patent Category Corp. | System, method, and apparatus for interactive play |
US8092271B2 (en) * | 2007-12-20 | 2012-01-10 | Hallmark Cards, Incorporated | Interactive toy with positional sensor |
WO2009091275A1 (en) * | 2008-01-14 | 2009-07-23 | Vladimir Anatolevich Matveev | New-year game |
US8583956B2 (en) * | 2008-01-31 | 2013-11-12 | Peter Sui Lun Fong | Interactive device with local area time synchronization capability |
US8046620B2 (en) * | 2008-01-31 | 2011-10-25 | Peter Sui Lun Fong | Interactive device with time synchronization capability |
US20100041304A1 (en) * | 2008-02-13 | 2010-02-18 | Eisenson Henry L | Interactive toy system |
US20090209165A1 (en) * | 2008-02-15 | 2009-08-20 | Dixon Adrienne M | Scriptural speaking inspirational figurine |
US8172637B2 (en) * | 2008-03-12 | 2012-05-08 | Health Hero Network, Inc. | Programmable interactive talking device |
KR100995807B1 (en) * | 2008-03-28 | 2010-11-22 | Sungkyunkwan University Industry-Academic Cooperation Foundation | Daily contents updating teller toy and method for operating the same |
GB2460306B (en) | 2008-05-29 | 2013-02-13 | Intrasonics Sarl | Data embedding system |
US7878878B2 (en) * | 2008-07-07 | 2011-02-01 | Massaro Darren S | Life size halloween novelty item |
US8354918B2 (en) | 2008-08-29 | 2013-01-15 | Boyer Stephen W | Light, sound, and motion receiver devices |
RU2011116297A (en) * | 2008-10-06 | 2012-11-20 | Vergence Entertainment LLC (US) | SYSTEM FOR MUSICAL INTERACTION OF AVATARS |
CN101770705B (en) * | 2009-01-05 | 2013-08-21 | Hongfujin Precision Industry (Shenzhen) Co., Ltd. | Audio playing device with interaction function and interaction method thereof |
US8391467B2 (en) * | 2009-03-25 | 2013-03-05 | Koplar Interactive Systems International L.L.C. | Methods and systems for encoding and decoding audio signals |
EP2236736B8 (en) | 2009-03-30 | 2018-02-14 | Vallourec Drilling Products France | Wired drill pipe |
US8548613B2 (en) * | 2009-04-20 | 2013-10-01 | Disney Enterprises, Inc. | System and method for an interactive device for use with a media device |
CA2675913A1 (en) * | 2009-08-20 | 2011-02-20 | Thinking Technology Inc. | Interactive talking toy with moveable and detachable body parts |
GB2475273B (en) | 2009-11-12 | 2011-09-28 | Liberation Consulting Ltd | Toy systems and position systems |
TW201120670A (en) * | 2009-12-10 | 2011-06-16 | Inst Information Industry | Figure interaction systems and methods, and computer program products thereof |
US9144746B2 (en) | 2010-08-20 | 2015-09-29 | Mattel, Inc. | Toy with locating feature |
US8444452B2 (en) * | 2010-10-25 | 2013-05-21 | Hallmark Cards, Incorporated | Wireless musical figurines |
US8983088B2 (en) * | 2011-03-31 | 2015-03-17 | Jeffrey B. Conrad | Set of interactive coasters |
US20120270466A1 (en) * | 2011-04-25 | 2012-10-25 | Spin Master Ltd. | System for automatically tracking a moving toy vehicle |
US20130268119A1 (en) * | 2011-10-28 | 2013-10-10 | Tovbot | Smartphone and internet service enabled robot systems and methods |
US8568192B2 (en) * | 2011-12-01 | 2013-10-29 | In-Dot Ltd. | Method and system of managing a game session |
US8371897B1 (en) * | 2012-01-19 | 2013-02-12 | Silverlit Limited | Vision technology for interactive toys |
US20130280985A1 (en) * | 2012-04-24 | 2013-10-24 | Peter Klein | Bedtime toy |
US8912419B2 (en) * | 2012-05-21 | 2014-12-16 | Peter Sui Lun Fong | Synchronized multiple device audio playback and interaction |
US8454406B1 (en) * | 2012-05-24 | 2013-06-04 | Sap Link Technology Corp. | Chorusing toy system |
US20130331001A1 (en) * | 2012-06-11 | 2013-12-12 | Eitan Lev | Play System Representing a Character |
US9039483B2 (en) | 2012-07-02 | 2015-05-26 | Hallmark Cards, Incorporated | Print-level sensing for interactive play with a printed image |
US20140011423A1 (en) * | 2012-07-03 | 2014-01-09 | Uneeda Doll Company, Ltd. | Communication system, method and device for toys |
GB2507073B (en) | 2012-10-17 | 2017-02-01 | China Ind Ltd | Interactive toy |
US9616352B2 (en) * | 2012-11-27 | 2017-04-11 | Giggles International Limited | Interactive talking toy |
GB2511479A (en) * | 2012-12-17 | 2014-09-10 | Librae Ltd | Interacting toys |
US20150147936A1 (en) * | 2013-11-22 | 2015-05-28 | Cepia Llc | Autonomous Toy Capable of Tracking and Interacting With a Source |
US9636599B2 (en) | 2014-06-25 | 2017-05-02 | Mattel, Inc. | Smart device controlled toy |
US9108115B1 (en) | 2014-08-25 | 2015-08-18 | Silverlit Limited | Toy responsive to blowing or sound |
US9931572B2 (en) | 2014-09-15 | 2018-04-03 | Future of Play Global Limited | Systems and methods for interactive communication between an object and a smart device |
KR101657617B1 (en) * | 2015-02-16 | 2016-09-30 | Simplex Internet Co., Ltd. | System, Apparatus and Method for Input information Based on Sound Wave |
CN104941204B (en) * | 2015-07-09 | 2018-09-28 | Shanghai Weiju Network Technology Co., Ltd. | Intelligent toy system and interaction method thereof |
US10272349B2 (en) | 2016-09-07 | 2019-04-30 | Isaac Davenport | Dialog simulation |
US10111035B2 (en) | 2016-10-03 | 2018-10-23 | Isaac Davenport | Real-time proximity tracking using received signal strength indication |
KR101815425B1 (en) | 2017-08-21 | 2018-01-30 | 백원선 | Talking family dolls with interactive functions |
CN107583291B (en) * | 2017-09-29 | 2023-05-02 | 深圳希格玛和芯微电子有限公司 | Toy interaction method and device and toy |
KR102676962B1 (en) | 2018-04-06 | 2024-06-19 | 머슬 랩 캐나다 인코포레이티드 | Integrated pipetting device |
US20190314732A1 (en) * | 2018-04-12 | 2019-10-17 | Intellifect Incorporated | Emotionally Responsive Electronic Toy |
US20190335714A1 (en) * | 2018-05-07 | 2019-11-07 | Radio Systems Corporation | Sound generating pet toy |
US10970486B2 (en) | 2018-09-18 | 2021-04-06 | Salesforce.Com, Inc. | Using unstructured input to update heterogeneous data stores |
US10981073B2 (en) | 2018-10-22 | 2021-04-20 | Disney Enterprises, Inc. | Localized and standalone semi-randomized character conversations |
US10500513B1 (en) * | 2018-12-07 | 2019-12-10 | Tomy International, Inc. | Interactive sound generating toy |
CN109663368B (en) * | 2019-01-03 | 2024-02-09 | Dongguan Silverlit Toys Co., Ltd. | Intelligent toy following method and toy robot applying same |
US11123647B2 (en) * | 2019-02-04 | 2021-09-21 | Disney Enterprises, Inc. | Entertainment system including performative figurines |
US20230050509A1 (en) * | 2021-08-10 | 2023-02-16 | Sophie Amayaka | Inspiration Quotes Delivering Toy |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3739521A (en) | 1971-11-30 | 1973-06-19 | Mattel Inc | Starting switch for toys |
US3796284A (en) | 1972-03-10 | 1974-03-12 | Mattel Inc | Starting mechanism for toy with phonograph |
US4231184A (en) | 1977-07-07 | 1980-11-04 | Horsman Dolls Inc. | Remote-control doll assembly |
US4221927A (en) * | 1978-08-08 | 1980-09-09 | Scott Dankman | Voice responsive "talking" toy |
JPS5661276U (en) | 1979-10-17 | 1981-05-25 | ||
US4516950A (en) | 1982-01-27 | 1985-05-14 | Ergoplic Ltd., An Israel Company | Speaking toy employing chordic input |
US4451911A (en) | 1982-02-03 | 1984-05-29 | Mattel, Inc. | Interactive communicating toy figure device |
JPS60128699U (en) | 1984-02-07 | 1985-08-29 | Tomy Co., Ltd. | Radio controlled toy |
US4754133A (en) * | 1986-04-25 | 1988-06-28 | Williams Electronics Games, Inc. | Transceiver circuit for modulated infrared signals |
US5029214A (en) * | 1986-08-11 | 1991-07-02 | Hollander James F | Electronic speech control apparatus and methods |
US4857030A (en) | 1987-02-06 | 1989-08-15 | Coleco Industries, Inc. | Conversing dolls |
US4751353A (en) | 1987-02-06 | 1988-06-14 | Coleco Industries, Inc. | Doll or the like with position and motion sensing switch |
US4840602A (en) * | 1987-02-06 | 1989-06-20 | Coleco Industries, Inc. | Talking doll responsive to external signal |
US4878871A (en) | 1988-04-22 | 1989-11-07 | Noto Nancy C | Toy for conveying personalized message |
US4923428A (en) * | 1988-05-05 | 1990-05-08 | Cal R & D, Inc. | Interactive talking toy |
US4930019A (en) * | 1988-11-29 | 1990-05-29 | Chi Wai Chu | Multiple-user interactive audio/video apparatus with automatic response units |
JP2899013B2 (en) * | 1989-07-03 | 1999-06-02 | Toshihiro Tsumura | Position information transmission system for moving objects |
JP2516425Y2 (en) | 1990-12-11 | 1996-11-06 | Takara Co., Ltd. | Operating device |
JPH07114852B2 (en) | 1991-04-23 | 1995-12-13 | Bandai Co., Ltd. | Conversational toys |
US5209695A (en) | 1991-05-13 | 1993-05-11 | Omri Rothschild | Sound controllable apparatus particularly useful in controlling toys and robots |
CA2058839A1 (en) | 1992-01-08 | 1993-07-08 | Wing Fan Lam | Toy doll |
US5328401A (en) | 1992-03-23 | 1994-07-12 | Demars Robert A | Blushing toy |
US5281143A (en) * | 1992-05-08 | 1994-01-25 | Toy Biz, Inc. | Learning doll |
EP0708673A1 (en) * | 1992-10-19 | 1996-05-01 | JANI, Jeffrey, Scott | Video and radio controlled moving and talking device |
JP3201028B2 (en) * | 1992-12-28 | 2001-08-20 | Casio Computer Co., Ltd. | Image creation apparatus and face image communication method |
US5647787A (en) * | 1993-10-13 | 1997-07-15 | Raviv; Roni | Sound controlled toy |
US5376038A (en) | 1994-01-18 | 1994-12-27 | Toy Biz, Inc. | Doll with programmable speech activated by pressure on particular parts of head and body |
US5495357A (en) * | 1994-02-14 | 1996-02-27 | Machina, Inc. | Apparatus and method for recording, transmitting, receiving and playing sounds |
US5587545A (en) * | 1994-03-10 | 1996-12-24 | Kabushiki Kaisha B-Ai | Musical toy with sound producing body |
US6471420B1 (en) | 1994-05-13 | 2002-10-29 | Matsushita Electric Industrial Co., Ltd. | Voice selection apparatus voice response apparatus, and game apparatus using word tables from which selected words are output as voice selections |
TW312063B (en) * | 1995-08-31 | 1997-08-01 | Sony Co Ltd | |
US5636994A (en) * | 1995-11-09 | 1997-06-10 | Tong; Vincent M. K. | Interactive computer controlled doll |
US5752880A (en) * | 1995-11-20 | 1998-05-19 | Creator Ltd. | Interactive doll |
US5746602A (en) * | 1996-02-27 | 1998-05-05 | Kikinis; Dan | PC peripheral interactive doll |
JPH09276555A (en) | 1996-04-12 | 1997-10-28 | M S C:Kk | Conversation toy |
JPH10135909A (en) * | 1996-10-31 | 1998-05-22 | Sony Corp | Optical signal transmitter, optical signal receiver, optical signal transmitter and optical signal transmission method |
CA2225060A1 (en) * | 1997-04-09 | 1998-10-09 | Peter Suilun Fong | Interactive talking dolls |
US6110000A (en) * | 1998-02-10 | 2000-08-29 | T.L. Products Promoting Co. | Doll set with unidirectional infrared communication for simulating conversation |
JPH11239107A (en) * | 1998-02-23 | 1999-08-31 | Taiyo Yuden Co Ltd | Two-way optical communication equipment and optical remote controller |
US6089942A (en) * | 1998-04-09 | 2000-07-18 | Thinking Technology, Inc. | Interactive toys |
TW477237U (en) * | 2000-07-05 | 2002-02-21 | Elan Microelectronics Corp | Interactive toy device using ultrasound for transmitting signals |
US7297044B2 (en) * | 2002-08-26 | 2007-11-20 | Shoot The Moon Products Ii, Llc | Method, apparatus, and system to synchronize processors in toys |
US6822154B1 (en) * | 2003-08-20 | 2004-11-23 | Sunco Ltd. | Miniature musical system with individually controlled musical instruments |
US9309275B2 (en) | 2013-03-04 | 2016-04-12 | Idenix Pharmaceuticals Llc | 3′-deoxy nucleosides for the treatment of HCV |
1997
- 1997-12-17 CA CA 2225060 patent/CA2225060A1/en not_active Abandoned
2000
- 2000-10-10 US US09/685,526 patent/US6358111B1/en not_active Expired - Lifetime
- 2000-10-10 US US09/685,527 patent/US6309275B1/en not_active Expired - Lifetime
2001
- 2001-06-13 US US09/880,425 patent/US6454625B1/en not_active Expired - Lifetime
- 2001-06-18 US US09/883,762 patent/US6497604B2/en not_active Expired - Lifetime
- 2001-07-24 US US09/876,367 patent/US6375535B1/en not_active Expired - Lifetime
- 2001-11-08 US US10/008,879 patent/US6497606B2/en not_active Expired - Lifetime
2002
- 2002-07-22 US US10/200,696 patent/US6641454B2/en not_active Expired - Lifetime
2003
- 2003-09-09 US US10/658,043 patent/US7068941B2/en not_active Expired - Fee Related
2005
- 2005-08-18 US US11/206,532 patent/US20060009113A1/en not_active Abandoned
2014
- 2014-02-12 US US14/179,222 patent/US9067148B2/en not_active Expired - Fee Related
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040010413A1 (en) * | 2002-07-11 | 2004-01-15 | Takei Taka Y. | Action voice recorder |
US20050153624A1 (en) * | 2004-01-14 | 2005-07-14 | Wieland Alexis P. | Computing environment that produces realistic motions for an animatronic figure |
US8374724B2 (en) * | 2004-01-14 | 2013-02-12 | Disney Enterprises, Inc. | Computing environment that produces realistic motions for an animatronic figure |
US20050287913A1 (en) * | 2004-06-02 | 2005-12-29 | Steven Ellman | Expression mechanism for a toy, such as a doll, having fixed or movable eyes |
US20070254554A1 (en) * | 2004-06-02 | 2007-11-01 | Steven Ellman | Expression mechanism for a toy, such as a doll, having fixed or movable eyes |
US20110237154A1 (en) * | 2010-03-26 | 2011-09-29 | Nelson Gutierrez | My Best Friend Doll |
Also Published As
Publication number | Publication date |
---|---|
US20060009113A1 (en) | 2006-01-12 |
US6497606B2 (en) | 2002-12-24 |
US20040082255A1 (en) | 2004-04-29 |
US6309275B1 (en) | 2001-10-30 |
US6358111B1 (en) | 2002-03-19 |
US9067148B2 (en) | 2015-06-30 |
US6641454B2 (en) | 2003-11-04 |
US20020187722A1 (en) | 2002-12-12 |
US20020052163A1 (en) | 2002-05-02 |
US20020024447A1 (en) | 2002-02-28 |
CA2225060A1 (en) | 1998-10-09 |
US6497604B2 (en) | 2002-12-24 |
US6454625B1 (en) | 2002-09-24 |
US20140179196A1 (en) | 2014-06-26 |
US6375535B1 (en) | 2002-04-23 |
US7068941B2 (en) | 2006-06-27 |
US20010034180A1 (en) | 2001-10-25 |
Similar Documents
Publication | Title |
---|---|
US6358111B1 (en) | Interactive talking dolls |
US6110000A (en) | Doll set with unidirectional infrared communication for simulating conversation |
US6409636B1 (en) | Electronic jump rope |
US8715031B2 (en) | Interactive device with sound-based action synchronization |
US5437552A (en) | Interactive audio-visual work |
US4840602A (en) | Talking doll responsive to external signal |
US7695338B2 (en) | Remote controlled toy |
US20010041496A1 (en) | Talking toy |
US8821209B2 (en) | Interactive device with sound-based action synchronization |
AU2007240321A1 (en) | Musically interacting devices |
US6108515A (en) | Interactive responsive apparatus with visual indicia, command codes, and comprehensive memory functions |
CA2113329A1 (en) | Talking playset |
WO2003000370A1 (en) | Interactive talking dolls |
JP3066762U (en) | Conversation toys |
CN2662963Y (en) | Voice toy |
RU2218202C2 (en) | Device for audio control of toy |
JPH10328421A (en) | Automatically responding toy |
JPH0356000Y2 (en) | |
TWI336266B (en) | The control method of an interactive intellectual robotic toy |
JPH07175567A (en) | Input device |
JP2001264466A (en) | Voice processing device |
JPH09747A (en) | Doll toy for performing plural different speeches and actions by the same contact means |
JPH04150885A (en) | Imitation sound reproducing toy |
WO2000074020A1 (en) | Toy telephone educational or amusement apparatus |
JP2001265372A (en) | Sound processor |
Legal Events
Code | Title | Description |
---|---|---|
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
FPAY | Fee payment | Year of fee payment: 4 |
FPAY | Fee payment | Year of fee payment: 8 |
FPAY | Fee payment | Year of fee payment: 12 |