WO1999010065A2 - Interactive talking toy - Google Patents

Interactive talking toy

Info

Publication number
WO1999010065A2
Authority
WO
WIPO (PCT)
Prior art keywords
computer
fanciful
toy
action
controlled
Application number
PCT/IL1998/000406
Other languages
French (fr)
Other versions
WO1999010065A3 (en)
Inventor
Oz Gabai
Moshe Cohen
Jacob Gabai
Dov Shlomo Eylath
Nimrod Sandlerman
Original Assignee
Creator Ltd.
Priority claimed from IL12164297A external-priority patent/IL121642A0/en
Application filed by Creator Ltd. filed Critical Creator Ltd.
Priority to AU88834/98A priority Critical patent/AU8883498A/en
Priority to EP98940531A priority patent/EP0935492A4/en
Publication of WO1999010065A2 publication Critical patent/WO1999010065A2/en
Publication of WO1999010065A3 publication Critical patent/WO1999010065A3/en


Classifications

    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L13/00 - Speech synthesis; Text to speech systems
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 - Speech recognition
    • G10L15/26 - Speech to text systems
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1025 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals; details of the interface with the game device, e.g. USB version detection
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 - Computerized interactive toys, e.g. dolls
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 - Speech recognition
    • G10L15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue

Definitions

  • the present invention relates to toys in general, and particularly to computer-controlled toys with a capacity for speech.
  • toys which are remotely controlled by wireless communication and which are not used in conjunction with a computer system.
  • such toys include vehicles whose motion is controlled by a human user via a remote control device.
  • Haugerud describes computer control of a toy via a wired connection, wherein the user of the computer typically writes a simple program to control movement of a robot.
  • US Patent 4,840,602 to Rose describes a talking doll responsive to an external signal, in which the doll has a vocabulary stored in digital data in a memory which may be accessed to cause a speech synthesizer in the doll to simulate speech.
  • control signals to the animated character to provide speech, hearing, vision and movement in real-time.
  • US Patent 5,388,493 describes a system for a housing for a vertical dual keyboard MIDI wireless controller for accordionists.
  • the system may be used with either a conventional MIDI cable connection or by a wireless MIDI transmission system.
  • German Patent DE 3009-040 to Neuhierl describes a device for adding the capability to transmit sound from a remote control to a controlled model vehicle.
  • the sound is generated by means of a microphone or a tape recorder and transmitted to the controlled model vehicle by means of radio communications.
  • the model vehicle is equipped with a speaker that emits the received sounds.
  • the present invention seeks to provide an improved computer-controlled toy system with a capacity for modifying a known language and/or speaking in a previously unknown or whimsical language.
  • the computer or computer-controlled toy may be any suitable computer or computer-controlled toy.
  • a user may interact with the computer or computer-controlled toy by speaking in the modified language.
  • the computer-controlled toy "demodifies" the speech to arrive at an associated English word.
  • the computer or computer-controlled toy may perform an action based on modified or demodified language.
  • a computer or computer-controlled toy speaks a language with an increasing level of complexity.
  • the present invention also seeks to provide an improved computer-controlled toy system with a capacity for speaking, learning, and generating languages other than those used in common discourse.
  • a computer or computer-controlled toy is configured with a set of actions or concepts such as jumping, going home, anger, etc. and a vocabulary in a known language such as English, a previously unknown language such as "Martian," or both types of languages.
  • the computer or computer-controlled toy is further capable of introducing an action to a user together with a preselected, randomly selected, or other generated word from one or all languages known to it.
  • the computer or computer-controlled toy is additionally or alternatively capable of receiving a word chosen by the user for association with the action.
  • the computer or computer-controlled toy may maintain associations between actions and words that represent actions for later repetition.
  • Words of any language known to the computer or computer-controlled toy may have an associated level of complexity for controlling what words are available to the computer or computer-controlled toy over time.
  • the user may then command the computer or computer-controlled toy using the private language.
  • the computer or computer-controlled toy makes up a language for each of a
  • predefined and/or user defined base language units comprising monosyllabic or
  • Base language units may be predefined together with a complexity designation (e.g., those with more syllables, more
  • the user provides the computer or computer-controlled toy with made-up
  • the computer or computer-controlled toy interprets user speech by searching made-up, modified, and/or known languages, possibly in a particular order.
  • the user may give a cue to indicate that he is using and wishes to be understood using a particular
  • the present invention provides a toy with developing skills, the toy including a fanciful figure having a capacity to perform an action.
  • action control circuitry operative to control the fanciful figure to perform the action at different levels of skill at different times.
  • the capacity to perform an action includes a capacity to talk.
  • the action control circuitry is operative to control the fanciful figure to perform the action at an increasing level of skill over time.
  • Additionally, in accordance with a preferred embodiment of the present invention, the action includes talking and the fanciful figure is operative to increase its vocabulary over time.
  • the capacity to perform an action includes performing at least one physical action in response to an oral stimulus.
  • a system for interacting with a computer-controlled fanciful figure including at least one fanciful figure, at least one speech output apparatus, and at least one computer operative to control the fanciful figure and provide a speech output associated with the fanciful figure via the at least one speech output apparatus, wherein the speech output is in a special language.
  • the special language is at least partly generated by the at least one computer.
  • the special language is at least partly generated by modifying at least one known language according to at least one language modification rule.
  • the at least one computer is operative to receive the at least one language modification rule.
  • the at least one computer is operative to provide the at least one language modification rule to a user.
  • the at least one fanciful figure is action induceable for producing an action.
  • the action includes a movement.
  • the action includes a sound.
  • the action includes a light emission.
  • the speech output is identifiable with the action.
  • the at least one computer is operative to induce the fanciful figure to produce the action.
  • the user induces the fanciful figure to produce the action and the at least one computer is operative to detect the action.
  • the computer is operative to receive a speech input via the at least one speech input apparatus.
  • the speech input is identifiable with the action.
  • the at least one computer is additionally operative to translate between the
  • the at least one fanciful figure is displayable on a computer display.
  • the speech output apparatus is assembled with the at least one computer.
  • Additionally, in accordance with a preferred embodiment of the present invention, the fanciful figure is a toy in communication with the at least one computer.
  • the at least one computer is assembled with the toy.
  • the toy includes at least one appendage that is actuable.
  • the toy includes at least one appendage that is articulatable.
  • the speech output apparatus is assembled with the toy.
  • the language is a previously unknown language.
  • the speech input apparatus is assembled with the toy.
  • the at least one fanciful figure includes a plurality of fanciful figures.
  • the speech input apparatus is assembled with the at least one computer.
  • the special language is preassembled with the at least one computer.
  • a method of playing with a toy including selecting an action having an associated skill level, controlling a fanciful figure to perform the action at the associated skill level, and increasing the skill level over time.
  • the selecting step includes selecting a talking action.
  • the increasing step includes increasing a vocabulary over time.
  • a method of playing with a toy including providing at least one fanciful figure and controlling speech output apparatus to provide a speech output associated with the fanciful figure, wherein the speech output is in a special language.
  • the controlling step includes generating at least part of the special language.
  • the generating step includes generating the at least part of the special language by modifying at least one known language according to at least one language modification rule.
  • the generating step includes generating the at least part of the special language
  • the method includes controlling the at least one fanciful figure to perform an action.
  • a method of playing with a toy including providing at least one fanciful figure, controlling the at least one fanciful figure to produce an action, and accepting at least one speech input for association with the action.
  • the method includes controlling speech output apparatus to provide a speech output associated with the fanciful figure.
  • a wireless computer controlled toy system including a computer system operative to transmit a first transmission via a first wireless transmitter, and at least one toy including a first wireless receiver, the toy receiving the first transmission via the first wireless receiver and operative to carry out at least one action based on the first transmission.
  • the computer system may include a computer game.
  • the toy may include
  • the at least one action may include a plurality of actions.
  • the first transmission may include a digital signal.
  • the computer system includes a computer having a MIDI port and wherein the
  • the computer may be operative to transmit the digital signal by way of the MIDI port.
  • the sound includes music, a pre-recorded sound and/or speech.
  • the speech may include recorded speech and synthesized speech.
  • the at least one toy has a plurality of states including at least a sleep state and an awake state.
  • the first transmission includes a state transition command.
  • the at least one action includes transitioning between the sleep state and the awake state.
  • a sleep state may typically include a state in which the toy consumes a reduced amount of power.
  • an awake state is typically a state of normal operation.
  • the computer system includes a plurality of computers.
  • the first transmission includes computer identification data and the second transmission includes computer identification data.
  • the at least one toy is operative to transmit a second transmission via a second wireless transmitter and the computer system is operative to receive the second transmission via a second wireless receiver.
  • the at least one toy includes at least a first toy and a second toy, and wherein the first toy is operative to transmit a toy-to-toy transmission to the second toy via the second wireless transmitter.
  • the second toy is operative to carry out at least one action based on the toy-to-toy transmission.
  • operation of the computer system is controlled, at least in part, by the second transmission.
  • the computer system includes a computer game, and wherein operation of the computer game is controlled, at least in part, by the second transmission.
  • the second transmission may include a digital signal and/or an analog signal.
  • the computer system has a plurality of states including at least a sleep state and an awake state.
  • the second transmission includes a state transition command.
  • the at least one toy includes sound input apparatus, and the second transmission includes a sound signal received via the sound input apparatus.
  • the computer system is also operative to perform at least one of the following actions: manipulate the sound signal; and play the sound signal.
  • the sound includes speech
  • the computer system is operative to perform a speech recognition operation on the speech.
  • the second transmission includes toy identification data, and the computer system is operative to identify the at least one toy based, at least in part, on the toy identification data.
  • the first transmission includes toy identification data.
  • the computer system may adapt a mode of operation thereof based, at least in part, on the toy identification data.
  • the at least one action may include movement of the toy or movement of a part of the toy.
  • the sound may be transmitted using a MIDI protocol.
  • a game system including a computer system operative to control a computer game and having a display operative to display at least one display object, and at least one toy in wireless communication with the computer system, the computer game including a plurality of game objects, and the plurality of game objects includes the at least one display object and the at least one toy.
  • the at least one toy is operative to transmit toy identification data to the computer system, and the computer system is operative to adapt the computer game based, at least in part, on the toy identification data.
  • the computer system may include a plurality of computers.
  • the first transmission includes computer identification data and the second transmission includes computer identification data.
  • MIDI (Musical Instrument Digital Interface)
  • apparatus including MIDI apparatus operative to receive and transmit MIDI data between a first MIDI device and a second MIDI device, wherein the first wireless apparatus is operative to transmit MIDI data including data received from the first MIDI device to the second wireless apparatus, and to transmit MIDI data including data received from the second wireless apparatus to the first MIDI device, and the second wireless apparatus is operative to transmit MIDI data including data received from the second MIDI device to the first wireless apparatus, and to transmit MIDI data including data received from the first wireless apparatus to the second MIDI device.
  • the second wireless apparatus includes a plurality of wirelesses, each associated with a MIDI device.
  • each of the second plurality of wirelesses is operative to transmit MIDI data including data received from the associated MIDI device to the first wireless apparatus, and to transmit MIDI data including data received from the first wireless apparatus to the associated MIDI device.
  • the first MIDI device may include a computer, while the second MIDI device may include a toy.
  • the first wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the first wireless and a first analog device, the second wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the second wireless and a second analog device, and the first wireless apparatus is also operative to transmit analog signals including data received from the first analog device to the second wireless apparatus, and to transmit analog signals including data received from the second wireless apparatus to the first analog device; a sketch of such a relay appears below.
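  • As an illustration only, the bidirectional relay described above can be sketched as follows; the I/O callables (read_midi, write_midi, radio_send, radio_recv) are hypothetical stand-ins for the hardware interfaces, not apparatus defined in the patent:

      # Hypothetical sketch of one relay iteration for a wireless unit:
      # bytes from the local MIDI device go out over the radio link, and
      # bytes arriving over the radio link go to the local MIDI device.
      def relay_once(read_midi, write_midi, radio_send, radio_recv):
          data = read_midi()          # bytes from the local MIDI device, if any
          if data:
              radio_send(data)        # forward to the peer wireless unit
          data = radio_recv()         # bytes arriving from the peer unit, if any
          if data:
              write_midi(data)        # deliver to the local MIDI device

      # Running relay_once() in a loop on both units yields a transparent
      # MIDI link: computer <-> first unit <-> radio <-> second unit <-> toy.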
  • a method for generating control instructions for a computer controlled toy system includes selecting a toy, selecting at least one command from among a plurality of commands associated with the toy, and generating control instructions for the toy including the at least one command.
  • the step of selecting at least one command includes choosing a command.
  • the step of selecting at least one command includes utilizing a graphical user interface.
  • the at least one control parameter includes an execution condition controlling execution of the command.
  • the execution condition may include a time at which to perform the command; the execution condition may also include a status of the toy.
  • the at least one control parameter includes a command modifier modifying execution of the command.
  • the at least one control parameter includes a condition dependent on a future event.
  • the at least one command includes a command to cancel a previous command.
  • a signal transmission apparatus for use in conjunction with a computer, the apparatus including wireless transmission apparatus and signal processing apparatus including at least one of the following: analog/digital sound conversion apparatus operative to convert analog sound signals to digital sound signals, to convert digital sound signals to analog sound signals, and to transmit the signals between the computer and the wireless transmission apparatus; a peripheral control interface operative to transmit control signals between the computer and a peripheral device using the wireless transmission apparatus; and a MIDI interface operative to transmit MIDI signals between the computer and the wireless transmission apparatus.
  • a computer system including a computer, and a sound card operatively attached to the computer and having a MIDI connector and at least one analog connector.
  • the computer is also operative to receive digital signals by means of the MIDI connector and to receive analog signals by means of the at least one analog connector.
  • Figs. 1 - 32C illustrate a toy system for use in conjunction with a
  • Fig. 1A is a partly pictorial, partly block diagram illustration of a
  • Fig. 1B is a partly pictorial, partly block diagram illustration of a preferred
  • FIG. 1C is a partly pictorial, partly block diagram illustration of a
  • FIGS. 2A - 2C are simplified pictorial illustrations of a portion of the
  • Fig. 3 is a simplified block diagram of a preferred implementation of the
  • Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3;
  • Figs. 5A - 5D taken together comprise a schematic diagram of the apparatus of Fig. 4;
  • Fig. 5E is a schematic diagram of an alternative implementation of the
  • Fig. 6 is a simplified block diagram of a preferred implementation of the
  • Figs. 7A - 7F taken together with either Fig. 5D or Fig. 5E, comprise a
  • Fig. 8A is a simplified flowchart illustration of a preferred method for
  • Figs. 8B - 8T taken together, comprise a simplified flowchart illustration
  • Fig. 9A is a simplified flowchart illustration of a preferred method for
  • Figs. 9B - 9N taken together with Figs. 8D - 8M, comprise a simplified
  • Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted
  • Fig. 11 is a simplified flowchart illustration of a preferred method for
  • Figs. 12A - 12C are pictorial illustrations of a preferred implementation of
  • Fig. 13 is a block diagram of a first sub-unit of a multi-port multi-channel computer radio interface;
  • Fig. 14 is a block diagram of a second sub-unit of a multi-port multi-channel computer radio interface;
  • Fig. 16 is a simplified flowchart illustration of a preferred method by
  • FIG. 17 is a simplified flowchart illustration of a preferred method for implementing the "select control channel pair" step of Fig. 16;
  • Fig. 18A is a simplified flowchart illustration of a preferred method for
  • Fig. 18B is a simplified flowchart illustration of a preferred method for
  • Fig. 19 is a simplified flowchart illustration of a preferred method of
  • Fig. 20 is a simplified illustration of a remote game server in association
  • a wireless computer controlled toy system which may include a network computer
  • Fig. 21 is a simplified flowchart illustration of the operation of the computer or of the network computer of Fig. 20, when operating in conjunction with the remote game server 1250;
  • Fig. 22 is a simplified flowchart illustration of the operation of the remote
  • Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless
  • Figs. 24A - 24E taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
  • a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer
  • Figs. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to the sound board of the computer;
  • Figs. 27A - 27J are preferred flowchart illustrations of a preferred radio protocol;
  • Figs. 29A - 29I taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14;
  • Fig. 30 is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a further
  • Fig. 31 is a simplified block diagram illustrating the
  • Figs. 32A, 32B and 32C taken together form a simplified block diagram
  • Figs. 33 - 43 illustrate embodiments of the toy system of Figs. 1 - 32C
  • FIG. 33 is a simplified pictorial illustration of a display-based fanciful figure interaction system constructed and operative in accordance with a preferred embodiment of the present invention
  • Figs. 34A and 34B taken together, are simplified pictorial illustrations of a toy-based fanciful figure interaction system constructed and operative in accordance
  • Fig. 34C is a simplified pictorial illustration of the toy-based fanciful figure of Figs. 34A and 34B;
  • Fig. 35 is a simplified block diagram of a fanciful figure interaction system
  • Fig. 36 is a simplified operational flow chart of a fanciful figure
  • Fig. 37 is a simplified operational flow chart of a preferred implementation of step 3440 of Fig. 36;
  • Fig. 38 is a simplified operational flow chart of a preferred embodiment
  • Fig. 39 is a simplified operational flow chart of a preferred embodiment
  • Fig. 40 is a simplified operational flow chart of a preferred implementation of step 3490 of Fig. 36;
  • Fig. 41 is a simplified block diagram of a preferred logical implementation
  • Figs. 42 and 43 taken together, are simplified block diagrams of possible
  • Fig. 1A is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a preferred embodiment of the present invention.
  • the system of Fig. 1A comprises a computer 100, which may be any suitable computer such
  • the computer 100 is equipped
  • the computer 100 is preferably equipped with a sound card such as,
  • the computer 100 is equipped with a computer radio interface 110 operative to transmit signals via wireless transmission from the computer 100 and, in a preferred embodiment of the present invention, also to receive signals transmitted elsewhere via wireless transmission and to deliver the signals to the computer 100.
  • computer radio interface 110 are transmitted via both analog signals and digital signals,
  • the transmitted signal may be an analog signal or a digital signal.
  • the received signal may also be an analog signal or a digital signal.
  • Each signal typically comprises a message.
  • a preferred implementation of the computer radio interface 110 is described in more detail below with reference to Fig. 3.
  • the system of Fig. 1A also comprises one or more toys 120.
  • Fig. 1A comprises a plurality of toys, namely three toys 122, 124, and 126, but it is appreciated that any suitable number of toys may be used.
  • Fig. 1B is a partly pictorial, partly block diagram illustration of the toy 122 of Fig. 1A.
  • Each toy 120 comprises a power source 125, such as a battery or a
  • Each toy 120 also comprises a toy control device 130,
  • the received signal may be, as explained above, an analog signal or a digital signal.
  • Each toy 120 preferably comprises a plurality of input devices 140 and a plurality of output devices 150.
  • the input devices 140 may comprise, for example, one or more of the following: a microphone 141; a microswitch sensor 142; a touch sensor (not shown in Fig. 1B); a light sensor (not shown in Fig. 1B); or a movement sensor 143, which may be, for example, a tilt sensor or an acceleration sensor.
  • the output devices 150 may comprise, for example, one or more of the following: a speaker 151; a light 152; a solenoid 153, which may be operative to move a portion of the toy; or a motor, such as a stepping motor, operative to move a portion of the toy;
  • DC motors available from Alkatel (dunkermotoren), Postfach 1240, D-7823, Bonndorf/Schwarzwald, Germany;
  • stepping motors and miniature motors available from Haydon Switch and Instruments, Inc. (HSI), 1500 Meriden Road, Waterbury, CT, USA; and DC solenoids available from Communications Instruments, Inc., P.O. Box 520, Fairview, North Carolina 28730, USA.
  • Examples of actions which the toy may perform include the following:
  • a recorded sound a synthesized sound
  • music including recorded music or synthesized music
  • speech including recorded speech or synthesized speech.
  • the received signal may comprise a condition governing the action as, for example, the duration of the action or the number of repetitions of the action; a minimal sketch of executing such a conditioned action is given below.
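  • As an illustration only, executing an action under such a received condition can be sketched as follows; the function name and parameters are assumptions, since the text names only the two kinds of condition:

      import time

      # Hypothetical sketch: perform `action` (a callable) either a given
      # number of times or until a time limit expires.
      def run_action(action, repetitions=None, duration_s=None):
          if repetitions is not None:
              for _ in range(repetitions):
                  action()
          elif duration_s is not None:
              end = time.time() + duration_s
              while time.time() < end:
                  action()

      run_action(lambda: print("beep"), repetitions=3)   # beeps three times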
  • the portion of the received signal comprising a message comprising a sound typically comprises an analog signal.
  • alternatively, the portion of the received signal comprising a sound, including music, may comprise a digital signal.
  • the action the toy may perform also includes reacting to signals transmitted by another toy, such as, for example, playing sound that the other toy is monitoring and transmitting.
  • In a preferred embodiment of the present invention, the toy control device 130 is controlled by the computer 100 via the computer radio interface 110.
  • the computer radio interface 110 is preferably also operative to poll the toy control device 130, that is, transmit a signal in response to which the toy control device 130 transmits a signal to the computer radio interface 110.
  • the signal transmitted by the toy control device 130 may comprise one or more of the following: sound, typically sound captured by a microphone input device 141; status of sensor input devices 140 as, for example, light sensors or microswitches; an indication of low power in the power source 125; or information identifying the toy.
  • a sound signal transmitted by the device 130 may comprise speech, in which case the computer system is operative to perform a speech recognition operation on the speech.
  • the signal from the radio control interface 110 may also comprise, for example, one or more of the following: a request to ignore input from one or more input devices 140; or a request for the toy control device 130 to transmit a signal comprising the stored data received from the one or more input devices 140.
  • typically, signals between the computer radio interface 110 and the toy control device 130 include information identifying the toy.
  • Fig. 1C is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a preferred embodiment of the present invention.
  • the system of Fig. 1C comprises two computers 100. It is appreciated that, in such a system, signals between the computer radio interface 110 and the toy control device 130 typically include information identifying the computer.
  • the computer 100 runs software comprising a computer game, typically a
  • the software may comprise
  • the term animated object includes any object which may be depicted on the computer screen 105.
  • An animated object may be any object depicted on the screen such as, for example: a doll; an action figure; a toy, such as, for example, an activity toy, a vehicle, or a ride-on vehicle; a drawing board or sketch board;
  • a household object such as, for example, a clock, a lamp, a chamber pot, or an item of furniture.
  • Fig. 2A depicts a portion of the system of Fig. 1A in use.
  • the apparatus of Fig. 2A comprises the computer screen 105 of Fig. 1A.
  • animated objects 160 and 165 are depicted on the computer screen.
  • Fig. 2B depicts the situation after the toy 122 has been brought into range
  • the toy 122 corresponds to the animated object 160.
  • For example, in Fig. 2B the toy 122 and the animated object 160, shown in Fig. 2A, are both a teddy bear.
  • the apparatus of Fig. 2B comprises the computer screen 105, on which is depicted the animated object 165.
  • the apparatus of Fig. 2B also comprises the toy 122.
  • Fig. 2C depicts the situation after the toy 126 has also been brought into
  • the toy 126 corresponds to the animated object 165.
  • the toy 126 and the animated object 165, shown in Figs. 2A and 2B, are both a clock.
  • the apparatus of Fig. 2C comprises the computer screen 105, on which no animated objects are depicted.
  • the apparatus of Fig. 2C also comprises the toy 126.
  • In Fig. 2A the user interacts with the animated objects 160 and 165 on the computer screen 105.
  • In Figs. 2B and 2C the user may interact with the toys 122 and 126 by moving the toys or parts of the toys, or by speaking to the toys.
  • Fig. 3 is a simplified block diagram of a preferred implementation of the computer radio interface 110 of Fig. 1A.
  • the apparatus of Fig. 3 comprises the computer radio interface 110 and a sound card 190, as described above with reference to Fig. 1A.
  • connections between the computer radio interface 110 and the sound card 190 are
  • the computer radio interface 110 comprises a DC unit 200 which is fed with power from the computer 100; a MIDI interface 210 which connects to the sound card MIDI interface; an audio interface 220 which connects to an audio interface 192 of the sound card 190; and a secondary audio interface 230 which preferably connects to peripheral audio devices such as speakers, a microphone, or a stereo system.
  • the apparatus of Fig. 3 also comprises an antenna 240, which is operative
  • Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3.
  • the apparatus of Fig. 4 comprises the DC unit 200, the MIDI interface
  • the apparatus of Fig. 4 also comprises a multiplexer 240, a micro controller 250, a radio transceiver 260, and a connection unit 270 connecting the radio transceiver 260 to the micro controller 250.
  • Figs. 5A - 5D, taken together, comprise a schematic diagram of the apparatus of Fig. 4.
  • Transistors 2N2222 and MPSA14, Motorola, Phoenix, AZ, USA.
  • U1 of Fig. 5D may be replaced by:
  • U2 of Fig. 5D may be replaced by:
  • Fig. 5E is a schematic diagram of an alternative implementation of the apparatus of Fig. 4.
  • In Fig. 5E, U1 is a BIM-418-F low power UHF data transceiver module, Ginsburg Electronic GmbH, Am Moosfeld 85, D-81829, München, Germany.
  • U1 may be replaced by:
  • circuit boards for alternate embodiments of the apparatus.
  • the apparatus of Fig. 5E has similar functionality to the apparatus of Fig. 5D, in which MIDI data is transmitted and received.
  • Figs. 5A - 5E are self-explanatory with regard to the above parts lists.
  • Fig. 6 is a simplified block diagram of a preferred implementation of the toy control device 130 of Fig. 1A.
  • the apparatus of Fig. 6 also comprises a microcontroller 250 similar to the microcontroller 250 of Fig. 4.
  • the apparatus of Fig. 6 also comprises a digital input/output interface (digital I/O interface) 290, which is operative to provide an interface between the microcontroller 250 and a plurality of input and output devices which may be connected to it.
  • the apparatus of Fig. 6 also comprises an analog input/output interface
  • the apparatus of Fig. 6 also comprises a multiplexer 305 which is
  • the apparatus of Fig. 6 also comprises input devices 140 and output
  • the input devices 140 comprise, by way of example, a tilt switch
  • the output devices 150 comprise, by way of example, a DC motor.
  • the apparatus of Fig. 6 also comprises a DC control 310, a preferred implementation of which is described in more detail below with reference to Figs. 7A -
  • the apparatus of Fig. 6 also comprises a comparator 280, similar to the
  • the apparatus of Fig. 6 also comprises a power source 125, shown in Fig. 6 by way of example as batteries, operative to provide electrical power to the apparatus
  • Figs. 7A - 7F, taken together with either Fig. 5D or Fig. 5E, comprise a schematic diagram of the toy control device of Fig. 6. If the schematic of Fig. 5E is employed to implement the computer radio interface of Fig. 4,
  • Figs. 7A - 7F are self-explanatory with reference to the above parts list.
  • signals transmitted between the computer radio interface 110 and the toy control device 130 may be analog signals or digital signals. In the case of digital signals, the digital signals preferably comprise messages.
  • each message sent by the computer radio interface 110 to the toy control device 130 comprises an indication of the sender of the message; and each message sent by the toy control device 130 to the computer radio interface 110 comprises an indication of the intended recipient of the message.
  • a preferred set of predefined messages is as follows:
  • Set Toy control device output pin to a digital level D.
  • Example: 01000005000203050000 - set io 3 to "1" for 5 seconds.
  • the Audio is sent to the Toy control device by the computer sound card and the Computer radio interface.
  • Computer address: 00-03; cmd 1,2: Received CRI command MSB ok ack 00-FF; cmd 1,4: Received CRI command LSB ok ack 00-FF.
  • Fig. 8A is a simplified flowchart illustration of a preferred method for receiving radio signals, executing commands comprised therein, and sending radio signals, within the toy control device 130 of Fig. 1A.
  • each message as described above comprises a command, which may
  • Fig. 8A preferably comprises the following steps:
  • a synchronization signal or preamble is detected (step 400).
  • a header is detected.
  • a command contained in the signal is received (step 405).
  • the command contained in the signal is executed (step 410). Executing the command may be as described above with reference to Fig. 1A.
  • a signal comprising a command intended for the computer radio interface 110 is sent (step 420).
  • Fig. 9A is a simplified flowchart illustration of a preferred method for receiving signals, executing commands comprised therein, and sending signals, within the computer radio interface 110 of Fig. 1A.
  • the method of Fig. 9A preferably comprises the following steps:
  • a MIDI command is received from the computer 100 (step 430).
  • the MIDI command may comprise a command intended to be transmitted to the toy control device 130, may comprise an audio in or audio out command, or may comprise a general command.
  • a MIDI command is sent to the computer 100 (step 440).
  • the MIDI command may comprise a signal received from the toy control device 130, may comprise a response to a MIDI command previously received by the computer radio interface 110 from the computer 100, or may comprise a general command.
  • the command contained in the MIDI command or in the received signal is executed (step 450).
  • Executing the command may comprise, in the case of a received signal, reporting the command to the computer 100, whereupon the computer 100 may typically carry out any appropriate action under program control as, for example, changing a screen display or taking any other appropriate action in response to the received command.
  • executing the command may comprise transmitting the command to the toy control device 130.
  • Executing a MIDI command may also comprise switching audio output of the computer control device 110 between the secondary audio interface 230 and the radio transceiver 260. Normally the secondary audio interface 230 is directly connected to the audio interface 220 preserving the connection between the computer sound board and the peripheral audio devices such as speakers, microphone and stereo system.
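  • As an illustration only, the routing just described can be sketched as follows; the message shapes and handler names are hypothetical, since the text specifies only the three cases (report a received command to the computer, forward a command to the toy, switch the audio path):

      # Hypothetical dispatch sketch for the computer radio interface.
      def handle(message, report_to_computer, transmit_to_toy, switch_audio):
          kind = message.get("kind")
          if kind == "from_toy":        # command received over radio
              report_to_computer(message["command"])
          elif kind == "for_toy":       # MIDI command intended for the toy
              transmit_to_toy(message["command"])
          elif kind == "audio_switch":  # secondary audio interface <-> radio
              switch_audio(message["target"])

      handle({"kind": "for_toy", "command": "set io 3 high"}, print, print, print)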
  • Reference is now made to Figs. 9B - 9N, taken together with Figs. 8D - 8M, all of which comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 9A.
  • Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A.
  • Fig. 10A comprises a synchronization preamble.
  • the duration T_SYNC of the synchronization preamble is preferably 0.500 millisecond, being preferably substantially equally divided into on and off periods.
  • Fig. 10B comprises a signal representing a bit with value 0, while Fig. 10C comprises a signal representing a bit with value 1.
  • Figs. 10B and 10C refer to the case where the
  • each bit is assigned a predetermined duration T, which is the same for every bit.
  • a frequency modulated carrier is transmitted, using the method of frequency modulation keying as is well known in the art.
  • An "off' signal typically less
  • Receipt of an on signal as shown in Fig. 10B of duration between 0.01 * T and 0.40 * T is preferably taken to be a bit with value 0.
  • Receipt of an on signal as shown in Fig. 10C of duration greater than 0.40 * T is preferably taken to be a bit with value 1.
  • In a preferred embodiment, T has a value of 1.0 millisecond.
  • After each on signal, the duration of the subsequent off signal is measured; the sum of the durations of the on signal and the off signal must be between 0.90 * T and 1.10 * T for the bit to be considered valid. Otherwise, the bit is ignored. A minimal decoding sketch follows.
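  • As an illustration only, the bit-validation rules above can be sketched as follows; T and the thresholds are taken from the text, while the function itself is a hypothetical stand-in for the receiver firmware:

      T = 1.0  # nominal bit duration in milliseconds, per the text

      def decode_bit(on_ms, off_ms):
          """Return 0 or 1 for a valid on/off pulse pair, or None if invalid."""
          total = on_ms + off_ms
          if not (0.90 * T <= total <= 1.10 * T):
              return None        # total duration out of tolerance: ignore bit
          if 0.01 * T <= on_ms <= 0.40 * T:
              return 0           # short "on" pulse encodes a 0
          if on_ms > 0.40 * T:
              return 1           # long "on" pulse encodes a 1
          return None

      print(decode_bit(0.3, 0.7))   # -> 0
      print(decode_bit(0.6, 0.4))   # -> 1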
  • Fig. 11 is a simplified flowchart illustration of a method for generating control instructions for the apparatus of Fig. 1A.
  • the method of Fig. 11 preferably includes the following steps:
  • a toy is selected (step 550). At least one command is selected, preferably from a plurality of commands associated with the selected toy (steps 560 - 580).
  • a command may be entered by selecting, modifying, and creating a new binary command (step 585).
  • selecting a command in steps 560 - 580 may include choosing a command and specifying at least one control parameter associated with the command.
  • a control parameter may include, for example, a condition depending on a result of a previous command, the previous command being associated either with the selected toy or with another toy.
  • a control parameter may also include an execution condition governing execution of a command such as, for example: a condition stating that a specified output is to occur based on a status of the toy, that is, if and only if a specified input is received; a condition stating that the command is to be performed at a specified time; a condition stating that performance of the command is to cease at a specified time; or a condition comprising a command modifier modifying execution of the command, such as, for example, to terminate execution of the command in a case where execution of the
  • the command may comprise a command to cancel a previous command.
  • the output of the method of Fig. 11 typically comprises one or more control instructions implementing the specified command, generated in step 590.
  • the one or more control instructions are comprised in a command file.
  • the command file is called from a driver program which typically determines which command is to be executed at a given point in time and then calls the command file.
  • Preferably, a user of the method of Fig. 11 performs steps 550 and 560 using a computer having a graphical user interface, as shown in Figs. 12A - 12C.
  • Fig. 12A comprises a toy selection area 600, comprising a plurality of toy selection icons 610, each depicting a toy.
  • Fig. 12A also typically comprises action buttons 620, typically comprising
  • Fig. 12B depicts a command generator screen typically displayed after the user has selected a toy.
  • Fig. 12B comprises an animation area 630, preferably comprising a depiction of the selected toy selection icon 610, and a text area 635 comprising text describing the selected toy.
  • Fig. 12B also comprises a plurality of command category buttons 640, each of which allow the user to select a category of commands such as, for example:
  • Fig. 12B also comprises a cancel button 645 to cancel command selection
  • Fig. 12C comprises a command selection area 650, allowing the user to specify a specific command.
  • a wide variety of commands may be specified, and the
  • Fig. 12C also comprises a file name area 655, in which the user may
  • FIG. 12C also comprises a cancel button 645, similar to the cancel button 645 of Fig. 12B.
  • Fig. 12C also comprises a make button 660. When the user actuates the make button 660, the control instruction generator of Fig. 11 generates control instructions.
  • Fig. 12C also comprises a parameter selection area 665, in which the user may specify a parameter associated with the chosen command.
  • the steps for programming the microcontrollers of the present invention include the use of a universal programmer, such as the Universal Programmer, type EXPRO 60/80, manufactured by Sunshine Electronics Co. Ltd., Taipei, Taiwan.
  • Fig. 1C includes a description of a preferred set of predefined messages including a category termed "General commands".
  • Other General Commands are defined by the following description:
  • a computer transmits this command to verify that the radio channel is vacant. If another computer is already using this channel, it will respond with the Availability Response Command. If no response is received within 250 msec, the channel is deemed vacant.
  • a computer transmits this command in response to an Availability Interrogation Command to announce that the radio channel is in use.
  • a Toy transmits this command to declare its existence and receive in response a Channel Pair Selection Command designating the computer that will control it and the radio channels to use.
  • a computer transmits this command in response to a Toy Availability Command to inform the toy of the radio channels to be used.
  • Reference is now made to Figs. 13 and 14, which illustrate block diagrams of a multi-port multi-channel implementation of the computer radio interface 110 of Fig. 1A.
  • Fig. 13 illustrates the processing sub-unit of the computer radio interface, which is implemented as an add-in board.
  • Fig. 14 illustrates the RF transceiver sub-unit, which is a device external to the computer.
  • both sound and control commands may be transmitted via the MIDI connector 210 rather than
  • the functions of the interfaces 210 and 220 between the computer radio interface 110 and the sound card 190 may, alternatively, be implemented as connections between the computer radio interface 110 to the serial and/or parallel ports of the computer 100, as shown in Figs. 25A - 25E and Figs 26A -26D, respectively.
  • If it is desired to provide full duplex communication, each transceiver 260 is configured to provide full duplex communication.
  • transceiver 260 (Fig. 4) which forms part of the toy control device 130 of Fig. 1A
  • Figs. 15A - 15E showing a Multi-Channel Computer Radio Interface
  • Figs. 24A - 24E showing a multi-channel implementation of the computer radio interface 110.
  • Fig. 16 is a simplified flowchart
  • the CRI includes a conventional radio transceiver (260 of Fig. 4) which
  • RY3 GB021 having 40 channels which are divided into channel pairs, 4 of which are designated as control channels.
  • one of the 4 control channel pairs is selected by the computer (step 810).
  • the selected control channel pair i is monitored by a first transceiver (step 820) to detect the appearance of a new toy.
  • an information communication channel pair is selected (step 830) from among the 16 such channel pairs provided, over which game communications will take place.
  • the channel pair selection command sent to the new toy also relates to the identity of the selected information communication channel pair.
  • a channel pair selection command is sent over the control channel pair to the new toy (step 840).
  • a game program is then begun (step 850), using the selected information communication channel pair.
  • the control channel pair is then free to receive additional new toys.
  • the assigned transceiver is marked as busy in a transceiver availability table (step 852).
  • the table is scanned for a transceiver which is not marked as busy. This transceiver is then assigned to the control channel i (step 858).
  • Fig. 17 is a simplified flowchart illustration of a preferred method for implementing the "select control channel pair" step 810 of Fig. 16. In Fig. 17, the four control channel pairs are scanned. For each channel pair in which the noise level falls below a threshold, the computer sends an availability interrogation command (step 895) and waits for a predetermined time period, such as 250 ms, for a response (steps 930 and 940). If no other computer responds, i.e. sends back an "availability response command", the channel pair is deemed vacant. If the channel pair is found to be in use, the next channel pair is scanned. A sketch of this vacancy check appears below.
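  • As an illustration only, the vacancy check above can be sketched as follows; the noise-threshold scan and the 250 ms timeout come from the text, while the radio object and its noise_level/send/wait_for_response methods are hypothetical stand-ins:

      TIMEOUT_S = 0.250        # 250 ms response window, per the text
      NOISE_THRESHOLD = 0.5    # assumed units and value

      def find_vacant_control_channel(radio, channel_pairs):
          for pair in channel_pairs:   # scan the four control channel pairs
              if radio.noise_level(pair) >= NOISE_THRESHOLD:
                  continue             # channel too noisy, try the next pair
              radio.send(pair, "AVAILABILITY_INTERROGATION")
              reply = radio.wait_for_response(pair, timeout=TIMEOUT_S)
              if reply != "AVAILABILITY_RESPONSE":
                  return pair          # no computer answered: pair is vacant
          return None                  # all pairs are in use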
  • Fig. 19 is a self-explanatory flowchart illustration of a preferred method of operation of the toy control device 130.
  • the toy control device sends a "toy availability command" (step 1160), which is a message advertising the toy's availability, on each control channel i in turn.
  • once a channel pair selection command is received (step 1190), the toy control device may begin receiving and executing game commands which the computer transmits over the selected information communication channel pair.
  • the computer system is provided, in communication with a remote game server, as shown in Fig. 20.
  • the remote game server 1250 is operative to serve to the computer 100 at least a portion of at least one toy-operating game, which operates one or more toys 1260.
  • an entire game may be downloaded from the remote game server 1250.
  • a new toy action script or new text files may be downloaded from the remote game server 1250.
  • a first portion of the game may be received off-line whereas an additional portion of the game may be received on-line.
  • communication between the remote game server 1250 and the computer 100 may be based on any suitable technology such as but not limited to ISDN; X.25; Frame-Relay; and Internet.
  • the computerized device may be provided locally, i.e. adjacent to the toy, because all "intelligence" may be provided from a remote source.
  • the computerized device may be less sophisticated than a personal computer, may lack a display monitor of its own, and may, for example, comprise a network computer 1270.
  • Fig. 21 is a simplified flowchart illustration of the operation of the computer 100 or of the network computer 1270 of Fig. 20, when operating in conjunction with the remote game server 1250.
  • Fig. 22 is a simplified flowchart illustration of the operation of the remote game server 1250 of Fig. 20.
  • Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless computer controlled toy system including a proximity detection subsystem, constructed and operative in accordance with a preferred embodiment of the present invention.
  • the proximity detection subsystem may for example include a pair of ultrasound transducers 1520 and 1530 associated with the toy and computer respectively.
  • the toy's ultrasound transducer 1520 typically broadcasts
  • Figs. 24A - 24E taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
  • Figs. 25A - 25E taken together, form a detailed schematic illustration of a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer.
  • Figs. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to the sound board of the computer.
  • Figs. 27A - 27J are preferred self-explanatory flowchart illustrations of a preferred radio protocol.
  • Figs. 29A - 29I taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14.
  • Fig. 30 illustrates a further embodiment of the present invention which includes a combination of a Computer Radio Interface (CRI) and a Toy Control Device
  • the combined unit 1610 controls a toy 1620 which is connected to the
  • the combined unit 1610 communicates with the toy by means such as radio communication, using the computer radio interface 110.
  • Fig. 31 illustrates a simplified block diagram of the combined unit 1610.
  • Figs. 32A, 32B and 32C taken together form a simplified schematic diagram of the EP900 EPLD chip (U9) of Fig. 28H.
  • Figs. 33 - 43 illustrate embodiments of the toy system of Figs. 1 - 32C in which a computer-controlled toy system has a capacity for modifying a known language and/or speaking in a previously unknown or whimsical language.
  • Fig. 33 is a simplified pictorial illustration of a display-based fanciful figure interaction system. Shown is a computer 2200 on which a fanciful figure 2210 is displayed.
  • Computer 2200 is preferably configured with an audio input device 2220, typically a microphone, through which computer 2200 may receive audio input, and an audio output device 2230, typically a speaker, through which computer 2200 may provide audio output.
  • FIGS. 34A and 34B are simplified pictorial illustrations of a toy-based fanciful figure interaction system
  • Fig. 34C is a simplified pictorial illustration of a toy-based fanciful figure constructed
  • Shown in Figs. 34A and 34B is computer 2200, preferably configured with audio input device 2220 and audio output device 2230.
  • In Fig. 34A a toy 2240 is shown in wired communication with computer 2200 along wired connection 2250, while in Fig. 34B toy 2240 is shown in wireless communication with computer 2200.
  • Audio input device 2220 is in communication with computer 2200 at any given time. Audio input device 2220 and audio output device 2230 may be replaced with or augmented by audio input and output devices assembled with toy 2240.
  • toy 2240 is preferably configured with a control unit 2262, a power unit 2264, and one or more articulating appendages 2266.
  • a user 2280 is also shown interacting with toy 2240. It is appreciated that any or all of the functionality of computer 2200 may be assembled with or otherwise incorporated in toy 2240.
  • Fig. 35 is a simplified block diagram of a fanciful figure interaction system, constructed and operative in accordance with a preferred embodiment of the present invention.
  • the system of Fig. 35 preferably comprises a control unit 2300, a speech input and recognition unit 2310 capable of receiving a speech input and identifying the words comprising the speech input, an action interface 2320 capable of receiving action inputs, a speech synthesis unit 2330 capable of producing audio speech output, and an action control unit 2340.
  • Speech input and recognition unit 2310 may receive input from audio input device 2220 (Fig. 33). Speech synthesis unit 2330 may provide output to audio output device 2230 (Fig. 33).
  • Action control unit 2340 may control an action associated with fanciful figure 2210 (Fig. 33) or toy 2240 (Figs. 34A - 34C).
  • the system of Fig. 35 also preferably comprises one or more sets of phonemes 2350, one or more language sets 2360, each typically comprising one or more words in a known language such as English or fanciful words, a set 2370 of terms, and an association set 2390.
  • the system of Fig. 35 also preferably comprises a clock 2400.
  • Fig. 35 A logical implementation of the various sets shown in Fig. 35 is described in greater detail hereinbelow with reference to Fig. 41.
  • Fig. 36 is a simplified operational flow chart of a fanciful figure interaction system useful in describing the systems of Figs. 33, 34A - 34C and 35.
  • Typical operation begins (step 3430) with the introduction of a term or action (step 3440); a preferred method of performing step 3440 is described in greater detail hereinbelow with reference to Fig. 37.
  • speech input is then received and recorded (step 3450); a preferred method of performing step 3450 is described in greater detail hereinbelow with reference to Fig. 38.
  • if no speech input is received (step 3460), operation continues with step 3440.
  • received speech is then identified (step 3470); a preferred method of performing step 3470 is described in greater detail hereinbelow with reference to Fig. 39.
  • successfully identified speech is then checked for an association with a term or action (step 3480).
  • Fig. 37 is a simplified operational flow chart of a preferred implementation of step 3440 of Fig. 36.
  • Typical operation begins (step 3500) with selecting a term or action from action set 2390 (Fig. 35) in accordance with selection criteria (step 3510).
  • the selection may be random or in accordance with a level of complexity or history of usage associated with an action; a minimal sketch of such selection appears below.
  • Clock 2400 (Fig. 35) may be used to advance the level of complexity over time.
  • Association set 2390 (Fig. 35) is then searched for an association between language in language set 2360 (Fig. 35) and the term or action (step 3530).
  • the associated action is then performed (step 3540) with or without verbalizing the associated language, and operation continues with step 3450 (Fig. 36) (step 3550).
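  • As an illustration only, complexity-gated selection can be sketched as follows; the action records, the time schedule, and the least-used tie-break are assumptions, since the text states only that selection may be random or governed by a complexity level and usage history advancing over time:

      import time

      actions = [
          {"name": "wave", "complexity": 1, "uses": 0},
          {"name": "jump", "complexity": 2, "uses": 0},
          {"name": "dance", "complexity": 3, "uses": 0},
      ]
      START = time.time()

      def current_level(seconds_per_level=3600):
          # the permitted complexity level rises as play time accumulates
          return 1 + int((time.time() - START) / seconds_per_level)

      def select_action():
          eligible = [a for a in actions if a["complexity"] <= current_level()]
          choice = min(eligible, key=lambda a: a["uses"])  # prefer least-used
          choice["uses"] += 1
          return choice

      print(select_action()["name"])   # -> "wave" early in play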
  • Fig. 38 is a simplified operational flow chart of a preferred implementation of step 3450 of Fig. 36.
  • Typical operation begins (step 3560) with recording audio input, typically comprising speech.
  • the audio input is typically received via audio input device 2220 (Fig. 33).
  • a data file in a volatile or non-volatile storage medium is then constructed from the recorded audio input (step 3590).
  • Fig. 39 is a simplified operational flow chart of a preferred implementation of step 3470 of Fig. 36 in greater detail, constructed and operative in accordance with a preferred embodiment of the present invention.
  • Typical operation begins (step 3610) with analyzing the file constructed in step 3590 of Fig. 38 for a first pause between speech elements, yielding a first speech element (step 3620). Speech recognition is then performed on the first speech element (step 3630).
  • If the first speech element is a language identifier (step 3640), then the current language is set accordingly and speech recognition is performed on the rest of the file.
  • If the first speech element is not a language identifier, speech recognition is performed on the rest of the file. The speech is then identified for known words in the current language (step 3660).
  • If no known words are found, another language is set to the current language (step 3680) and speech recognition is again performed on the rest of the file (step 3690). The speech is then identified for known words in the current language (step 3700).
  • If the word is identified in a known, learned, generated, or made-up language (step 3710), operation continues with step 3480 (Fig. 36) (step 3720). A minimal sketch of this language-resolution order appears below.
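  • As an illustration only, the search order can be sketched as follows; the vocabulary, the identifier syntax, and recognize() are hypothetical stand-ins, and only the order (identifier cue first, then the current language, then the remaining languages) comes from the text:

      languages = {
          "english": {"jump", "home"},
          "martian": {"zorp", "gleep"},   # a made-up language
      }

      def recognize(utterance):
          # stand-in for real speech recognition: tokenize into words
          return utterance.lower().split()

      def identify(utterance, current="english"):
          words = recognize(utterance)
          if words and words[0] in languages:   # language-identifier cue
              current, words = words[0], words[1:]
          order = [current] + [l for l in languages if l != current]
          for lang in order:                    # current language first
              hits = [w for w in words if w in languages[lang]]
              if hits:
                  return lang, hits
          return None, []

      print(identify("martian zorp"))   # -> ('martian', ['zorp'])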
  • Fig. 40 is a simplified operational flow chart of a preferred implementation of step 3490 of Fig. 36.
  • Typical operation begins (step 3730) with selecting a language which becomes the current language, at random or in accordance with selection criteria (step 3740).
  • Association set 2390 (Fig. 35) is then searched for an association between language in language set 2360 (Fig. 35) and the term or action (step 3750).
  • the associated action is then performed (step 3760) and operation continues with step 3450 (Fig. 36).
  • Fig. 41 is a simplified block diagram of a preferred logical implementation of the various sets shown in Fig. 35.

Abstract

A toy with developing skills, the toy including a fanciful figure (122, 124, 126) having a capacity to perform an action, and action control circuitry operative to control the fanciful figure to perform the action at different levels of skill at different times.

Description

INTERACTIVE TALKING TOY
FIELD OF THE INVENTION
The present invention relates to toys in general, and particularly to computer-controlled toys with a capacity for speech.
BACKGROUND OF THE INVENTION
Toys which are controlled by integrated or remote computer circuitry and
that are capable of emitting speech are known. Such toys, however, are limited to
employing known languages such as English for speech and do not incorporate the ability to modify a known language or speak with an increasing level of complexity. In addition, such toys do not have the capacity for associating words previously unknown
to them with toy movements or other actions.
Also well known in the art are toys which are remotely controlled by wireless communication and which are not used in conjunction with a computer system.
Typically, such toys include vehicles whose motion is controlled by a human user via a
remote control device.
US Patent 4,712,184 to Haugerud describes a computer controlled
educational toy, the construction of which teaches the user computer terminology and
programming and robotic technology. Haugerud describes computer control of a toy via a wired connection, wherein the user of the computer typically writes a simple program
to control movement of a robot. US Patent 4,840,602 to Rose describes a talking doll responsive to an external signal, in which the doll has a vocabulary stored in digital data in a memory which may be accessed to cause a speech synthesizer in the doll to simulate speech.
US Patent 5,021,878 to Lang describes an animated character system
with real-time control.
US Patent 5,142,803 to Lang describes an animated character system
with real-time control.
US Patent 5,191,615 to Aldava et al. describes an interrelational audio
kinetic entertainment system in which movable and audible toys and other animated
devices spaced apart from a television screen are provided with program synchronized
audio and control data to interact with the program viewer in relationship to the
television program.
US Patent 5,195,920 to Collier describes a radio controlled toy vehicle
which generates realistic sound effects on board the vehicle. Communications with a remote computer allows an operator to modify and add new sound effects.
US Patent 5,270,480 to Hikawa describes a toy acting in response to a
MIDI signal, wherein an instrument-playing toy performs simulated instrument playing
movements.
US Patent 5,289,273 to Lang describes a system for remotely controlling an animated character. The system uses radio signals to transfer audio, video and other
control signals to the animated character to provide speech, hearing, vision and
movement in real-time. US Patent 5,388,493 describes a housing for a vertical dual keyboard MIDI wireless controller for accordionists. The system may be used with either a conventional MIDI cable connection or with a wireless MIDI transmission system.
German Patent DE 3009-040 to Neuhierl describes a device for adding the capability to transmit sound from a remote control to a controlled model vehicle. The sound is generated by means of a microphone or a tape recorder and transmitted to the controlled model vehicle by means of radio communications. The model vehicle is equipped with a speaker that emits the received sounds.
The disclosures of all publications mentioned in the specification and of the publications cited therein are hereby incorporated by reference.
SUMMARY OF THE INVENTION
The present invention seeks to provide an improved computer-controlled toy system with a capacity for modifying a known language and/or speaking in a previously unknown or whimsical language.
In accordance with a preferred embodiment of the present invention a
computer or computer-controlled toy modifies a known language such as English according to a set of rules (such as "pig latin" rules). The computer or computer-controlled toy may
then speak in the modified language. A user may interact with the computer or
computer-controlled toy by speaking in the modified language. The computer or
computer-controlled toy "demodifies" the speech to arrive at an associated English word. The computer or computer-controlled toy may perform an action based on
modified or demodified language.
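By way of illustration only, the following minimal Python sketch shows how such a modification rule and its reverse might look; the simplified "pig latin" rule, the function names, and the vocabulary check are assumptions for illustration, not the claimed implementation:

```python
# A minimal sketch of one possible language-modification rule ("pig latin").
# The function names and the exact rule are illustrative assumptions only.
VOWELS = set("aeiou")

def modify(word):
    """Apply a simplified pig-latin rule to an English word."""
    if word[0] in VOWELS:
        return word + "way"
    for i, ch in enumerate(word):        # move leading consonants to the end
        if ch in VOWELS:
            return word[i:] + word[:i] + "ay"
    return word + "ay"                   # no vowels at all

def demodify(word, vocabulary):
    """Recover the English word behind a pig-latin word, if it is known."""
    if word.endswith("way") and word[:-3] in vocabulary:
        return word[:-3]
    if word.endswith("ay"):
        stem = word[:-2]
        for k in range(1, len(stem)):    # try every consonant-cluster length
            candidate = stem[-k:] + stem[:-k]
            if candidate in vocabulary:
                return candidate
    return None

vocab = {"jump", "home", "anger"}
print(modify("jump"))             # -> "umpjay"
print(demodify("umpjay", vocab))  # -> "jump"
```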
In accordance with another preferred embodiment of the present
invention a computer or computer-controlled toy speaks a language with an increasing
level of complexity.
The present invention also seeks to provide an improved computer-
controlled toy system with a capacity for speaking, learning, and generating languages
other than those used in common discourse.
In accordance with another preferred embodiment of the present
invention a computer or computer-controlled toy is configured with a set of actions or
concepts such as jumping, going home, anger, etc. and a vocabulary in a known language
such as English, a previously unknown language such as "Martian," or both types of languages. The computer or computer-controlled toy is further capable of introducing
an action to a user together with a preselected, randomly selected, or other generated
word from one or all languages known to it. The computer or computer-controlled toy is additionally or alternatively capable of receiving a word chosen by the user for association with the action. The computer or computer-controlled toy may maintain
associations between actions and words that represent actions for later repetition.
Words of any language known to the computer or computer-controlled toy may have an associated level of complexity for controlling what words are available to the computer
or computer-controlled toy over time.
In accordance with another preferred embodiment of the present
invention a user and a computer or computer-controlled toy develop a "private"
language interactively, wherein a computer-displayed animated figure or computer-controlled toy performs a predefined or user-defined movement or action, and the computer or computer-controlled toy or the user assigns a made-up or other private word to
the movement or action. The private language and the language's association with
movements are maintained in a memory. The user may then command the computer or computer-controlled toy using the private language.
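A minimal sketch of how such a private-language association table might be kept in memory follows; the class and method names are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of a "private language" association table.
class PrivateLanguage:
    def __init__(self):
        self.word_to_action = {}

    def associate(self, word, action):
        """Assign a made-up or user-chosen word to a movement or action."""
        self.word_to_action[word.lower()] = action

    def command(self, word):
        """Look up the action that a private word commands, if any."""
        return self.word_to_action.get(word.lower())

lang = PrivateLanguage()
lang.associate("Blinta", "raise_left_arm")  # computer- or user-assigned word
print(lang.command("blinta"))               # -> "raise_left_arm"
```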
Further in accordance with a preferred embodiment of the present
invention the computer or computer-controlled toy talks in a made-up language. The
computer or computer-controlled toy accompanies its talking with known language,
movements, gestures, etc. for teaching its made-up language.
Still further in accordance with a preferred embodiment of the present
invention the computer or computer-controlled toy makes up a language for each of a set of predefined movements, gestures, etc. by randomly selecting one or more predefined and/or user-defined base language units comprising monosyllabic or polysyllabic phonemes, associating a selection of base language units with a specific movement, gesture, etc., and maintaining the associations. Base language units may be predefined together with a complexity designation (e.g., those with more syllables being more difficult to pronounce), so that increasingly complex selections become available over time.
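The following short sketch illustrates one way such base language units and complexity designations might be combined; the unit inventory and the selection scheme are illustrative assumptions only:

```python
import random

# Hypothetical base language units, each with a complexity designation;
# units with more syllables or harder clusters carry higher levels.
BASE_UNITS = [("ba", 1), ("mo", 1), ("ki", 1),
              ("stra", 2), ("glin", 2), ("threl", 3)]

def make_up_word(max_complexity, n_units=2):
    """Randomly combine base units whose complexity is currently available."""
    available = [unit for unit, level in BASE_UNITS if level <= max_complexity]
    return "".join(random.choice(available) for _ in range(n_units))

# Associate a fresh made-up word with each predefined movement or concept.
associations = {action: make_up_word(max_complexity=1)
                for action in ("jump", "go_home", "anger")}
print(associations)   # e.g. {'jump': 'kimo', 'go_home': 'baba', ...}
```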
Additionally in accordance with a preferred embodiment of the present
invention the user provides the computer or computer-controlled toy with made-up
words for association with predefined or user-defined movements, gestures, etc., with the associations being maintained by the computer or computer-controlled toy.
Moreover in accordance with a preferred embodiment of the present invention the computer or computer-controlled toy may generate made-up words for
user-provided terms.
Further in accordance with a preferred embodiment of the present
invention the computer or computer-controlled toy interprets user speech by searching made-up, modified, and/or known languages, possibly in a particular order. The user may give a cue to indicate which particular language he is using and wishes to be understood in.
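A minimal sketch of such an ordered search, including an optional user cue, might look as follows; the language tables and cue names are assumptions:

```python
# Hypothetical lookup across the languages known to the toy, in a fixed
# order; the cue argument models the user naming a particular language.
def interpret(word, made_up, modified, known, cue=None):
    ordered = {"made_up": made_up, "modified": modified, "known": known}
    if cue in ordered:                    # user cued a specific language
        return cue, ordered[cue].get(word)
    for name, table in ordered.items():  # otherwise search in order
        if word in table:
            return name, table[word]
    return None, None

print(interpret("blinta", {"blinta": "raise_arm"}, {}, {}))
# -> ('made_up', 'raise_arm')
```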
There is thus provided in accordance with a preferred embodiment of the
present invention a toy with developing skills, the toy including a fanciful figure having a
capacity to perform an action, and action control circuitry operative to control the fanciful figure to perform the action at different levels of skill at different times.
Further in accordance with a preferred embodiment of the present
invention the capacity to perform an action includes a capacity to talk.
Still further in accordance with a preferred embodiment of the present
invention the action control circuitry is operative to control the fanciful figure to perform
the action at an increasing level of skill over time. Additionally in accordance with a preferred embodiment of the present invention the action includes talking and the fanciful figure is operative to increase its
vocabulary over time.
Moreover in accordance with a preferred embodiment of the present
invention the capacity to perform an action includes performing at least one physical action in response to an oral stimulus.
Further in accordance with a preferred embodiment of the present invention the capacity to perform an action includes performing an action other than
talking and emitting a verbalization associated with the action.
There is additionally provided in accordance with a preferred embodiment
of the present invention a system for interacting with a computer-controlled fanciful figure including at least one fanciful figure, at least one speech output apparatus, at least
one computer operative to control the fanciful figure and to provide a speech output associated with the fanciful figure via the at least one speech output apparatus, wherein the speech output is in a special language.
Further in accordance with a preferred embodiment of the present
invention the special language is at least partly generated by the at least one computer.
Additionally in accordance with a preferred embodiment of the present
invention the special language is at least partly generated by modifying at least one
known language according to at least one language modification rule.
Moreover in accordance with a preferred embodiment of the present
invention the at least one computer is operative to receive the at least one language
modification rule from a user. Further in accordance with a preferred embodiment of the present invention the at least one computer is operative to provide the at least one language modification rule to a user.
Still further in accordance with a preferred embodiment of the present invention the special language is at least partly generated from a predefined set of
phonemes.
Additionally in accordance with a preferred embodiment of the present invention the at least one computer is operative to receive at least a portion of the special
language from a user.
Moreover in accordance with a preferred embodiment of the present
invention the at least one fanciful figure is action induceable for producing an action.
Further in accordance with a preferred embodiment of the present
invention the action includes a movement.
Additionally in accordance with a preferred embodiment of the present
invention the action includes a sound.
Moreover in accordance with a preferred embodiment of the present
invention the action includes a light emission.
Still further in accordance with a preferred embodiment of the present
invention the speech output is identifiable with the action.
Additionally in accordance with a preferred embodiment of the present invention the at least one computer maintains a memory including at least one speech output identifiable with the action.
Moreover in accordance with a preferred embodiment of the present invention the at least one computer is operative to induce the fanciful figure to produce the action.
Further in accordance with a preferred embodiment of the present
invention the user induces the fanciful figure to produce the action and the at least one
computer is operative to detect the action.
Additionally in accordance with a preferred embodiment of the present invention at least one speech input apparatus is further included and the at least one
computer is operative to receive a speech input via the at least one speech input
apparatus.
Moreover in accordance with a preferred embodiment of the present
invention the speech input is identifiable with the action.
Still further in accordance with a preferred embodiment of the present invention the at least one computer maintains a memory including at least one speech input identifiable with the action.
Additionally in accordance with a preferred embodiment of the present
invention the at least one computer is additionally operative to translate between the
special language and at least one other language, wherein the other language includes a language of common discourse.
Moreover in accordance with a preferred embodiment of the present
invention the at least one fanciful figure is displayable on a computer display.
Further in accordance with a preferred embodiment of the present
invention the speech output apparatus is assembled with the at least one computer. Additionally in accordance with a preferred embodiment of the present
invention the fanciful figure is a toy in communication with the at least one computer.
Moreover in accordance with a preferred embodiment of the present invention the at least one computer is assembled with the toy.
Still further in accordance with a preferred embodiment of the present invention the toy includes at least one appendage that is actuable.
Additionally in accordance with a preferred embodiment of the present
invention the toy includes at least one appendage that is articulatable.
Moreover in accordance with a preferred embodiment of the present
invention the speech output apparatus is assembled with the toy.
Further in accordance with a preferred embodiment of the present invention the language is a previously unknown language.
Additionally in accordance with a preferred embodiment of the present invention the at least one fanciful figure includes a toy in communication with the at least
one computer and the speech input apparatus is assembled with the toy.
Moreover in accordance with a preferred embodiment of the present
invention the at least one fanciful figure includes a plurality of fanciful figures.
Still further in accordance with a preferred embodiment of the present
invention the speech input apparatus is assembled with the at least one computer.
Additionally in accordance with a preferred embodiment of the present invention the special language is preassembled with the at least one computer.
There is additionally provided in accordance with a preferred embodiment
of the present invention a method of playing with a toy, the method including selecting an action having an associated skill level, controlling a fanciful figure to perform the
action, and increasing the skill level over time.
Moreover in accordance with a preferred embodiment of the present invention the selecting step includes selecting a talking action.
Still further in accordance with a preferred embodiment of the present invention the increasing step includes increasing a vocabulary over time.
There is additionally provided in accordance with a preferred embodiment
of the present invention a method of playing with a toy, the method including providing
at least one fanciful figure, and controlling speech output apparatus to provide a speech output associated with the fanciful figure, wherein the speech output is in a special language.
Additionally in accordance with a preferred embodiment of the present
invention the controlling step includes generating at least part of the special language.
Moreover in accordance with a preferred embodiment of the present
invention the generating step includes generating the at least part of the special language by modifying at least one known language according to at least one language
modification rule.
Still further in accordance with a preferred embodiment of the present
invention the generating step includes generating the at least part of the special language
from a predefined set of phonemes.
Additionally in accordance with a preferred embodiment of the present
invention the method includes controlling the at least one fanciful figure to perform an
action associated with the speech output.
There is additionally provided in accordance with a preferred embodiment
of the present invention a method of playing with a toy, the method including providing at least one fanciful figure, controlling the at least one fanciful figure to produce an action, and accepting at least one speech input for association with the action.
Moreover in accordance with a preferred embodiment of the present invention the controlling-action step includes articulating at least one appendage of the
fanciful figure.
Still further in accordance with a preferred embodiment of the present
invention the method includes controlling speech output apparatus to provide a speech output associated with the fanciful figure.
Additionally in accordance with a preferred embodiment of the present invention the controlling speech output step further includes providing the speech output
associated with the action.
Moreover in accordance with a preferred embodiment of the present invention the controlling speech output step further includes providing the speech output
in a previously unknown language.
There is thus provided in accordance with a preferred embodiment of the
present invention a wireless computer controlled toy system including a computer system
operative to transmit a first transmission via a first wireless transmitter and at least one
toy including a first wireless receiver, the toy receiving the first transmission via the first wireless receiver and operative to carry out at least one action based on the first
transmission.
The computer system may include a computer game. The toy may include
a plurality of toys, and the at least one action may include a plurality of actions.
The first transmission may include a digital signal. The first transmission may also include an analog signal, and the analog signal may include sound.
Additionally in accordance with a preferred embodiment of the present invention the computer system includes a computer having a MIDI port and wherein the computer may be operative to transmit the digital signal by way of the MIDI port.
Additionally in accordance with a preferred embodiment of the present
invention the sound includes music, a pre-recorded sound and/or speech. The speech may include recorded speech and synthesized speech.
Further in accordance with a preferred embodiment of the present
invention the at least one toy has a plurality of states including at least a sleep state and
an awake state, and the first transmission includes a state transition command, and the at least one action includes transitioning between the sleep state and the awake state.
A sleep state may typically include a state in which the toy consumes a
reduced amount of energy and/or in which the toy is largely inactive, while an awake state is typically a state of normal operation.
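By way of illustration, a minimal sketch of such a two-state toy, assuming a simple toggling state transition command, follows; the message name is a hypothetical stand-in:

```python
from enum import Enum

class ToyState(Enum):
    SLEEP = 0   # reduced energy consumption, largely inactive
    AWAKE = 1   # normal operation

class StatefulToy:
    def __init__(self):
        self.state = ToyState.SLEEP

    def on_transmission(self, message):
        """A state transition command toggles between sleep and awake."""
        if message == "STATE_TRANSITION":
            self.state = (ToyState.AWAKE if self.state is ToyState.SLEEP
                          else ToyState.SLEEP)

toy = StatefulToy()
toy.on_transmission("STATE_TRANSITION")
print(toy.state)   # ToyState.AWAKE
```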
Still further in accordance with a preferred embodiment of the present invention the first transmission includes a control command chosen from a plurality of
available control commands based, at least in part, on a result of operation of the
computer game.
Additionally in accordance with a preferred embodiment of the present invention the computer system includes a plurality of computers.
Additionally in accordance with a preferred embodiment of the present
invention the first transmission includes computer identification data and the second
transmission includes computer identification data.
Additionally in accordance with a preferred embodiment of the present
invention the at least one toy is operative to transmit a second transmission via a second wireless transmitter and the computer system is operative to receive the second transmission via a second wireless receiver.
Moreover in accordance with a preferred embodiment of the present invention the system includes at least one input device and the second transmission
includes a status of the at least one input device.
Additionally in accordance with a preferred embodiment of the invention
the at least one toy includes at least a first toy and a second toy, and wherein the first toy is operative to transmit a toy-to-toy transmission to the second toy via the second
wireless transmitter, and wherein the second toy is operative to carry out at least one
action based on the toy-to-toy transmission.
Further in accordance with a preferred embodiment of the present
invention operation of the computer system is controlled, at least in part, by the second
transmission.
Moreover in accordance with a preferred embodiment of the present
invention the computer system includes a computer game, and wherein operation of the
game is controlled, at least in part, by the second transmission.
The second transmission may include a digital signal and/or an analog
signal.
Still further in accordance with a preferred embodiment of the present
invention the computer system has a plurality of states including at least a sleep state and
an awake state, and the second transmission includes a state transition command, and the
computer is operative, upon receiving the second transmission, to transition between the
sleep state and the awake state. Still further in accordance with a preferred embodiment of the present
invention at least one toy includes sound input apparatus, and the second transmission
includes a sound signal which represents a sound input via the sound input apparatus.
Additionally in accordance with a preferred embodiment of the present
invention the computer system is also operative to perform at least one of the following actions: manipulate the sound signal; and play the sound signal.
Additionally in accordance with a preferred embodiment of the present
invention the sound includes speech, and the computer system is operative to perform a speech recognition operation on the speech.
Further in accordance with a preferred embodiment of the present invention the second transmission includes toy identification data, and the computer
system is operative to identify the at least one toy based, at least in part, on the toy
identification data.
Still further in accordance with a preferred embodiment of the present invention the first transmission includes toy identification data. The computer system
may adapt a mode of operation thereof based, at least in part, on the toy identification
data.
Still further in accordance with a preferred embodiment of the present
invention the at least one action may include movement of the toy, movement of a part of
the toy and/or an output of a sound. The sound may be transmitted using a MIDI
protocol.
There is also provided in accordance with another preferred embodiment
of the present invention a game system including a computer system operative to control
a computer game and having a display operative to display at least one display object, and at least one toy in wireless communication with the computer system, the computer
game including a plurality of game objects, and the plurality of game objects includes the at least one display object and the at least one toy.
Further in accordance with a preferred embodiment of the present
invention the at least one toy is operative to transmit toy identification data to the
computer system, and the computer system is operative to adapt a mode of operation of
the computer game based, at least in part, on the toy identification data.
The computer system may include a plurality of computers.
Additionally in accordance with a preferred embodiment of the present invention the first transmission includes computer identification data and the second transmission includes computer identification data.
There is also provided in accordance with a preferred embodiment of the present invention a data transmission apparatus including first wireless apparatus
including musical instrument digital interface (MIDI) apparatus operative to receive and
transmit MIDI data between a first wireless and a first MIDI device and second wireless
apparatus including MIDI apparatus operative to receive and transmit MIDI data
between a second wireless and a second MIDI device, the first wireless apparatus is
operative to transmit MIDI data including data received from the first MIDI device to the second wireless apparatus, and to transmit MIDI data including data received from
the second wireless apparatus to the first MIDI device, and the second wireless
apparatus is operative to transmit MIDI data including data received from the second MIDI device to the first wireless apparatus, and to transmit MIDI data including data received from the first wireless apparatus to the second MIDI device.
Further in accordance with a preferred embodiment of the present invention the second wireless apparatus includes a plurality of wirelesses, each
respectively associated with one of the plurality of MIDI devices, and each of the second plurality of wirelesses is operative to transmit MIDI data including data received from the associated MIDI device to the first wireless apparatus, and to transmit MIDI data including data received from the first wireless apparatus to the associated MIDI device.
The first MIDI device may include a computer, while the second MIDI
device may include a toy.
Additionally in accordance with a preferred embodiment of the present
invention the first wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the first wireless and a first analog device,
and the second wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the second wireless and a second analog device, and the first wireless apparatus is also operative to transmit analog signals
including signals received from the first analog device to the second wireless apparatus,
and to transmit analog signals including signals received from the second wireless
apparatus to the first analog device, and the second wireless apparatus is also operative
to transmit analog signals including signals received from the second analog device to the
first wireless apparatus, and to transmit analog signals including signals received from the
first wireless apparatus to the second analog device.
There is also provided in accordance with another preferred embodiment
of the present invention a method for generating control instructions for a computer controlled toy system, the method includes selecting a toy, selecting at least one command from among a plurality of commands associated with the toy, and generating
control instructions for the toy including the at least one command.
Further in accordance with a preferred embodiment of the present invention the step of selecting at least one command includes choosing a command, and
specifying at least one control parameter associated with the chosen command.
Still further in accordance with a preferred embodiment of the present invention the at least one control parameter includes at least one condition depending on
a result of a previous command.
Additionally in accordance with a preferred embodiment of the present
invention at least one of the steps of selecting a toy and the step of selecting at least one
command includes utilizing a graphical user interface.
Still further in accordance with a preferred embodiment of the present invention the previous command includes a previous command associated with a second
toy.
Additionally in accordance with a preferred embodiment of the present
invention the at least one control parameter includes an execution condition controlling
execution of the command.
The execution condition may include a time at which to perform the
command and/or a time at which to cease performing the command. The execution
condition may also include a status of the toy.
Additionally in accordance with a preferred embodiment of the present
invention the at least one control parameter includes a command modifier modifying
execution of the command. Still further in accordance with a preferred embodiment of the present
invention the at least one control parameter includes a condition dependent on a future
event.
Additionally in accordance with a preferred embodiment of the present invention the at least one command includes a command to cancel a previous command.
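A minimal sketch of how one such control instruction, with its execution conditions and modifiers, might be represented follows; the field names are illustrative assumptions, not the patent's data format:

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical shape of one generated control instruction; fields mirror
# the control parameters described above (execution conditions, modifiers,
# dependence on a previous command, cancellation).
@dataclass
class ControlInstruction:
    toy_id: str
    command: str                           # chosen from the toy's command set
    start_time: Optional[float] = None     # execution condition: when to start
    stop_time: Optional[float] = None      # execution condition: when to cease
    required_status: Optional[str] = None  # e.g. run only if the toy is awake
    modifiers: dict = field(default_factory=dict)   # command modifiers
    depends_on: Optional[str] = None       # condition on a previous command

instructions = [
    ControlInstruction("bear", "move_arm", start_time=0.0, stop_time=2.5),
    ControlInstruction("bear", "cancel", depends_on="move_arm"),
]
print(instructions[0])
```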
There is also provided in accordance with a preferred embodiment of
the present invention a signal transmission apparatus for use in conjunction with a
computer, the apparatus including wireless transmission apparatus; and signal processing
apparatus including at least one of the following: analog/digital sound conversion apparatus operative to convert analog sound signals to digital sound signals, to convert digital sound signals to analog sound signals, and to transmit the signals between the
computer and a sound device using the wireless transmission apparatus; a peripheral control interface operative to transmit control signals between the computer and a
peripheral device using the wireless transmission apparatus; and a MIDI interface
operative to transmit MIDI signals between the computer and a MIDI device using the
wireless transmission apparatus.
There is also provided in accordance with another preferred embodiment
of the present invention a computer system including a computer, and a sound card operatively attached to the computer and having a MIDI connector and at least one
analog connector, wherein the computer is operative to transmit digital signals by means
of the MIDI connector and to transmit analog signals by means of the at least one analog connector. Further in accordance with a preferred embodiment of the present invention the computer is also operative to receive digital signals by means of the MIDI connector and to receive analog signals by means of the at least one analog connector.
It is noted that throughout the specification and claims the term "special language" is intended to include any language other than languages of common discourse
such as English, French, Swahili and Urdu.
It is further noted that throughout the specification and claims the term "fanciful figure" is intended to include any 2D or 3D real or virtual figure, which may or may not be based on fact, which is made or designed in a curious, intricate, imaginative
or whimsical way.
It is also noted that throughout the specification and claims the term
"radio" includes all forms of "wireless" communication.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated from the
following detailed description, taken in conjunction with the drawings in which:
Figs. 1 - 32C illustrate a toy system for use in conjunction with a
computer system wherein:
Fig. 1A is a partly pictorial, partly block diagram illustration of a
computer control system including a toy, constructed and operative in accordance with a
preferred embodiment of the present invention;
Fig. 1B is a partly pictorial, partly block diagram illustration of a preferred implementation of the toy 122 of Fig. 1A;
Fig. 1C is a partly pictorial, partly block diagram illustration of a
computer control system including a toy, constructed and operative in accordance with
an alternative preferred embodiment of the present invention;
Figs. 2A - 2C are simplified pictorial illustrations of a portion of the
system of Fig. 1A in use;
Fig. 3 is a simplified block diagram of a preferred implementation of the
computer radio interface 110 of Fig. 1A;
Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3;
Figs. 5A - 5D taken together comprise a schematic diagram of the apparatus of Fig. 4;
Fig. 5E is a schematic diagram of an alternative implementation of the
apparatus of Fig. 5D;
Fig. 6 is a simplified block diagram of a preferred implementation of the
toy control device 130 of Fig. 1A;
Figs. 7A - 7F, taken together with either Fig. 5D or Fig. 5E, comprise a
schematic diagram of the apparatus of Fig. 6;
Fig. 8A is a simplified flowchart illustration of a preferred method for
receiving radio signals, executing commands comprised therein, and sending radio
signals, within the toy control device 130 of Fig. 1A;
Figs. 8B - 8T, taken together, comprise a simplified flowchart illustration
of a preferred implementation of the method of Fig. 8A;
Fig. 9A is a simplified flowchart illustration of a preferred method for
receiving MIDI signals, receiving radio signals, executing commands comprised therein, sending radio signals, and sending MIDI signals, within the computer radio interface 110
of Fig. 1A;
Figs. 9B - 9N, taken together with Figs. 8D - 8M, comprise a simplified
flowchart illustration of a preferred implementation of the method of Fig. 9A;
Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A;
Fig. 11 is a simplified flowchart illustration of a preferred method for
generating control instructions for the apparatus of Fig. 1A;
Figs. 12A - 12C are pictorial illustrations of a preferred implementation of
a graphical user interface implementation of the method of Fig. 11;
Fig. 13 is a block diagram of a first sub-unit of a multi-port multi-channel
implementation of the computer radio interface 110 of Fig. 1A, which sub-unit resides within computer 100 of Fig. 1A;
Fig. 14 is a block diagram of a second sub-unit of a multi-port multi-channel implementation of the computer radio interface 110 of Fig. 1A, which sub-unit
complements the apparatus of Fig. 13 and resides exteriorly to computer 100 of Fig. 1A;
Figs. 15A - 15E, taken together, form a detailed electronic schematic
diagram of the toy control device of Fig. 6, suitable for the multi-channel implementation
of Figs. 13 and 14;
Fig. 16 is a simplified flowchart illustration of a preferred method by
which a computer selects a control channel pair in anticipation of a toy becoming available and starts a game-defining communication over the control channel each time both a toy and a transceiver of the computer radio interface are available;
Fig. 17 is a simplified flowchart illustration of a preferred method for implementing the "select control channel pair" step of Fig. 16;
Fig. 18A is a simplified flowchart illustration of a preferred method for
implementing the "select information communication channel pair" step of Fig. 16;
Fig. 18B is a simplified flowchart illustration of a preferred method for
performing the "locate computer" step of Fig. 18A;
Fig. 19 is a simplified flowchart illustration of a preferred method of
operation of the toy control device 130;
Fig. 20 is a simplified illustration of a remote game server in association
with a wireless computer controlled toy system which may include a network computer;
Fig. 21 is a simplified flowchart illustration of the operation of the computer or of the network computer of Fig. 20, when operating in conjunction with the
remote server;
Fig. 22 is a simplified flowchart illustration of the operation of the remote
game server of Fig. 20;
Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless
computer controlled toy system including a proximity detection subsystem operative to
detect proximity between the toy and the computer;
Figs. 24A - 24E, taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
which is similar to the detailed electronic schematic diagrams of Figs. 5A - 5D except for
being multi-channel, therefore capable of supporting full duplex applications, rather than
single-channel; Figs. 25 A - 25E, taken together, form a detailed schematic illustration of
a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer;
Figs. 26A - 26D, taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to
the sound board of the computer;
Figs. 27A - 27J are preferred flowchart illustrations of a preferred radio
coding technique which is an alternative to the radio coding technique described above with reference to Figs. 8E, 8G - 8M and 10A - C;
Figs. 28A - 28K, taken together, form a detailed electronic schematic
diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 13;
Figs. 29A - 29I, taken together, form a detailed electronic schematic
Fig. 30 is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a further
preferred embodiment of the present invention;
Fig. 31 is a simplified block diagram illustrating the
combination of the computer radio interface and the toy control device as used in the
embodiment of Fig. 30; and
Figs. 32A, 32B and 32C, taken together, form a simplified block diagram
of the EPLD chip of Fig. 28H; and
Figs. 33 - 43 illustrate embodiments of the toy system of Figs. 1 - 32C,
in which a computer-controlled toy system has a capacity for modifying a known
language and/or speaking in a previously unknown or whimsical language, wherein:
Fig. 33 is a simplified pictorial illustration of a display-based fanciful figure interaction system constructed and operative in accordance with a preferred embodiment of the present invention;
Figs. 34A and 34B, taken together, are simplified pictorial illustrations of a toy-based fanciful figure interaction system constructed and operative in accordance
with another preferred embodiment of the present invention;
Fig. 34C is a simplified pictorial illustration of the toy-based fanciful figure of Figs. 34A and 34B;
Fig. 35 is a simplified block diagram of a fanciful figure interaction system
useful in the systems of Figs. 33, 34A, 34B, and 34C;
Fig. 36 is a simplified operational flow chart of a fanciful figure
interaction system useful in describing the systems of Figs. 33, 34A, 34B, 34C, and 35;
Fig. 37 is a simplified operational flow chart of a preferred implementation of step 3440 of Fig. 36;
Fig. 38 is a simplified operational flow chart of a preferred
implementation of step 3450 of Fig. 36;
Fig. 39 is a simplified operational flow chart of a preferred
implementation of step 3470 of Fig. 36;
Fig. 40 is a simplified operational flow chart of a preferred implementation of step 3490 of Fig. 36;
Fig. 41 is a simplified block diagram of a preferred logical implementation
of the various sets described with reference to Fig. 35; and
Figs. 42 and 43, taken together, are simplified block diagrams of possible
implementations of various tables described in Fig. 41.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Fig. 1A, which is a partly pictorial, partly block
operative in accordance with a preferred embodiment of the present invention. The system of Fig. 1A comprises a computer 100, which may be any suitable computer such
as, for example, an IBM-compatible personal computer. The computer 100 is equipped
with a screen 105. The computer 100 is preferably equipped with a sound card such as,
for example, a Sound Blaster Pro card commercially available from Creative Labs, Inc., 1901 McCarthy Boulevard, Milpitas CA 95035 or from Creative Technology Ltd., 67
Ayer Rajah Crescent #03-18, Singapore, 0513; a hard disk; and, optionally, a CD-ROM
drive.
The computer 100 is equipped with a computer radio interface 110
operative to transmit signals via wireless transmission based on commands received from
the computer 100 and, in a preferred embodiment of the present invention, also to receive signals transmitted elsewhere via wireless transmission and to deliver the signals
to the computer 100. Typically, commands transmitted from the computer 100 to the
computer radio interface 110 are transmitted via both analog signals and digital signals,
with the digital signals typically being transmitted by way of a MIDI port. Transmission
of the analog and digital signals is described below with reference to Fig. 3.
The transmitted signal may be an analog signal or a digital signal. The received signal may also be an analog signal or a digital signal. Each signal typically comprises a message. A preferred implementation of the computer radio interface 110 is
described below with reference to Fig. 3. The system of Fig. 1A also comprises one or more toys 120. The system
of Fig. 1A comprises a plurality of toys, namely three toys 122, 124, and 126 but it is
appreciated that, alternatively, either one toy only or a large plurality of toys may be
used.
Reference is now additionally made to Fig. 1B, which is a partly pictorial, partly block diagram illustration of the toy 122 of Fig. 1A.
Each toy 120 comprises a power source 125, such as a battery or a
connection to line power. Each toy 120 also comprises a toy control device 130,
operative to receive a wireless signal transmitted by the computer 100 and to cause each toy 120 to perform an action based on the received signal. The received signal may be, as explained above, an analog signal or a digital signal. A preferred implementation of the
toy control device 130 is described below with reference to Fig. 6.
Each toy 120 preferably comprises a plurality of input devices 140 and
output devices 150, as seen in Fig. IB. The input devices 140 may comprise, for example
on or more of the following: a microphone 141; a microswitch sensor 142; a touch sensor (not shown in Fig. IB); a light sensor (not shown in Fig. IB); a movement sensor
143, which may be, for example, a tilt sensor or an acceleration sensor. Appropriate
commercially available input devices include the following: position sensors available
from Hamlin Inc., 612 East Lake Street, Lake Mills, WI 53551, USA; motion and
vibration sensors available from Comus International, 263 Hillside Avenue, Nutley, New
Jersey 07110, USA; temperature, shock, and magnetic sensors available from Murata Electronics Ltd., Hampshire, England; and switches available from C & K Components
Inc., 15 Riverdale Avenue, Newton, MA 02058-1082, USA or from Micro Switch Inc.,
a division of Honeywell, USA. The output devices 150 may comprise, for example, one or more of the following: a speaker 151; a light 152; a solenoid 153 which may be
operative to move a portion of the toy; a motor, such as a stepping motor, operative to
move a portion of the toy or all of the toy (not shown in Fig. 1B). Appropriate
commercially available output devices include the following: DC motors available from Alkatel (dunkermotoren), Postfach 1240, D-7823, BonndorCSchwarzald, Germany;
stepping motors and miniature motors available from Haydon Switch and Instruments, Inc. (HSI), 1500 Meriden Road, Waterbury, CT, USA; and DC solenoids available from Communications Instruments, Inc., P.O. Box 520, Fairview, North Carolina 28730, USA.
Examples of actions which the toy may perform include the following:
move a portion of the toy; move the entire toy; or produce a sound, which may comprise
one or more of the following: a recorded sound, a synthesized sound, music including recorded music or synthesized music, speech including recorded speech or synthesized speech.
The received signal may comprise a condition governing the action as, for
example, the duration of the action, or the number of repetitions of the action.
Typically, the portion of the received signal comprising a message
comprising a command to perform a specific action as, for example, to produce a sound with a given duration, comprises a digital signal. The portion of the received signal
comprising a sound, for example, typically comprises an analog signal. Alternatively, in a
preferred embodiment of the present invention, the portion of the received signal
comprising a sound, including music, may comprise a digital signal, typically a signal
comprising MIDI data. The action the toy may perform also includes reacting to signals transmitted by another toy, such as, for example, playing sound that the other toy is monitoring and transmitting.
In a preferred embodiment of the present invention, the toy control device
130 is also operative to transmit a signal intended for the computer 100, to be received
by the computer radio interface 110. In this embodiment, the computer radio interface
110 is preferably also operative to poll the toy control device 130, that is, transmit a
signal comprising a request that the toy control device 130 transmit a signal to the computer radio interface 110. It is appreciated that polling is particularly preferred in the
case where there are a plurality of toys having a plurality of toy control devices 130.
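By way of illustration, a polling loop on the computer side might be sketched as follows; send, receive, and handle_status are hypothetical stand-ins for the computer radio interface 110 and the application, not a real API:

```python
import time

# Hypothetical polling loop: the computer asks each toy control device in
# turn to transmit, so that multiple toys can share the radio channel.
def poll_toys(toy_ids, send, receive, handle_status, interval=0.1):
    """Request a transmission from each toy control device in turn."""
    while True:
        for toy_id in toy_ids:
            send({"to": toy_id, "request": "SEND_STATUS"})
            reply = receive(timeout=0.05)     # None if the toy stays silent
            if reply and reply.get("from") == toy_id:
                handle_status(reply)
        time.sleep(interval)
```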
The signal transmitted by the toy control device 130 may comprise one or more of the following: sound, typically sound captured by a microphone input device
141; status of sensor input devices 140 such as, for example, light sensors or microswitches; an indication of low power in the power source 125; or information identifying the toy.
It is appreciated that a sound signal transmitted by the device 130 may
also include speech. The computer system is operative to perform a speech recognition
operation on the speech signals.
Appropriate commercially available software for speech recognition is
available from companies such as: Stylus Innovation Inc., One Kendall Square, Building
300, Cambridge, MA 02139, USA; A&G Graphics Interface, USA, Telephone No. (617)
492-0120, Telefax No. (617) 427-3625; "Dragon Dictate For Windows", available from Dragon Systems Inc., 320 Nevada Street, MA. 02160, USA, and "SDK" available from
Lernout & Hauspie Speech Products, Sint-Krispijnstraat 7, 8900 Ieper, Belgium.
The signal from the radio control interface 110 may also comprise, for example, one or more of the following: a request to ignore input from one or more input
devices 140; a request to activate one or more input devices 140 or to stop ignoring
input from one or more input devices 140; a request to report the status of one or more input devices 140; a request to store data received from one or more input devices 140,
typically by latching a transition in the state of one or more input devices 140, until a future time when another signal from the radio control interface 110 requests the toy
control device 130 to transmit a signal comprising the stored data received from the one
or more input devices 140; or a request to transmit analog data, typically comprising sound, typically for a specified period of time.
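A minimal toy-side dispatch for these request types might be sketched as follows; the request names and data layout are illustrative assumptions, not the patent's protocol:

```python
# Hypothetical dispatch in the toy control device for the requests above:
# ignore/activate inputs, report status, latch transitions, send stored data.
class ToyControlDevice:
    def __init__(self, inputs):
        self.inputs = inputs      # current input-device readings, by name
        self.ignored = set()
        self.latched = {}         # stored transitions awaiting a request

    def handle(self, request, target=None):
        if request == "IGNORE_INPUT":
            self.ignored.add(target)
        elif request == "ACTIVATE_INPUT":
            self.ignored.discard(target)
        elif request == "REPORT_STATUS":
            return {name: value for name, value in self.inputs.items()
                    if name not in self.ignored}
        elif request == "STORE_DATA":      # latch a transition until asked
            self.latched[target] = self.inputs.get(target)
        elif request == "SEND_STORED":
            stored, self.latched = self.latched, {}
            return stored

device = ToyControlDevice({"tilt": 1, "microswitch": 0})
print(device.handle("REPORT_STATUS"))   # -> {'tilt': 1, 'microswitch': 0}
```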
Typically, all signals transmitted in both directions between the computer
radio interface 110 and the toy control device 130 include information identifying the
toy.
Reference is now made to Fig. 1C, which is a partly pictorial, partly block
diagram illustration of a computer control system including a toy, constructed and
operative in accordance with an alternative preferred embodiment of the present
invention. The system of Fig. 1C comprises two computers 100. It is appreciated that, in
general, a plurality of computers 100 may be used. In the implementation of Fig. 1C, all
signals transmitted in both directions between the computer radio interface 110 and the
toy control device 130 typically include information identifying the computer.
The operation of the system of Fig. 1A is now briefly described.
Typically, the computer 100 runs software comprising a computer game, typically a
game including at least one animated character. Alternatively, the software may comprise
educational software or any other interactive software including at least one animated object. As used herein, the term "animated object" includes any object which may be
depicted on the computer screen 105 and which interacts with the user of the computer via input to and output from the computer. An animated object may be any object depicted on the screen such as, for example: a doll; an action figure; a toy, such as, for example, an activity toy, a vehicle, or a ride-on vehicle; a drawing board or sketch board;
or a household object such as, for example, a clock, a lamp, a chamber pot, or an item of
furniture.
Reference is now additionally made to Figs. 2A - 2C, which depict a portion of the system of Fig. 1A in use. The apparatus of Fig. 2A comprises the computer screen 105 of Fig. 1A. On the computer screen are depicted animated objects 160 and 165.
Fig. 2B depicts the situation after the toy 122 has been brought into range
of the computer radio interface 110 of Fig. 1A, typically into the same room therewith. Preferably, the toy 122 corresponds to the animated object 160. For example, in Fig. 2B
the toy 122 and the animated object 160, shown in Fig. 2A, are both a teddy bear. The
apparatus of Fig. 2B comprises the computer screen 105, on which is depicted the
animated object 165. The apparatus of Fig. 2B also comprises the toy 122. The computer
100, having received a message via the computer radio interface 110, from the toy 122,
no longer displays the animated object 160 corresponding to the toy 122. The functions
of the animated object 160 are now performed through the toy 122, under control of the computer 100 through the computer radio interface 110 and the toy control device 130.
Fig. 2C depicts the situation after the toy 126 has also been brought into
range of the computer radio interface 110 of Fig. 1A, typically into the same room
therewith. Preferably, the toy 126 corresponds to the animated object 165. For example, in Fig. 2C the toy 126 and the animated object 165, shown in Figs. 2 A and 2B, are both a clock. The apparatus of Fig. 2C comprises the computer screen 105, on which no
animated objects are depicted.
The apparatus of Fig. 2C also comprises the toy 126. The computer 100,
having received a message via the computer radio interface 110 from the toy 126, no
longer displays the animated object 165 corresponding to the toy 126. The functions of the animated object 165 are now performed through the toy 126, under control of the
computer 100 through the computer radio interface 110 and the toy control device 130.
In Fig. 2A, the user interacts with the animated objects 160 and 165 on
the computer screen, typically using conventional methods. In Fig. 2B the user also
interacts with the toy 122, and in Fig. 2C typically with the toys 122 and 126, instead of interacting with the animated objects 160 and 165 respectively. It is appreciated that the
user may interact with the toys 122 and 126 by moving the toys or parts of the toys; by
speaking to the toys; by responding to movement of the toys which movement occurs in
response to a signal received from the computer 100; by responding to a sound produced
by the toys, which sound is produced in response to a signal received from the computer
100 and which may comprise music, speech, or another sound; or otherwise.
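By way of illustration, the handoff of Figs. 2A - 2C might be sketched as follows; all names here are hypothetical stand-ins rather than the patented mechanism:

```python
# Sketch of the handoff of Figs. 2A - 2C: when a toy announces itself over
# radio, its on-screen counterpart is hidden and the animated object's
# functions are routed to the physical toy instead.
def send_to_toy(toy_id, action):        # stand-in for radio interface 110
    print(f"radio -> {toy_id}: {action}")

def animate_on_screen(obj_id, action):  # stand-in for screen rendering
    print(f"screen -> {obj_id}: {action}")

class Game:
    def __init__(self, animated_objects):
        self.on_screen = dict(animated_objects)
        self.live_toys = {}

    def on_toy_message(self, toy_id):
        """Called when the computer radio interface hears from a toy."""
        if toy_id in self.on_screen:
            self.live_toys[toy_id] = self.on_screen.pop(toy_id)

    def perform(self, object_id, action):
        if object_id in self.live_toys:
            send_to_toy(object_id, action)
        else:
            animate_on_screen(object_id, action)

game = Game({"teddy": object(), "clock": object()})
game.perform("teddy", "wave")      # animated on screen
game.on_toy_message("teddy")       # physical teddy bear comes into range
game.perform("teddy", "wave")      # now routed to the toy over radio
```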
Reference is now made to Fig. 3 which is a simplified block diagram of a
preferred embodiment of the computer radio interface 110 of Fig. 1A. The apparatus of
Fig. 3 comprises the computer radio interface 110. The apparatus of Fig. 3 also comprises a sound card 190, as described above with reference to Fig. 1A. In Fig. 3, the
connections between the computer radio interface 110 and the sound card 190 are
shown. The computer radio interface 110 comprises a DC unit 200 which is fed
with power through a MIDI interface 210 from a sound card MIDI interface 194, and the following interfaces: the MIDI interface 210, which connects to the sound card MIDI
interface 194; an audio interface 220 which connects to an audio interface 192 of the sound card 190; and a secondary audio interface 230 which preferably connects to a
stereo sound system for producing high quality sound under control of software running
on the computer 100 (not shown).
The apparatus of Fig. 3 also comprises an antenna 240, which is operative
to send and receive signals between the computer radio interface 110 and one or more
toy control devices 130.
Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3. The apparatus of Fig. 4 comprises the DC unit 200, the MIDI interface
210, the audio interface 220, and the secondary audio interface 230. The apparatus of Fig. 4 also comprises a multiplexer 240, a micro controller 250, a radio transceiver 260,
a connection unit 270 connecting the radio transceiver 260 to the micro controller 250,
and a comparator 280.
Reference is now made to Figs. 5A - 5D, which taken together comprise
a schematic diagram of the apparatus of Fig. 4.
The following is a preferred parts list for the apparatus of Figs. 5A - 5C:
1. Kl Relay Dept, Idee, 1213 Elco Drive, Sunnyvale, Calif. 94089-2211,
USA.
2. Ul 8751 microcontroller, Intel Corporation, San Tomas 4, 2700 San Tomas Expressway, 2nd Floor, Santa Clara 95051, CA, USA.
3. U2 CXO - 12MHz (crystal oscillator), Raltron, 2315 N.W. 107th Avenue, Miami, Florida 33172, USA.
4. U4 MC33174, Motorola, Phoenix, AZ, USA., Tel. No. (602) 897-
5056.
5. Diodes 1N914, Motorola, Phoenix, AZ, USA. Tel. No. (602)897-
5056.
6. Transistors 2N2222 and MPSA14, Motorola, Phoenix, AZ, USA. Tel.
No. (602)897-5056.
The following is a preferred parts list for the apparatus of Fig. 5D:
1. Ul SILRAX-418-A UHF radio telemetry receive module, Ginsburg
Electronic GmbH, Am Moosfeld 85, D-81829, Munchen, Germany.
Alternatively, Ul of Fig. 5D may be replaced by:
Ul 433.92MHz Receive Module Part No. 0927, available from CEL
SALES LTD., Cel House, Unit 2, Block 6, Shenstone Trading Estate, Bromsgrove,
Halesowen, West Midlands B36 3XB, UK.
2. U2 TXM-418-A low power UHF radio telemetry transmit module,
Ginsburg Electronic GmbH, Am Moosfeld 85, D-81829, Munchen, Germany.
Alternatively, U2 of Fig. 5D may be replaced by:
U2 433.92 SIL FM Transmitter Module Part No. 5229, available
from CEL SALES LTD., Cel House, Unit 2, Block 6, Shenstone Trading Estate,
Bromsgrove, Halesowen, West Midlands B36 3XB UK.
Reference is now additionally made to Fig. 5E, which is a schematic
diagram of an alternative implementation of the apparatus of Fig. 5D. The following is a
preferred parts list for the apparatus of Fig. 5E: 1. Ul BIM-418-F low power UHF data transceiver module, Ginsburg Electronic GmbH, Am Moosfeld 85, D-81829, Munchen, Germany.
Alternate 1. Ul S20043 spread spectrum full duplex transceiver,
AMI Semiconductors - American Microsystems, Inc., Idaho, USA.
Alternate 1. U1 SDT-300 synthesized transceiver, Circuit Design,
Inc., Japan.
Alternatively, Ul may be replaced by:
Ul RY3GB021 RF 900Mhz units, available from SHARP
ELECTRONIC COMPONENTS GROUP, 5700 Northwest, Pacific Rim Boulevard #20, Camas, Washington, USA.
Ul RY3GB100 RF Units For DECT, available from SHARP
ELECTRONIC COMPONENTS GROUP 5700 Northwest, Pacific Rim Boulevard #20,
Camas, Washington, USA.
In the parts list for Fig. 5E, either item 1 or any of the alternate items 1
may be used for Ul.
It is appreciated that the appropriate changes will have to be made to all
the circuit boards for alternate embodiments of the apparatus.
The apparatus of Fig. 5E has similar functionality to the apparatus of Fig.
5D, but has higher bit rate transmission and reception capacity and is, for example,
preferred when MIDI data is transmitted and received.
Figs. 5A - 5E are self-explanatory with regard to the above parts lists.
Reference is now made to Fig. 6 which is a simplified block diagram of a
preferred embodiment of the toy control device 130 of Fig. 1A. The apparatus of Fig. 6
comprises a radio transceiver 260, similar to the radio transceiver 260 of Fig. 4. The apparatus of Fig. 6 also comprises a microcontroller 250 similar to the microcontroller
250 of Fig. 4.
The apparatus of Fig. 6 also comprises a digital input/output interface (digital I/O interface) 290, which is operative to provide an interface between the
microcontroller 250 and a plurality of input and output devices which may be connected
thereto, such as, for example, four input devices and four output devices. A preferred
reference to Figs. 7A - 7F.
The apparatus of Fig. 6 also comprises an analog input/output interface
(analog I/O interface) 300 operatively connected to the radio transceiver 260, and
operative to receive signals therefrom and to send signals thereto.
The apparatus of Fig. 6 also comprises a multiplexer 305 which is
operative, in response to a signal from the microcontroller 250, to provide output to the analog I/O interface 300 only when analog signals are being transmitted by the radio transceiver 260, and to pass input from the analog I/O interface 300 only when such
input is desired.
The apparatus of Fig. 6 also comprises input devices 140 and output
devices 150. In Fig. 6, the input devices 140 comprise, by way of example, a tilt switch
operatively connected to the digital I/O interface 290, and a microphone operatively connected to the analog I/O interface 300. It is appreciated that a wide variety of input
devices 140 may be used.
In Fig. 6, the output devices 150 comprise, by way of example, a DC
motor operatively connected to the digital I/O interface 290, and a speaker operatively connected to the analog I/O interface 300. It is appreciated that a wide variety of output
devices 150 may be used.
The apparatus of Fig. 6 also comprises a DC control 310, a preferred implementation of which is described in more detail below with reference to Figs. 7A -
7F.
The apparatus of Fig. 6 also comprises a comparator 280, similar to the
comparator 280 of Fig. 4.
The apparatus of Fig. 6 also comprises a power source 125, shown in Fig. 6 by way of example as batteries, operative to provide electrical power to the apparatus
of Fig. 6 via the DC control 310.
Reference is now made to Figs. 7A - 7F which, taken together with either
Fig. 5D or 5E, comprise a schematic diagram of the toy control device of Fig. 6. If the
schematic of Fig. 5E is employed to implement the computer radio interface of Fig. 4, using RY3GB021 as Ul of Fig. 5E, then the same schematic of Fig. 5E is preferably employed to implement the toy control device of Fig. 6, except that RY3GH021 is used
to implement Ul rather than RY3GB021.
The following is a preferred parts list for the apparatus of Figs. 7A - 7F:
1. Ul 8751 microcontroller, Intel Corporation, San Tomas 4, 2700 San
Tomas Expressway, 2nd Floor, Santa Clara 95051, CA USA.
2. U2 LM78L05, National Semiconductor, 2900 Semiconductor Drive,
Santa Clara, CA. 95052, USA.
3. U3 CXO - 12MHz (crystal oscillator), Raltron, 2315 N.W. 107th
Avenue, Miami, FL. 33172, USA. 4. U4 MC33174, Motorola, Phoenix, AZ, USA. Tel. No. (602) 897-
5056.
5. U5 MC34119, Motorola, Phoenix, AZ, USA. Tel. No. (602) 897-
5056.
6. U6 4066, Motorola, Phoenix, AZ, USA. Tel. No. (602) 897-5056.
7. Diode 1N914, 1N4005, Motorola, Phoenix, AZ, USA. Tel. No. (602) 897-5056.
8. Transistor 2N2222, 2N3906, Motorola, Phoenix, AZ, USA. Tel. No.
(602) 897-5056.
9. Transistors 2N2907 and MPSA14, Motorola, Phoenix, AZ, USA. Tel.
No. (602) 897-5056.
Figs. 7A - 7F are self-explanatory with reference to the above parts list.
As stated above with reference to Fig. 1A, the signals transmitted
between the computer radio interface 110 and the toy control device 130 may be either
analog signals or digital signals. In the case of digital signals, the digital signals preferably
comprise a plurality of predefined messages, known to both the computer 100 and to the
toy control device 130.
Each message sent by the computer radio interface 110 to the toy control
device 130 comprises an indication of the intended recipient of the message. Each
message sent by the toy control device 130 to the computer radio interface 110
comprises an indication of the sender of the message.
In the embodiment of Fig. 1C described above, messages also comprise
the following: each message sent by the computer radio interface 110 to the toy control device 130 comprises an indication of the sender of the message; and each message sent by the toy control device 130 to the computer radio interface 110 comprises an indication of the intended recipient of the message.
A preferred set of predefined messages is as follows:
COMMAND STRUCTURE
COMMANDS LIST
From the Computer to the Toy control device.
A. OUTPUT COMMANDS
SET IO TO DATA
Set Toy control device output pin to a digital level D.
P: Computer address 00-03 H
A: unit address - 00-FF H
IO: i/o number - 00-03 H
D: Data - 00-01 H
Example:
1. 01 00 00 05 00 01 03 01 00 00 set io 3 to "1"
2. 01 00 00 05 00 01 03 00 00 00 set io 3 to "0"
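By way of illustration only, the 10-byte frames above can be packed as in the following sketch; the field interpretation (a start byte, P, A, a fixed byte, command group and sub-command, then zero-padded parameters) is inferred from the examples and is an assumption, not a layout stated explicitly in the text.

```python
# Hypothetical packing of the 10-byte command frames shown above.

def build_command(p, a, group, sub, params=()):
    """Return a 10-byte frame like '01 00 00 05 00 01 03 01 00 00'."""
    frame = [0x01, p, a, 0x05, group, sub, *params]
    frame += [0x00] * (10 - len(frame))      # zero-pad to 10 bytes
    return bytes(frame)

# "set io 3 to 1" from example 1 above:
assert build_command(0x00, 0x00, 0x00, 0x01, (0x03, 0x01)).hex(" ") == \
    "01 00 00 05 00 01 03 01 00 00"
```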
CHANGE IO FOR TIME
Change Toy control device output pin to D for a period of time and then return to the previous state.
P: Computer address 00-03 H
A: unit address - 00-FF H
IO: i/o number - 00-03 H
T1,T2: time - 00-FF H
D: Data - 00-01 H
Example:
1. 01 00 00 05 00 02 03 05 00 00 set io 3 to "1" for 5 seconds
B. INPUT COMMANDS
SEND STATUS OF SENSORS
Send the Toy control device status of all sensors.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 01 00 00 00 00 00 send current status of sensors
SENSORS SCAN MODE ON
Start scanning the Toy control device sensors, and if one of them is closed (pressed to '0'), send back an ack.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 01 01 00 00 00 00 scan mode of sensors ON
SENSORS SCAN MODE ON ONCE
Start scanning the Toy control device sensors, and if one of them is closed (pressed to '0'), send back an ack, then disable scanning the sensors.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 01 02 00 00 00 00 scan mode of sensors ON once
SENSORS SCAN MODE OFF
Stop scanning the Toy control device sensors.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 01 03 00 00 00 00 scan mode of sensors OFF
C. AUDIO OUT COMMANDS
START AUDIO PLAY
Start playing audio in a speaker of the Toy control device. The audio is sent to the Toy control device by the computer sound card and the Computer radio interface.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 02 00 00 00 00 00 Start audio-play
STOP AUDIO PLAY
Stop playing audio in a speaker of the Toy control device.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 02 01 00 00 00 00 Stop audio-play
START AUDIO AND IO PLAY FOR TIME
Start playing audio in a speaker of the Toy control device and set an io pin to '1'. After time T, stop the audio and set IO to '0'. Start this command after a delay of td*100ms. If SC='1' then, after the execution of this command, start the input command SCAN SENSORS ON ONCE (if any sensor is pressed, even during the audio play, send a message to the computer).
P: Computer address 00-03 H
A: unit address - 00-FF H
IO: i/o number - 0-3 H (if IO>3 then don't set IO)
T0,T1,T2: TIME 00-FF H (*100ms) (T0=MMSB, T1=MSB, T2=LSB)
td: delay time before execute 0-F H (*100ms)
Example:
1. 01 00 00 05 02 04 80 2A 03 00 Start audio-play and IO # 3 for 64 seconds (640 = 280H); delay before execution = 10*100ms = 1 sec
2. 01 00 00 05 02 04 80 2A 13 00 Start audio-play and IO # 3 for 64 seconds and set scan sensors on once mode; delay before execution = 10*100ms = 1 sec
D. AUDIO IN COMMANDS
TRANSMIT MIC FOR TIME
Requests the Toy control device to transmit microphone audio from the Toy control device to the Computer radio interface and to the sound card of the computer for time T.
P: Computer address 00-03 H
A: unit address - 00-FF H
T1,T2: TIME 00-FF H (sec)
Example:
1. 01 00 00 05 03 00 0A 00 00 00 start mic mode for 10 seconds
E. GENERAL TOY COMMANDS
GOTO SLEEP MODE
Requests the Toy control device to go into power save mode (sleep).
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 04 01 00 00 00 00 switch the Toy control device into sleep mode
GOTO AWAKE MODE
Requests the Toy control device to go into an awake mode.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 04 02 00 00 00 00 switch the Toy control device into awake mode
TOY RESET
Requests the Toy control device to perform RESET.
P: Computer address 00-03 H
A: unit address - 00-FF H
Example:
1. 01 00 00 05 04 0F 00 00 00 00 Toy reset
TOY USE NEW RF CHANNELS
Requests the Toy control device to switch to new RF transmit and receive channels.
P: Computer address 00-03 H
A: unit address - 00-FF H
CH1: Transmit RF channel number 0-F H
CH2: Receive RF channel number 0-F H
Example:
1. 01 00 00 05 04 0A 12 00 00 00 Switch to new RX and TX RF channels
Note: This command is available only with enhanced radio modules (alternate U1 of Fig. 5E) or with the modules described in Figs. 15A - 15E and 24A - 24E.
F. TELEMETRY
Information sent by the Toy control device, as an ACK to the command received from the Computer radio interface.
OK ACK
Send back an ACK about the command that was received ok.
P: Computer address 00-03 H
A: unit address - 00-FF H
cmd1,2: Received command MSB ok ack 00-FF H
cmd3,4: Received command LSB ok ack 00-FF H
sen1,2: Sensors 0-7 status 00-FF H
Example:
1. 01 60 00 05 0A 00 01 01 FF 00 OK ack for 0101 command (sensors scan mode on command); status: all sensors are not pressed (FF); the computer radio interface number is 6
2. 01 60 00 05 0A 00 01 01 FE 00 OK ack for 0101 command (sensors scan mode on command); status: sensor # 8 is pressed (FE); the computer radio interface number is 6
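A rough sketch of decoding such a telemetry frame follows; the byte offsets and the bit sense of the sensor field are inferred from the two examples above (FF = none pressed) and are assumptions, not the patent's definitive layout.

```python
# Hypothetical decoder for the OK ACK frames shown above.

def parse_ok_ack(frame_hex):
    b = bytes.fromhex(frame_hex)
    return {
        "cri_number": b[1] >> 4,                     # example: 0x60 -> interface number 6
        "command": f"{b[6]:02X}{b[7]:02X}",          # echoed command, e.g. '0101'
        "sensor_bits": [(b[8] >> i) & 1 for i in range(8)],  # assumed: 1 = not pressed
    }

print(parse_ok_ack("016000050A000101FF00"))
# command '0101', all sensor bits 1 (none pressed), interface number 6
```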
G. REQUESTS
Requests sent by the Toy control device, after an event.
TOY IS AWAKE REQ
Send a message to the Computer radio interface if the Toy control device goes from sleep mode to awake mode.
P: Computer address 00-03 H
A: unit address - 00-FF H
c1,c2: status command AB H
Example:
1. 01 60 00 05 0A 00 AB 00 FF 00 Toy is awake message
H. CRI (Computer Radio Interface) commands
Commands that are sent only to the Computer radio interface.
SWITCH AUDIO OUT TO RADIO & TRANSMIT
Requests the Computer radio interface to switch audio out from the computer sound card to the radio wireless transceiver and transmit.
P: Computer address 00-03 H
SWITCH AUDIO OUT TO JACK & STOP TRANSMIT
Requests the Computer radio interface to switch audio out from the radio RF wireless transceiver to the speakers jack and to stop transmitting.
P: Computer address 00-03 H
MUTE RADIO
Mute the radio transmit.
P: Computer address 00-03 H
UN-MUTE RADIO
Un-mute the radio transmit.
CRI RESET
Perform software reset on the Computer radio interface unit.
P: Computer address 00-03 H
I. CRI - ACK
ACK sent only to the Computer by the Computer radio interface, only after CRI commands.
CRI COMMAND ACK
This is an ACK for a CRI command; this ACK is sent to the computer by the computer radio interface after executing a command successfully.
P: Computer address 00-03 H
cmd1,2: Received CRI command MSB ok ack 00-FF H
cmd3,4: Received CRI command LSB ok ack 00-FF H
Example:
1. 01 60 00 00 0D 00 0C 01 00 00 OK ack for 0C01 CRI command (SWITCH AUDIO OUT TO JACK); the computer radio interface number is 6
2. 01 60 00 00 0D 00 0C 0F 00 00 OK ack for 0C0F CRI command (CRI reset); the computer radio interface number is 6. This ack is also sent on POWER UP RESET.
Reference is now made to Fig. 8A, which is a simplified flowchart illustration of a preferred method for receiving radio signals, executing commands comprised therein, and sending radio signals, within the toy control device 130 of Fig.
1A. Typically, each message as described above comprises a command, which may
include a command to process information also comprised in the message. The method
of Fig. 8A preferably comprises the following steps:
A synchronization signal or preamble is detected (step 400). A header is
detected (step 403).
A command contained in the signal is received (step 405).
The command contained in the signal is executed (step 410). Executing the command may be as described above with reference to Fig. 1A.
A signal comprising a command intended for the computer radio interface 110 is sent (step 420).
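For illustration, a command dispatcher in the spirit of steps 405 - 410 might look like the following sketch; the handler set and frame layout are assumptions based on the command list above, not the actual firmware of the 8751 microcontroller.

```python
# Hypothetical dispatch of a received 10-byte frame inside the toy control device.

HANDLERS = {
    (0x00, 0x01): lambda f: print(f"set io {f[6]} to {f[7]}"),  # SET IO TO DATA
    (0x01, 0x00): lambda f: print("send status of sensors"),    # SEND STATUS OF SENSORS
    (0x04, 0x01): lambda f: print("entering sleep mode"),       # GOTO SLEEP MODE
}

def execute_frame(frame):
    """Step 410: look up and execute the command bytes of a received frame."""
    handler = HANDLERS.get((frame[4], frame[5]))
    if handler is not None:
        handler(frame)

execute_frame(bytes.fromhex("01000005000103010000"))  # prints: set io 3 to 1
```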
Reference is now made to Figs. 8B - 8T which, taken together, comprise
a simplified flowchart illustration of a preferred implementation of the method of Fig.
8A. The method of Figs. 8B - 8T is self-explanatory.
Reference is now made to Fig. 9A, which is a simplified flowchart
illustration of a preferred method for receiving MIDI signals, receiving radio signals,
executing commands comprised therein, sending radio signals, and sending MIDI signals,
within the computer radio interface 110 of Fig. 1A. Some of the steps of Fig. 9A are
identical to steps of Fig. 8A, described above. Fig. 9A also preferably comprises the following steps:
A MIDI command is received from the computer 100 (step 430). The
MIDI command may comprise a command intended to be transmitted to the toy control device 130, may comprise an audio in or audio out command, or may comprise a general
command.
A MIDI command is sent to the computer 100 (step 440). The MIDI command may comprise a signal received from the toy control device 130, may comprise a response to a MIDI command previously received by the computer radio interface 110 from the computer 100, or may comprise a general command.
The command contained in the MIDI command or in the received signal is executed (step 450). Executing the command may comprise, in the case of a received signal, reporting the command to the computer 100, whereupon the computer 100 may typically carry out any appropriate action under program control as, for example, changing a screen display or taking any other appropriate action in response to the received command. In the case of a MIDI command received from the computer 100, executing the command may comprise transmitting the command to the toy control device 130. Executing a MIDI command may also comprise switching audio output of the computer radio interface 110 between the secondary audio interface 230 and the radio transceiver 260. Normally the secondary audio interface 230 is directly connected to the audio interface 220, preserving the connection between the computer sound board and the peripheral audio devices such as speakers, microphone, and stereo system.
Reference is now made to Figs. 9B - 9N, and additionally reference is made back to Figs. 8D - 8M, all of which, taken together, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 9A. The method of Figs. 9B - 9M, taken together with Figs. 8D - 8M, is self-explanatory.
Reference is now additionally made to Figs. 10A - 10C, which are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A. Fig. 10A comprises a synchronization preamble. The duration T_SYNC of the synchronization preamble is preferably .500 millisecond, being preferably substantially equally divided into on and off
components.
Fig. 10B comprises a signal representing a bit with value 0, while Fig.
10C comprises a signal representing a bit with value 1.
It is appreciated that Figs. 10B and 10C refer to the case where the
apparatus of Fig. 5D is used. In the case of the apparatus of Fig. 5E, functionality corresponding to that depicted in Figs. 10B and 10C is provided within the apparatus of
Fig. 5E.
Preferably, each bit is assigned a predetermined duration T, which is the same for every bit. A frequency modulated carrier is transmitted, using the method of frequency modulation keying as is well known in the art. An "off" signal (typically less
than 0.7 Volts) presented at pin 5 of U2 in Fig. 5D causes a transmission at a
frequency below the median channel frequency. An "on" signal (typically over 2.3 Volts)
presented at pin 5 of U2 in Fig. 5D causes a transmission at a frequency above the
median frequency. These signals are received by the corresponding receiver U1. The output signal from pin 6 of U1 is fed to the comparator 280 of Figs. 4 and 6, which is operative to determine whether the received signal is "off" or "on", respectively.
It is also possible to use the comparator that is contained within U1 by connecting pin 7 of U1 of Fig. 5D, through pin 6 of the connector J1 of Fig. 5D, pin 6 of connector J1 of Fig. 5A, through the jumper to pin 12 of U1 of Fig. 5A.
Preferably, receipt of an on signal or spike of duration less than 0.01 * T
is ignored. Receipt of an on signal as shown in Fig. 10B, of duration between 0.01 * T and 0.40 * T is preferably taken to be a bit with value 0. Receipt of an on signal as shown in Fig. 10C, of duration greater than 0.40 * T is preferably taken to be a bit with
value 1. Typically, T has a value of 1.0 millisecond.
Furthermore, after receipt of an on signal, the duration of the subsequent off signal is measured. The sum of the durations of the on signal and the off signal must
be between 0.90 * T and 1.10 * T for the bit to be considered valid. Otherwise, the bit is
considered invalid and is ignored.
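The timing rules just described can be restated compactly in code; the following sketch simply encodes them as given (T = 1.0 millisecond, as stated above).

```python
T = 1.0  # bit period in milliseconds

def classify_bit(on_ms, off_ms):
    """Return 0 or 1 per the duration rules above, or None if the pulse is ignored."""
    if on_ms < 0.01 * T:
        return None                                   # spike: ignore
    if not (0.90 * T <= on_ms + off_ms <= 1.10 * T):
        return None                                   # invalid bit: ignore
    return 0 if on_ms <= 0.40 * T else 1              # short on = 0, long on = 1

assert classify_bit(0.2, 0.8) == 0
assert classify_bit(0.6, 0.4) == 1
assert classify_bit(0.005, 0.995) is None             # spike shorter than 0.01 * T
```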
Reference is now made to Fig. 11, which is a simplified flowchart illustration of a method for generating control instructions for the apparatus of Fig. 1A. The method of Fig. 11 preferably includes the following steps:
A toy is selected (step 550). At least one command is selected, preferably
from a plurality of commands associated with the selected toy (steps 560 - 580).
Alternatively, a command may be entered by selecting, modifying, and creating a new binary command (step 585).
Typically, selecting a command in steps 560 - 580 may include choosing a
command and specifying one or more control parameters associated with the command.
A control parameter may include, for example, a condition depending on a result of a
previous command, the previous command being associated either with the selected toy
or with another toy. A control parameter may also include an execution condition
governing execution of a command such as, for example: a condition stating that a
specified output is to occur based on a status of the toy, that is, if and only if a specified input is received; a condition stating that the command is to be performed at a specified time; a condition stating that performance of the command is to cease at a specified time;
a condition comprising a command modifier modifying execution of the command, such as, for example, to terminate execution of the command in a case where execution of the
command continues over a period of time; a condition dependent on the occurrence of a
future event; or another condition.
The command may comprise a command to cancel a previous command.
The output of the method of Fig. 11 typically comprises one or more
control instructions implementing the specified command, generated in step 590.
Typically, the one or more control instructions are comprised in a command file. Typically, the command file is called from a driver program which typically determines which command is to be executed at a given point in time and then calls the command
file associated with the given command.
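As a sketch of the arrangement just described, a driver program might schedule and call per-command files like this; the file names and the scheduling criterion are hypothetical.

```python
import time

def run_command_file(path):
    # Stand-in for loading and executing the control instructions in a command file.
    print(f"executing control instructions from {path}")

def driver(schedule):
    """Run each named command file at its scheduled offset, in order."""
    for delay_s, command_file in schedule:
        time.sleep(delay_s)
        run_command_file(command_file)

driver([(0.0, "wag_tail.cmd"), (0.5, "bark.cmd")])
```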
Preferably, a user of the method of Fig. 11 performs steps 550 and 560 using a computer having a graphical user interface. Reference is now made to Figs. 12A
- 12C, which are pictorial illustrations of a preferred embodiment of a graphical user interface implementation of the method of Fig. 11.
Fig. 12A comprises a toy selection area 600, comprising a plurality of toy
selection icons 610, each depicting a toy. The user of the graphical user interface of Figs.
12A - 12C typically selects one of the toy selection icons 610, indicating that a command
is to be specified for the selected toy.
Fig. 12A also typically comprises action buttons 620, typically comprising
one or more of the following:
a button allowing the user, typically an expert user, to enter a direct binary command implementing an advanced or particularly complex command not
otherwise available through the graphical user interface of Figs. 12A - 12C; a button allowing the user to install a new toy, thus adding a new toy
selection icon 610; and a button allowing the user to exit the graphical user interface of Figs. 12A
- 12C.
Fig. 12B depicts a command generator screen typically displayed after the
user has selected one of the toy selection icons 610 of Fig. 12A. Fig. 12B comprises an
610, and a text area 635 comprising text describing the selected toy.
Fig. 12B also comprises a plurality of command category buttons 640, each of which allow the user to select a category of commands such as, for example:
output commands; input commands; audio in commands; audio out commands; and general commands.
Fig. 12B also comprises a cancel button 645 to cancel command selection
and return to the screen of Fig. 12A.
Fig. 12C comprises a command selection area 650, allowing the user to specify a specific command. A wide variety of commands may be specified, and the
commands shown in Fig. 12C are shown by way of example only.
Fig. 12C also comprises a file name area 655, in which the user may
specify the name of the file which is to receive the generated control instructions. Fig.
12C also comprises a cancel button 645, similar to the cancel button 645 of Fig. 12B.
Fig. 12C also comprises a make button 660. When the user actuates the make button 660, the control instruction generator of Fig. 11 generates control instructions
implementing the chosen command for the chosen toy, and writes the control instructions
to the specified file. Fig. 12C also comprises a parameter selection area 665, in which the user may specify a parameter associated with the chosen command.
The steps for programming the microcontrollers of the present invention include the use of a universal programmer, such as the Universal Programmer, type EXPRO 60/80, manufactured by Sunshine Electronics Co. Ltd., Taipei, Taiwan.
The above-described embodiment of Fig. 1C includes a description of a preferred set of predefined messages including a category termed "General commands". Other General Commands are defined by the following description:
MULTIPORT COMMANDS
AVAILABILITY INTERROGATION COMMAND
A computer transmits this command to verify that the radio channel is vacant. If another computer is already using this channel it will respond with the Availability Response Command. If no response is received within 250 msec the channel is deemed vacant.
P: Computer address 00-03 H
A: unit address - 00-FF H
AVAILABILITY RESPONSE COMMAND
A computer transmits this command in response to an Availability Interrogation Command to announce that the radio channel is in use.
P: Computer address 00-03 H
A: unit address - 00-FF H
TOY AVAILABILITY COMMAND
A Toy transmits this command to declare its existence and receive in response a Channel Pair Selection Command designating the computer that will control it and the radio channels to use.
P: Computer address 00-03 H
A: unit address - 00-FF H
CHANNEL PAIR SELECTION COMMAND
A computer transmits this command in response to a Toy Availability Command to inform the toy of the radio channels to be used.
P: Computer address 00-03 H
A: unit address - 00-FF H
CH1: Toy transmit channel 0-F H
CH2: Toy receive channel 0-F H
In Figs. 13 and 14 there are illustrated block diagrams of a multiport, multi-channel implementation of the computer radio interface 110 of Fig. 1A. Fig. 13 illustrates the processing sub-unit of the computer interface, which is implemented as an add-in board installed inside a PC. Fig. 14 is the RF transceiver, which is a device external to the computer and connects to the processing subunit by means of a cable. In the present application of the RF unit there are 4 transceivers, each capable of utilizing two radio channels simultaneously.
Referring briefly to Fig. 3, it is appreciated that, optionally, both sound and control commands may be transmitted via the MIDI connector 210 rather than
transmitting sound commands via the analog connector 220. It is additionally appreciated
that the functions of the interfaces 210 and 220 between the computer radio interface 110 and the sound card 190 may, alternatively, be implemented as connections between the computer radio interface 110 and the serial and/or parallel ports of the computer 100, as shown in Figs. 25A - 25E and Figs. 26A - 26D, respectively.
If it is desired to provide full duplex communication, each transceiver 260
which forms part of the computer radio interface 110 of Fig. 1A preferably is operative
to transmit on a first channel pair and to receive on a different, second channel pair. The
transceiver 260 (Fig. 4) which forms part of the toy control device 130 of Fig. 1A
preferably is operative to transmit on the second channel pair and to receive on the first channel pair.
Any suitable technology may be employed to define at least two channel
pairs such as narrow band technology or spread spectrum technologies such as frequency
hopping technology or direct sequence technology, as illustrated in Figs. 15A - 15E, showing a Multi-Channel Computer Radio Interface, and in Figs. 24A - 24E showing a
Multi-Channel Toy Control Device.
Reference is now made to Fig. 16 which is a simplified flowchart
illustration of a preferred method of operation of a computer radio interface (CRI) 110
operative to service an individual computer 100 of Fig. 1A without interfering with other
computers or being interfered with by the other computers, each of which is similarly serviced by a similar CRI. Typically, the method of Fig. 16 is implemented in software on
the computer 100 of Fig. 1A.
The CRI includes a conventional radio transceiver (260 of Fig. 4) which
may, for example, comprise an RY3GB021 having 40 channels which are divided into
20 pairs of channels. Typically, 16 of the channel pairs are assigned to information
communication and the remaining 4 channel pairs are designated as control channels.
In the method of Fig. 16, one of the 4 control channel pairs is selected by
the radio interface (step 810) as described in detail below in Fig. 17. The selected control channel pair i is monitored by a first transceiver (step 820) to detect the appearance of a
new toy which is signaled by arrival of a toy availability command from the new toy (step
816). When the new toy is detected, an information communication channel pair is
selected (step 830) from among the 16 such channel pairs provided over which game
program information will be transmitted to the new toy. A preferred method for
implementing step 830 is illustrated in the self-explanatory flowchart of Fig. 18A. The "Locate Computer" command in Fig. 18A (step 1004) is illustrated in the flowchart of Fig. 18B.
The identity of the selected information communication channel pair, also
termed herein a "channel pair selection command", is sent over the control channel pair to the new toy (step 840). A game program is then begun (step 850), using the selected information communication channel pair. The control channel pair is then free to receive
and act upon a toy availability command received from another toy. Therefore, it is
desirable to assign another transceiver to that control channel pair since the current transceiver is now being used to provide communication between the game and the toy.
To assign a further transceiver to the now un-monitored control channel,
the transceiver which was formerly monitoring that control channel is marked as busy in
a transceiver availability table (step 852). The transceiver availability table is then
scanned until an available transceiver, i.e. a transceiver which is not marked as busy, is identified (step 854). This transceiver is then assigned to the control channel i (step 858).
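Steps 852 - 858 amount to a scan of a small availability table, along the lines of this sketch; the data layout is illustrative only.

```python
transceiver_busy = [False, False, False, False]   # availability table, 4 transceivers

def reassign_control_channel(current_transceiver, channel_i):
    """Free up channel_i by marking its transceiver busy and assigning another."""
    transceiver_busy[current_transceiver] = True          # step 852: mark busy
    for t, busy in enumerate(transceiver_busy):           # step 854: scan the table
        if not busy:
            return t                                      # step 858: assign to channel i
    return None                                           # no transceiver available

print(reassign_control_channel(0, 2))   # -> 1: transceiver 1 now monitors channel 2
```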
Fig. 17 is a simplified flowchart illustration of a preferred method for implementing "select control channel pair" step 810 of Fig. 16. In Fig. 17, the four
control channels are scanned. For each channel pair in which the noise level falls below a
certain threshold (step 895), the computer sends an availability interrogation command
(step 910) and waits for a predetermined time period, such as 250 ms, for a response (steps 930 and 940). If no other computer responds, i.e. sends back an "availability
response command", then the channel pair is deemed vacant. If the channel pair is found
to be occupied the next channel is scanned. If none of the four channel pairs are found to
be vacant, a "no control channel available" message is returned.
Fig. 19 is a self-explanatory flowchart illustration of a preferred method
of operation of the toy control device 130 which is useful in conjunction with the "multichannel" embodiment of Figs. 16 - 18B. i = 1, ..., 4 is an index of the control channels of
the system. The toy control device sends a "toy availability command" (step 1160) which is a message advertising the toy's availability, on each control channel i in turn (steps
1140, 1150, 1210), until a control channel is reached which is being monitored by a computer. This becomes apparent when the computer responds (step 1180) by
transmitting a "channel pair selection command" which is a message designating the information channel pair over which the toy control device may communicate with the game running on the computer. At this point (step 1190), the toy control device may begin receiving and executing game commands which the computer transmits over the
information channel pair designated in the control channel i.
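The toy-side counterpart of Fig. 19 can be sketched the same way, with the radio again abstracted behind a hypothetical callable.

```python
def find_computer(send_availability, n_channels=4):
    """send_availability(ch) sends a toy availability command on control channel ch
    and returns the channel pair selection, or None if no computer answered."""
    while True:                                   # steps 1140, 1150, 1210: cycle channels
        for ch in range(n_channels):
            selection = send_availability(ch)     # step 1160: advertise availability
            if selection is not None:             # step 1180: a computer responded
                return selection                  # step 1190: use these channels

print(find_computer(lambda ch: (5, 6) if ch == 2 else None))   # -> (5, 6)
```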
According to a preferred embodiment of the present invention, a
computer system is provided, in communication with a remote game server, as shown in Fig. 20. The remote game server 1250 is operative to serve to the computer 100 at least a portion of at least one toy-operating game, which operates one or more toys 1260.
Optionally, an entire game may be downloaded from the remote game server 1250.
However, alternatively, a new toy action script or new text files may be downloaded
from the remote game server 1250 whereas the remaining components of a particular
game may already be present in the memory of computer 100.
Downloading from the remote game server 1250 to the computer 100
may take place either off-line, before the game begins, or on-line, in the course of the
game. Alternatively, a first portion of the game may be received off-line whereas an
additional portion of the game is received on-line.
The communication between the remote game server 1250 and the
computer 100 may be based on any suitable technology such as but not limited to ISDN; X.25; Frame-Relay; and Internet.
An advantage of the embodiment of Fig. 20 is that a very simple
computerized device may be provided locally, i.e. adjacent to the toy, because all "intelligence" may be provided from a remote source. In particular, the computerized device may be less sophisticated than a personal computer, may lack a display monitor of its own, and may, for example, comprise a network computer 1270.
Fig. 21 is a simplified flowchart illustration of the operation of the computer 100 or of the network computer 1270 of Fig. 20, when operating in
conjunction with the remote server 1250.
Fig. 22 is a simplified flowchart illustration of the operation of the remote game server 1250 of Fig. 20.
Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless
computer controlled toy system including a toy 1500 having a toy control device 1504, a
computer 1510 communicating with the toy control device 1504 by means of a computer-radio interface 1514 and a proximity detection subsystem operative to detect
proximity between the toy and the computer. The proximity detection subsystem may for example include a pair of ultrasound transducers 1520 and 1530 associated with the toy and computer respectively. The toy's ultrasound transducer 1520 typically broadcasts
ultrasonic signals which the computer's ultrasound transducer 1530 detects if the
computer and toy are within ultrasonic communication range, e.g. are in the same room.
Figs. 24A - 24E, taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
which is similar to the detailed electronic schematic diagrams of Figs. 5A - 5D except for
being multi-channel, therefore capable of supporting full duplex applications, rather than
single-channel.
Figs. 25A - 25E, taken together, form a detailed schematic illustration of a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer. Figs. 26A - 26D, taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to
the sound board of the computer.
Figs. 27A - 27J are self-explanatory flowchart illustrations of a preferred radio coding technique, based on Manchester coding, which is an alternative to the radio coding technique described above with reference to Figs. 8E, 8G - 8M and 10A - 10C.
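The text does not spell out the exact Manchester variant used in Figs. 27A - 27J, so the following is a generic Manchester coder for illustration only, using the common convention that encodes 1 as a high-to-low symbol pair and 0 as a low-to-high pair.

```python
def manchester_encode(bits):
    out = []
    for bit in bits:
        out.extend((1, 0) if bit else (0, 1))   # each data bit becomes two symbols
    return out

def manchester_decode(symbols):
    # Each (first, second) symbol pair maps back to one data bit.
    return [1 if pair == (1, 0) else 0
            for pair in zip(symbols[::2], symbols[1::2])]

data = [1, 0, 1, 1, 0]
assert manchester_decode(manchester_encode(data)) == data
```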
Figs. 28A - 28K, taken together, form a detailed electronic schematic
diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 13.
Figs. 29A - 29I, taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14.
Fig. 30 illustrates a further embodiment of the present invention which includes a combination of a Computer Radio Interface (CRI) and a Toy Control Device
(TCD), 1610.
The combined unit 1610 controls a toy 1620 which is connected to the
computer 100 by a device, such as a cable, and communicates with other toys, 120, by
means such as radio communication, using the computer radio interface 110. The toy
1620 is operated in a similar manner as the toy device 120.
Fig. 31 illustrates a simplified block diagram of the combined unit 1610.
Figs. 32A, 32B and 32C taken together form a simplified schematic diagram of the EP900 EPLD chip (U9) of Fig. 28H. The code to program the EPLD
chip for this schematic diagram preferably uses the programming package "Max Plus II
Ver. 6.2" available from Altera Corporation, 3525 Monroe Street, Santa Clara, CA. 5051, USA. Figs. 33 - 43, described hereinbelow, illustrate embodiments of the toy system of Figs. 1 - 32C in which a computer-controlled toy system has a capacity for
modifying a known language and/or speaking in a previously unknown or whimsical
language.
Reference is now made to Fig. 33 which is a simplified pictorial
illustration of a display-based fanciful figure interaction system constructed and operative in accordance with a preferred embodiment of the present invention. Shown is a
computer 2200 on which a fanciful figure 2210 is displayed. Computer 2200 is
preferably configured with an audio input device 2220, typically a microphone, through which computer 2200 may receive audio input, and an audio output device 2230, typically a speaker, through which computer 2200 may provide audio output, as is well
known in the art.
Reference is additionally made to Figs. 34A and 34B which are simplified pictorial illustrations of a toy-based fanciful figure interaction system, and Fig. 34C which is a simplified pictorial illustration of a toy-based fanciful figure constructed and
operative in accordance with another preferred embodiment of the present invention.
Shown in Figs. 34A and 34B is computer 2200 preferably configured with audio input
device 2220 and audio output device 2230. In Fig. 34A a toy 2240 is shown in wired communication with computer 2200 along wired connection 2250, while in Fig. 34B toy
2240 is shown to be in wireless communication with computer 2200 via toy transceiver
2260 and computer radio interface 2270. It is appreciated that more than one toy may be
in communication with computer 2200 at any given time. Audio input device 2220
and/or audio output device 2230 may be replaced with or augmented by audio input
device 2222 and/or audio output device 2233 (Fig. 34C) assembled with toy 2240 for input and/or output communication with computer 2200. Shown more clearly in Fig. 34C, toy 2240 is preferably configured with a control unit 2262, a power unit 2264, and one or more articulating appendages 2266. A user 2280 is also shown interacting with toy 2240. It is appreciated that any or all of the functionality of computer 2200 may be assembled with or otherwise incorporated in toy 2240. A preferred configuration of the
toy-based fanciful figure interaction system of Figs. 34A, 34B, and 34C is described in
greater detail hereinabove with reference to Figs. 1 - 32C.
Reference is now made to Fig. 35 which is a simplified block diagram of a
fanciful figure interaction system useful in the systems of Figs. 33, 34A, 34B, and 34C, constructed and operative in accordance with a preferred embodiment of the present
invention. It is appreciated that the system of Fig. 35 may be implemented in computer
hardware, computer software, or in any combination of computer hardware and software. The system of Fig. 35 preferably comprises a control unit 2300, a speech input and recognition unit 2310 capable of receiving a speech input and identifying the words comprising the speech input, an action interface 2320 capable of receiving action
instructions from users, a speech synthesis unit 2330 capable of producing audio speech
output, and an action control unit 2340 capable of controlling an external action. Speech
unit 2310 may receive input from audio input device 2220 (Fig. 33). Action interface
2320 may be implemented via computer 2200 (Figs. 33, 34A, 34B, and 34C) using
known computer menu interfaces or other known interfaces. Speech synthesis unit 2330
may provide output via audio output device 2230 (Fig. 33). Action control unit 2340 may control an action associated with fanciful figure 2210 (Fig. 33) or toy 2240 (Figs.
34 A, 34B, and 34C). The system of Fig. 35 also preferably comprises one or more sets
of phonemes 2350, one or more language sets 2360, each typically comprising one or more words in a known language such as English or fanciful words, a set 2370 of
actions, terms, feelings, or other concepts, one or more modification rule sets 2380, and an association set 2390 for maintaining associations between language set 2360 and
action set 2370. Any of the sets described with reference to Fig. 35 may be maintained in volatile or non-volatile computer storage as is well known. The system of Fig. 35 also
preferably comprises a clock 2400. A logical implementation of the various sets shown in Fig. 35 is described in greater detail hereinbelow with reference to Fig. 41.
Reference is now made to Fig. 36 which is a simplified operational flow chart of a fanciful figure interaction system useful in describing the systems of Figs. 33,
34A, 34B, 34C, and 35, constructed and operative in accordance with a preferred embodiment of the present invention. Typical operation begins (step 3430) with the
fanciful figure 2210 (Fig. 33) or toy 2240 (Figs. 34A, 34B, and 34C) performing an
action and verbalizing associated speech. A preferred method of performing step 3430 is
described in greater detail hereinbelow with reference to Fig. 37. In step 3450 speech
input is accepted. A preferred method of performing step 3450 is described in greater
detail hereinbelow with reference to Fig. 38. Should the speech not be successfully
recorded (step 3460) operation continues with step 3440. Successfully recorded speech
is then identified, typically using known speech-recognition software (step 3470). A
preferred method of performing step 3470 is described in greater detail hereinbelow with
reference to Fig. 39. Should the speech not be successfully identified (step 3480)
operation continues with step 3440. Successfully identified speech is then checked for an
association with a known action which is then performed (step 3490). A preferred
method of performing step 3490 is described in greater detail hereinbelow with reference to Fig. 40. Reference is now made to Fig. 37 which is a simplified operational flow
chart of a preferred implementation of step 3440 of Fig. 36 in greater detail, constructed
and operative in accordance with a preferred embodiment of the present invention. Typical operation begins (step 3500) with selecting a term or action from action set 2390
(Fig. 35) in accordance with selection criteria (step 3510). The selection may be random or in accordance with a level of complexity or history of usage associated with an action. Clock 2400 (Fig. 35) may be used to advance the level of complexity over time. A
language is then selected to be the current language, similarly at random or in accordance
with selection criteria (step 3520). Association set 2390 (Fig. 35) is then searched for an
association between language in language set 2360 (Fig. 35) and the selected term or action (step 3530). The associated action is then performed (step 3540) with or without verbalizing the associated language, and operation continues with step 3450 (Fig. 36) (step 3550).
Reference is now made to Fig. 38 which is a simplified operational flow
chart of a preferred implementation of step 3450 of Fig. 36 in greater detail, constructed and operative in accordance with a preferred embodiment of the present invention.
Typical operation begins (step 3560) with recording audio input typically comprising
speech (step 3570). The audio input is typically received via audio input device 2220
(Figs. 33, 34A, 34B, and 34C). A data file in a volatile or non-volatile storage medium is
typically used for recording the audio input as is well known. The presence or absence
of audio input is detected (step 3580) with operation continuing with step 3460 (Fig. 36) when either a file is constructed given the presence of audio input (step 3590) or no file is created in the absence of audio input (step 3600). Reference is now made to Fig. 39 which is a simplified operational flow chart of a preferred implementation of step 3470 of Fig. 36 in greater detail, constructed
and operative in accordance with a preferred embodiment of the present invention.
Typical operation begins (step 3610) with analyzing the file constructed in step 3590 of Fig. 38 for a first pause between speech elements, yielding a first speech element (step 3620). Speech recognition is then performed on the first speech element (step 3630). If
the first speech element is a language identifier (step 3640) then the current language is
set to the language indicated by the identifier (step 3650) and operation continues with
step 3690. If the first speech element is not a language identifier, speech recognition is
performed on the rest of the file using the language last used as the current language (step 3660). The speech is then identified for known words in the current language (step
3670). If no known words are found, another language is set to the current language (step 3680) and speech recognition is again performed on the rest of the file (step 3690). The speech is then identified for known words in the current language (step 3700). If no
known words are found, an indicator is returned indicating that the speech has not been
identified (step 3710). If the word is identified in a known, learned, generated, or
modified language an indicator is returned indicating that the speech has been identified
(step 3720). Operation continues with step 3480 (Fig. 36).
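Reduced to a toy example, the search order of Fig. 39 (current language first, then another language set) looks like the sketch below; real speech recognition is replaced here by dictionary lookup over already-transcribed words, and all vocabulary is invented.

```python
language_sets = {
    "english": {"hello", "dance"},
    "fanciful": {"blorp", "zwee"},
}

def identify_speech(elements, current_language):
    # Steps 3640 - 3650: the first element may itself be a language identifier.
    if elements and elements[0] in language_sets:
        current_language, elements = elements[0], elements[1:]
    # Steps 3660 - 3700: try the current language, then the other language sets.
    others = [lang for lang in language_sets if lang != current_language]
    for lang in [current_language] + others:
        if all(word in language_sets[lang] for word in elements):
            return lang                # speech identified (step 3720)
    return None                        # speech not identified (step 3710)

print(identify_speech(["blorp", "zwee"], "english"))   # -> 'fanciful'
```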
Reference is now made to Fig. 40 which is a simplified operational flow
chart of a preferred implementation of step 3490 of Fig. 36 in greater detail, constructed
and operative in accordance with a preferred embodiment of the present invention. Typical operation begins (step 3730) with selecting a language which becomes the current language, at random or in accordance with selection criteria (step 3740).
Association set 2390 (Fig. 35) is then searched for an association between language in language set 2360 (Fig. 35) and the term or action (step 3750). The associated action is
then performed (step 3760) and operation continues with step 3450 (Fig. 36) (step
3770).
Reference is now made to Fig. 41 which is a simplified block diagram of a
logical implementation of the various sets described hereinabove with reference to Fig.
35 constructed and operative in accordance with a preferred embodiment of the present invention. A root entity 2780 typically comprises a list of terms comprising preset
terminology and learned terminology. Preset terminology is typically preconfigured with toy 2240 (Figs. 34A, 34B, 34C), and/or preconfigured in ROM, diskette, and/or CD-
ROM, etc. for access by computer 2200 (Figs. 33, 34A, 34B, and 34C). Learned
terminology is typically acquired from a user, referred to herein as a "player." The
terminology is preferably associated with two tables: a vocabulary table 2790 and a table of "emotions" 2800. Vocabulary table 2790 is typically used to provide fanciful figure 2210 (Fig. 33) and/or toy 2240 (Figs. 34A, 34B, and 34C) with the pronunciation of each
term in the list of terms. The pronunciation may be effected via a voice file, a sequence of
phonemes, a text file, etc. as required to produce the necessary sound and according to
its medium (i.e., microphone, rule-based or keyboard input, etc.). Table of emotions
2800 typically comprises toy emotions 2810, the toy alternatively being referred to herein as "alien", and player emotions 2820. Each toy emotion typically comprises:
a need field that uniquely identifies the emotion;
a sequence of expressions that form the sounds, motions, etc. performed by the toy;
satisfaction that defines the response expected from the player; and gratitude that includes another sequence of expressions and/or a term by which the toy confirms to the player that his or her response was correct. Player emotions 2820 preferably has the same structure as toy emotions 2810.
Reference is now made to Figs. 42 and 43 which are simplified block
diagrams of possible implementations of various tables described in Fig. 41, constructed
and operative in accordance with a preferred embodiment of the present invention. A table of terms 2830 typically includes a list of terms and a vocabulary in two languages. Each record contains a term field, the term's pronunciation in the two languages, a usage
counter and a level field. The usage counter is useful to ensure that a term will not be
under-used. The level field provides for a gradual and automatic increase in the number
and complexity of terms available for selection. Typically, the level is automatically increased at a preset pace, such as every week or when the average usage value for a
lower level reaches a certain preset value (e.g. 25). To introduce the player to the enhanced vocabulary, fanciful figure 2210 (Fig. 33) and/or toy 2240 (Figs. 34A, 34B, and 34C) are preferably equipped with a selection of stories, each appropriate to a particular level of vocabulary. Typically, when a level is increased, fanciful figure 2210 (Fig. 33) and/or toy 2240 (Figs. 34A, 34B, and 34C) plays a story using the relevant vocabulary.
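One plausible in-memory form of the term records of table 2830, with the level-advance rule described above, is sketched below; the field names follow the description, and the threshold and vocabulary are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Term:
    term: str
    pronunciations: dict        # language name -> voice file / phoneme sequence
    usage_counter: int = 0
    level: int = 1

def maybe_advance_level(terms, current_level, threshold=25):
    """Advance when the average usage at the current level reaches the threshold."""
    counts = [t.usage_counter for t in terms if t.level == current_level]
    average = sum(counts) / len(counts) if counts else 0.0
    return current_level + 1 if average >= threshold else current_level

terms = [Term("hello", {"english": "hello.wav", "fanciful": "blorp.wav"}, 30),
         Term("dance", {"english": "dance.wav", "fanciful": "zwee.wav"}, 20)]
print(maybe_advance_level(terms, 1))   # average usage is 25 -> advance to level 2
```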
Toy emotion records 2840 and 2850 of Fig. 42 and 2860 and 2870 of Fig. 43 describe needs (or emotions) "cold", "happy", "right-hand" and "left-hand", respectively, associated with fanciful figure 2210 (Fig. 33) and/or toy 2240 (Figs. 34A,
34B, and 34C). Each toy emotion record typically comprises several sub-records (for example, each sub-record appears as a row of fields in the toy emotion records 2840,
2850, 2860 and 2870) with each sub-record containing the following fields: sub-record
type, field identifier and optional parameters. The first sub-record comprises a need field type followed by a need value (such as cold or happy) as a record identifier. The next sub-records are the expressions, comprising a sub-record type expression type and
relevant parameters. Following is a satisfaction sub-record that comprises a sub-record
type field, an expected response type, and relevant parameters such as switch opening or closure, content of speech recorded from the player, etc. The last sub-records form a
sequence of gratitude actions or the following need.
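The sub-record layout just described maps naturally onto a record type such as the following sketch; the field names follow the description above, while the contents are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ToyEmotion:
    need: str                 # record identifier, e.g. "cold" or "happy"
    expressions: list         # (expression type, parameters) sub-records
    satisfaction: tuple       # (expected response type, parameters)
    gratitude: list           # confirmation expressions, or the following need

cold = ToyEmotion(
    need="cold",
    expressions=[("sound", {"file": "shiver.wav"}), ("motion", {"appendage": 1})],
    satisfaction=("switch_closure", {"switch": 2}),
    gratitude=[("sound", {"file": "thanks.wav"})],
)
```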
It is appreciated that the software components of the present invention
may, if desired, be implemented in ROM (read-only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional
techniques.
It is appreciated that various features of the invention which are, for
clarity, described in the contexts of separate embodiments may also be provided in
combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided
separately or in any suitable subcombination.
It will be appreciated by persons skilled in the art that the present
invention is not limited to what has been particularly shown and described hereinabove.
Rather, the scope of the present invention is defined only by the claims that follow:

Claims

CLAIMS

We claim:
1. A toy with developing skills, the toy comprising: a fanciful figure having a capacity to perform an action; and action control circuitry operative to control the fanciful figure to perform
said action at different levels of skill at different times.
2. A toy according to claim 1 wherein said capacity to perform an action
comprises a capacity to talk.
3. A toy according to claim 1 wherein said action control circuitry is operative to control the fanciful figure to perform said action at an increasing level of
skill over time.
4. A toy according to claim 3 wherein said action comprises talking and wherein the fanciful figure is operative to increase its vocabulary over time.
5. A toy according to claim 1 wherein said capacity to perform an action
comprises performance of at least one physical action in response to an oral stimulus.
6. A toy according to claim 1 wherein said capacity to perform an action
comprises a capacity to perform an action other than talking and to emit a verbalization
associated with said action.
7. A system for interacting with a computer-controlled fanciful figure
comprising:
at least one fanciful figure; at least one speech output apparatus; at least one computer operative to control said fanciful figure and provide
a speech output associated with said fanciful figure via said at least one speech output
apparatus, wherein said speech output is in a special language.
8. A system for interacting with a computer-controlled fanciful figure according to claim 7 wherein said special language is at least partly generated by said at
least one computer.
9. A system for interacting with a computer-controlled fanciful figure
according to claim 8 wherein said special language is at least partly generated by
modifying at least one known language according to at least one language modification
rule.
10. A system for interacting with a computer-controlled fanciful figure
according to claim 9 and wherein said at least one computer is operative to receive said
at least one language modification rule from a user.
11. A system for interacting with a computer-controlled fanciful figure
according to claim 9 and wherein said at least one computer is operative to provide said at least one language modification rule to a user.
12. A system for interacting with a computer-controlled fanciful figure according to claim 8 wherein said special language is at least partly generated from a
predefined set of phonemes.
13. A system for interacting with a computer-controlled fanciful figure
according to claim 7 wherein said at least one computer is operative to receive at least a
portion of said special language from a user.
14. A system for interacting with a computer-controlled fanciful figure according to claim 7 wherein said at least one fanciful figure is action induceable for
producing an action.
15. A system for interacting with a computer-controlled fanciful figure
according to claim 8 wherein said action comprises a movement.
16. A system for interacting with a computer-controlled fanciful figure
according to claim 8 wherein said action comprises a sound.
17. A system for interacting with a computer-controlled fanciful figure according to claim 8 wherein said action comprises a light emission.
18. A system for interacting with a computer-controlled fanciful figure
according to claim 8 wherein said speech output is identifiable with said action.
19. A system for interacting with a computer-controlled fanciful figure
according to claim 18 wherein said at least one computer maintains a memory comprising at least one said speech output identifiable with said action.
20. A system for interacting with a computer-controlled fanciful figure according to claim 8 wherein said at least one computer is operative to induce said
fanciful figure to produce said action.
21. A system for interacting with a computer-controlled fanciful figure according to claim 8 wherein a user induces said fanciful figure to produce said action
and wherein said at least one computer is operative to detect said action.
22. A system for interacting with a computer-controlled fanciful figure
according to claim 7 and further comprising at least one speech input apparatus and
wherein said at least one computer is operative to receive a speech input via said at least
one speech input apparatus.
23. A system for interacting with a computer-controlled fanciful figure
according to claim 18 wherein said speech input is identifiable with said action.
24. A system for interacting with a computer-controlled fanciful figure according to claim 18 wherein said at least one computer maintains a memory comprising at least one said speech input identifiable with said action.
25. A system for interacting with a computer-controlled fanciful figure
according to claim 7 and wherein said at least one computer is additionally operative to translate between said special language and at least one other language wherein said other language comprises a language of common discourse.
26. A system for interacting with a computer-controlled fanciful figure
according to claim 7 wherein said at least one fanciful figure is displayable on a computer
display.
27. A system for interacting with a computer-controlled fanciful figure
according to claim 7 wherein said speech output apparatus is assembled with said at least
one computer.
28. A system for interacting with a computer-controlled fanciful figure
according to claim 7 wherein said fanciful figure is a toy in communication with said at least one computer.
29. A system for interacting with a computer-controlled fanciful figure
according to claim 28 wherein said at least one computer is assembled with said toy.
30. A system for interacting with a computer-controlled fanciful figure according to claim 28 wherein said toy comprises at least one appendage that is actuable.
31. A system for interacting with a computer-controlled fanciful figure
according to claim 28 wherein said toy comprises at least one appendage that is
articulatable.
32. A system for interacting with a computer-controlled fanciful figure according to claim 28 wherein said speech output apparatus is assembled with said toy.
33. A system for interacting with a computer-controlled fanciful figure
according to claim 7 wherein said language is a previously unknown language.
34. A system for interacting with a computer-controlled fanciful figure
according to claim 22 wherein said at least one fanciful figure comprises a toy in
communication with said at least one computer and said speech input apparatus is
assembled with said toy.
35. A system for interacting with a computer-controlled fanciful figure according to claim 7 wherein said at least one fanciful figure comprises a plurality of
fanciful figures.
36. A system for interacting with a computer-controlled fanciful figure
according to claim 22 wherein said speech input apparatus is assembled with said at least
one computer.
37. A system for interacting with a computer-controlled fanciful figure according to claim 7 wherein said special language is preassembled with said at least one
computer.
38. A method of playing with a toy, the method comprising: selecting an action having an associated skill level;
controlling a fanciful figure to perform said action; and increasing said skill level over time.
39. A method according to claim 38 wherein said selecting step comprises
selecting a talking action.
40. A method according to claim 38 wherein said increasing step comprises
increasing a vocabulary over time.
41. A method of playing with a toy, the method comprising:
providing at least one fanciful figure;
controlling speech output apparatus to provide a speech output associated with said fanciful figure wherein said speech output is in a special language.
42. A method of playing with a toy according to claim 41 wherein said controlling step comprises generating at least part of said special language.
43. A method of playing with a toy according to claim 42 wherein said generating step comprises generating said at least part of said special language by modifying at least one known language according to at least one language modification rule.
44. A method of playing with a toy according to claim 42 wherein said generating step comprises generating said at least part of said special language from a predefined set of phonemes.
45. A method of playing with a toy according to claim 41 and further comprising controlling said at least one fanciful figure to perform an action associated with said speech output.
46. A method of playing with a toy, the method comprising: providing at least one fanciful figure; controlling said at least one fanciful figure to produce an action; and accepting at least one speech input for association with said action.
47. A method of playing with a toy according to claim 46 wherein said
controlling-action step comprises articulating at least one appendage of said fanciful
figure.
48. A method of playing with a toy according to claim 46 and further
comprising controlling speech output apparatus to provide a speech output associated
with said fanciful figure.
49. A method of playing with a toy according to claim 48 wherein said
controlling speech output step further comprises providing said speech output associated
with said action.
50. A method of playing with a toy according to claim 48 wherein said controlling speech output step further comprises providing said speech output in a
previously unknown language.
PCT/IL1998/000406 1997-08-27 1998-08-25 Interactive talking toy WO1999010065A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU88834/98A AU8883498A (en) 1997-08-27 1998-08-25 Interactive talking toy
EP98940531A EP0935492A4 (en) 1997-08-27 1998-08-25 Interactive talking toy

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IL12164297A IL121642A0 (en) 1997-08-27 1997-08-27 Interactive talking toy
IL121642 1997-08-27
US09/062,499 US6290566B1 (en) 1997-08-27 1998-04-17 Interactive talking toy
US09/062,499 1998-04-17

Publications (2)

Publication Number Publication Date
WO1999010065A2 true WO1999010065A2 (en) 1999-03-04
WO1999010065A3 WO1999010065A3 (en) 1999-05-20

Family

ID=26323495

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL1998/000406 WO1999010065A2 (en) 1997-08-27 1998-08-25 Interactive talking toy

Country Status (3)

Country Link
EP (1) EP0935492A4 (en)
AU (1) AU8883498A (en)
WO (1) WO1999010065A2 (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4679789A (en) * 1983-12-26 1987-07-14 Kabushiki Kaisha Universal Video game apparatus with automatic skill level adjustment
WO1987006487A1 (en) * 1986-05-02 1987-11-05 Vladimir Sirota Toy
US4857030A (en) * 1987-02-06 1989-08-15 Coleco Industries, Inc. Conversing dolls
US4923428A (en) * 1988-05-05 1990-05-08 Cal R & D, Inc. Interactive talking toy
US5021878A (en) * 1989-09-20 1991-06-04 Semborg-Recrob, Corp. Animated character system with real-time control
US5479564A (en) * 1991-08-09 1995-12-26 U.S. Philips Corporation Method and apparatus for manipulating pitch and/or duration of a signal
US5281143A (en) * 1992-05-08 1994-01-25 Toy Biz, Inc. Learning doll
US5752880A (en) * 1995-11-20 1998-05-19 Creator Ltd. Interactive doll

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0935492A2 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1064976A1 (en) * 1999-06-30 2001-01-03 Onilco Innovacion S.A. Crawling doll fitted with search and direction change device
US6773344B1 (en) 2000-03-16 2004-08-10 Creator Ltd. Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems
FR2819906A1 (en) * 2001-01-25 2002-07-26 Berchet Groupe Soc INTERACTION DEVICE WITH A MICROCOMPUTER
FR2819907A1 (en) * 2001-01-25 2002-07-26 Berchet Groupe Soc INTERACTION DEVICE WITH A MICROCOMPUTER
WO2002058808A1 (en) * 2001-01-25 2002-08-01 Groupe Berchet Device for interaction with a micro-computer
WO2002058807A1 (en) * 2001-01-25 2002-08-01 Groupe Berchet Device for interaction with a micro-computer

Also Published As

Publication number Publication date
AU8883498A (en) 1999-03-16
WO1999010065A3 (en) 1999-05-20
EP0935492A4 (en) 1999-12-01
EP0935492A2 (en) 1999-08-18

Similar Documents

Publication Publication Date Title
US6290566B1 (en) Interactive talking toy
US6206745B1 (en) Programmable assembly toy
US6773322B2 (en) Programmable assembly toy
US6022273A (en) Interactive doll
US20020005787A1 (en) Apparatus and methods for controlling household appliances
US20020107591A1 (en) "controllable toy system operative in conjunction with a household audio entertainment player"
US6352478B1 (en) Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites
WO1998053456A1 (en) Apparatus and methods for controlling household appliances
JP5349860B2 (en) PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE
WO1999008762A1 (en) Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites
US20050154594A1 (en) Method and apparatus of simulating and stimulating human speech and teaching humans how to talk
US20010021669A1 (en) I*doll
JPH11511859A (en) Educational and entertainment device with dynamic configuration and operation
KR20100044779A (en) An audio animation system
JP2002169590A (en) System and method for simulated conversation and information storage medium
US20090141905A1 (en) Navigable audio-based virtual environment
EP1080352A1 (en) Intelligent doll
WO1999010065A2 (en) Interactive talking toy
CA2611635A1 (en) Remote game device for dvd gaming systems
WO1998053567A1 (en) Controllable toy operative with audio player
US20230306666A1 (en) Sound Based Modification Of A Virtual Environment
Roden et al. Toward mobile entertainment: A paradigm for narrative-based audio only games
WO2005038776A1 (en) Voice controlled toy
RU2209651C2 (en) Playing system
Spöhrer A History of Disability and Voice-Enabled Gaming from the 1970s to Intelligent Personal Assistants

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A3

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1998940531

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1998940531

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase in:

Ref country code: JP

Ref document number: 1999514137

Format of ref document f/p: F

NENP Non-entry into the national phase in:

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1998940531

Country of ref document: EP