EP0935492A2 - Interactive talking toy - Google Patents
Interactive talking toy
- Publication number
- EP0935492A2 (application EP98940531A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- computer
- fanciful
- toy
- action
- controlled
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L13/00—Speech synthesis; Text to speech systems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1025—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
Definitions
- the present invention relates to toys in general, and particularly to computer-controlled toys with a capacity for speech.
- toys which are remotely controlled by wireless communication and which are not used in conjunction with a computer system.
- such toys include vehicles whose motion is controlled by a human user via a
- Haugerud describes computer control of a toy via a wired connection, wherein the user of the computer typically writes a simple program
- US Patent 4,840,602 to Rose describes a talking doll responsive to an external signal, in which the doll has a vocabulary stored in digital data in a memory which may be accessed to cause a speech synthesizer in the doll to simulate speech.
- control signals to the animated character to provide speech, hearing, vision and
- US Patent 5,388,493 describes a system for a housing for a vertical dual keyboard MIDI wireless controller for accordionists.
- the system may be used with either a conventional MIDI cable connection or by a wireless MIDI transmission system.
- German Patent DE 3009-040 to Neuhierl describes a device for adding the capability to transmit sound from a remote control to a controlled model vehicle.
- the sound is generated by means of a microphone or a tape recorder and transmitted to the controlled model vehicle by means of radio communications.
- the model vehicle is equipped with a speaker that emits the received sounds.
- the present invention seeks to provide an improved computer-controlled toy system with a capacity for modifying a known language and/or speaking in a previously unknown or whimsical language.
- the computer or computer-controlled toy may be any suitable computer or computer-controlled toy.
- a user may interact with the computer or
- the computer-controlled toy "demodifies" the speech to arrive at an associated English word.
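The patent leaves the concrete modification rule open. As a purely hypothetical illustration, the sketch below assumes a rule that rotates every vowel to the next vowel; because the rule is invertible, the toy can "demodify" a special-language word to recover the associated English word.

```python
# Hypothetical reversible language modification rule (vowel rotation).
# The actual rule used by the patent is not specified; this is only a sketch.

_FORWARD = str.maketrans("aeiou", "eioua")   # modification rule
_REVERSE = str.maketrans("eioua", "aeiou")   # its inverse

def modify(word: str) -> str:
    """Turn an English word into its special-language form."""
    return word.translate(_FORWARD)

def demodify(word: str) -> str:
    """Recover the associated English word from the special language."""
    return word.translate(_REVERSE)
```

Any invertible rule would serve equally well; vowel rotation is chosen here only because inversion is trivially a second translation table.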
- the computer or computer-controlled toy may perform an action based on
- a computer or computer-controlled toy speaks a language with an increasing
- the present invention also seeks to provide an improved computer-
- a computer or computer-controlled toy is configured with a set of actions or
- the computer or computer-controlled toy is further capable of introducing
- the computer or computer-controlled toy is additionally or alternatively capable of receiving a word chosen by the user for association with the action.
- the computer or computer-controlled toy may maintain
- Words of any language known to the computer or computer-controlled toy may have an associated level of complexity for controlling what words are available to the computer
- the user may then command the computer or computer-controlled toy using the private language.
- the computer or computer-controlled toy makes up a language for each of a
- predefined and/or user defined base language units comprising monosyllabic or
- Base language units may be predefined together with a complexity designation (e.g., those with more syllables, more
- the user provides the computer or computer-controlled toy with made-up
- the computer or computer-controlled toy interprets user speech by searching made-up, modified, and/or known languages, possibly in a particular order.
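The search order described above can be sketched as follows; the vocabulary entries, the demodification rule, and all names here are illustrative assumptions, not the patent's own data.

```python
# Hypothetical lookup order: made-up language first, then a demodified form
# checked against the known language, then the known vocabulary itself.

KNOWN = {"jump": "JUMP", "sing": "SING"}     # known-language word -> action
MADE_UP = {"zog": "DANCE"}                   # made-up words taught by the user

def demodify(word: str) -> str:
    # inverse of an assumed vowel-rotation modification rule
    return word.translate(str.maketrans("eioua", "aeiou"))

def interpret(word: str):
    """Search made-up, modified, then known languages, in that order."""
    if word in MADE_UP:
        return MADE_UP[word]
    plain = demodify(word)
    if plain in KNOWN:
        return KNOWN[plain]
    return KNOWN.get(word)
```

A user cue, as mentioned next, could simply restrict this search to a single one of the three dictionaries.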
- the user may give a cue to indicate that he is using and wishes to be understood using a particular
- the present invention a toy with developing skills, the toy including a fanciful figure having a
- action control circuitry operative to control the fanciful figure to perform the action at different levels of skill at different times.
- the capacity to perform an action includes a capacity to talk.
- the action control circuitry is operative to control the fanciful figure to perform
- the action at an increasing level of skill over time. Additionally in accordance with a preferred embodiment of the present invention the action includes talking and the fanciful figure is operative to increase its
- the capacity to perform an action includes performing at least one physical action in response to an oral stimulus.
- a system for interacting with a computer-controlled fanciful figure including at least one fanciful figure, at least one speech output apparatus, at least
- one computer operative to control the fanciful figure and provide a speech output associated with the fanciful figure via the at least one speech output apparatus, the
- speech output is in a special language.
- the special language is at least partly generated by the at least one computer.
- the special language is at least partly generated by modifying at least one
- the at least one computer is operative to receive the at least one language
- the at least one computer is operative to provide the at least one language modification rule to a user.
- the at least one fanciful figure is action induceable for producing an action.
- the action includes a movement.
- the action includes a sound.
- the action includes a light emission.
- the speech output is identifiable with the action.
- the at least one computer is operative to induce the fanciful figure to produce the action.
- the user induces the fanciful figure to produce the action and the at least one
- computer is operative to detect the action.
- the computer is operative to receive a speech input via the at least one speech input
- the speech input is identifiable with the action.
- the at least one computer is additionally operative to translate between the
- the at least one fanciful figure is displayable on a computer display.
- the speech output apparatus is assembled with the at least one computer.
- the fanciful figure is a toy in communication with the at least one computer.
- the at least one computer is assembled with the toy.
- the toy includes at least one appendage that is actuable.
- the toy includes at least one appendage that is articulatable.
- the speech output apparatus is assembled with the toy.
- the language is a previously unknown language.
- one computer and the speech input apparatus is assembled with the toy.
- the at least one fanciful figure includes a plurality of fanciful figures.
- the speech input apparatus is assembled with the at least one computer.
- the special language is preassembled with the at least one computer.
- a method of playing with a toy including selecting an action having an associated skill level, controlling a fanciful figure to perform the
- the selecting step includes selecting a talking action.
- the increasing step includes increasing a vocabulary over time.
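A minimal sketch of the increasing vocabulary, assuming each word carries a complexity designation (for example, syllable count) and that the toy's skill level rises over time, exposing more of the vocabulary. The words and levels are illustrative only.

```python
# Hypothetical vocabulary with per-word complexity designations.
VOCAB = {"hi": 1, "ball": 1, "banana": 2, "helicopter": 3}

def available_words(skill_level: int) -> list[str]:
    """Words the toy may speak at its current skill level."""
    return sorted(w for w, level in VOCAB.items() if level <= skill_level)
```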
- a method of playing with a toy including providing
- at least one fanciful figure, and controlling speech output apparatus to provide a speech output associated with the fanciful figure, the speech output being in a special language.
- controlling step includes generating at least part of the special language.
- the generating step includes generating the at least part of the special language by modifying at least one known language according to at least one language
- the generating step includes generating the at least part of the special language
- the method includes controlling the at least one fanciful figure, to perform an
- a method of playing with a toy including providing at least one fanciful figure, controlling the at least one fanciful figure to produce an action, and accepting at least one speech input for association with the action.
- the method includes controlling speech output apparatus to provide a speech output associated with the fanciful figure.
- a wireless computer controlled toy system including a computer system
- toy including a first wireless receiver, the toy receiving the first transmission via the first wireless receiver and operative to carry out at least one action based on the first
- the computer system may include a computer game.
- the toy may include
- the at least one action may include a plurality of actions.
- the first transmission may include a digital signal.
- the computer system includes a computer having a MIDI port and wherein the
- the computer may be operative to transmit the digital signal by way of the MIDI port.
- the sound includes music, a pre-recorded sound and/or speech.
- the speech may include recorded speech and synthesized speech.
- the at least one toy has a plurality of states including at least a sleep state and
- the first transmission includes a state transition command
- the at least one action includes transitioning between the sleep state and the awake state.
- a sleep state may typically include a state in which the toy consumes a
- an awake state is typically a state of normal operation.
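The sleep and awake states above can be sketched as a two-state machine driven by state-transition commands in the first transmission; the command names are assumptions for illustration.

```python
# Hypothetical sleep/awake state machine for the toy.

class Toy:
    def __init__(self):
        self.state = "sleep"          # low-power state

    def on_command(self, command: str) -> None:
        # a state-transition command moves the toy between the two states
        if command == "WAKE":
            self.state = "awake"      # normal operation
        elif command == "SLEEP":
            self.state = "sleep"
```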
- the computer system includes a plurality of computers.
- the first transmission includes computer identification data and the second
- transmission includes computer identification data.
- the at least one toy is operative to transmit a second transmission via a second wireless transmitter and the computer system is operative to receive the second transmission via a second wireless receiver.
- the at least one toy includes at least a first toy and a second toy, and wherein the first toy is operative to transmit a toy-to-toy transmission to the second toy via the second
- the second toy is operative to carry out at least one
- operation of the computer system is controlled, at least in part, by the second
- the computer system includes a computer game, and wherein operation of the
- the second transmission may include a digital signal and/or an analog
- the computer system has a plurality of states including at least a sleep state and
- the second transmission include a state transition command
- the at least one toy includes sound input apparatus, and the second transmission
- the computer system is also operative to perform at least one of the following actions: manipulate the sound signal; and play the sound signal.
- the sound includes speech
- the computer system is operative to perform a speech recognition operation on the speech.
- the second transmission includes toy identification data, and the computer
- system is operative to identify the at least one toy based, at least in part, on the toy
- the first transmission includes toy identification data.
- the computer system may adapt a mode of operation thereof based, at least in part, on the toy identification data.
- the at least one action may include movement of the toy, movement of a part of
- the sound may be transmitted using a MIDI
- a game system including a computer system operative to control
- a computer game and having a display operative to display at least one display object, and at least one toy in wireless communication with the computer system, the computer
- game including a plurality of game objects, and the plurality of game objects includes the at least one display object and the at least one toy.
- the at least one toy is operative to transmit toy identification data to the
- the computer game based, at least in part, on the toy identification data.
- the computer system may include a plurality of computers.
- the first transmission includes computer identification data and the second transmission includes computer identification data.
- MIDI (Musical Instrument Digital Interface)
- apparatus including MIDI apparatus operative to receive and transmit MIDI data
- the first wireless apparatus is
- MIDI data including data received from the first MIDI device to the second wireless apparatus, and to transmit MIDI data including data received from
- the second wireless apparatus to the first MIDI device, and the second wireless
- apparatus is operative to transmit MIDI data including data received from the second MIDI device to the first wireless apparatus, and to transmit MIDI data including data
- the second wireless apparatus includes a plurality of wirelesses each
- each of the second plurality of wirelesses is operative to transmit MIDI data including data received from the associated MIDI device to the first wireless apparatus, and to transmit MIDI data including data received from the first wireless apparatus to the associated MIDI device.
- the first MIDI device may include a computer, while the second MIDI
- the device may include a toy.
- the first wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the first wireless and a first analog device,
- the second wireless apparatus also includes analog interface apparatus operative to receive and transmit analog signals between the second wireless and a second analog device, and the first wireless apparatus is also operative to transmit analog signals
- first wireless apparatus and to transmit analog signals including data received from the
- a method for generating control instructions for a computer controlled toy system includes selecting a toy, selecting at least one command from among a plurality of commands associated with the toy, and generating
- control instructions for the toy including the at least one command.
- the step of selecting at least one command includes choosing a command
- command includes utilizing a graphical user interface.
- the at least one control parameter includes an execution condition controlling
- the execution condition may include a time at which to perform the
- condition may also include a status of the toy.
- the at least one control parameter includes a command modifier modifying
- the at least one control parameter includes a condition dependent on a future
- the at least one command includes a command to cancel a previous command.
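The control parameters above (an execution-time condition, and a command that cancels a previous command) can be sketched as a small time-ordered command queue; the data layout and method names are assumptions.

```python
# Hypothetical queue of generated control instructions for the toy.
import heapq

class CommandQueue:
    def __init__(self):
        self._queue = []                         # (time, command_id, command)

    def schedule(self, at: float, command_id: int, command: str) -> None:
        """Queue a command with an execution-time condition."""
        heapq.heappush(self._queue, (at, command_id, command))

    def cancel(self, command_id: int) -> None:
        # a command to cancel a previous command removes it from the queue
        self._queue = [e for e in self._queue if e[1] != command_id]
        heapq.heapify(self._queue)

    def due(self, now: float) -> list[str]:
        """Commands whose execution condition (time) has been met."""
        out = []
        while self._queue and self._queue[0][0] <= now:
            out.append(heapq.heappop(self._queue)[2])
        return out
```

A status-of-the-toy condition would simply be a second predicate checked in `due` alongside the time comparison.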
- the present invention a signal transmission apparatus for use in conjunction with a
- the apparatus including wireless transmission apparatus; and signal processing
- apparatus including at least one of the following analog/digital sound conversion apparatus operative to convert analog sound signals to digital sound signals, to convert digital sound signals to analog sound signals, and to transmit the signals between the
- a peripheral control interface operative to transmit control signals between the computer and a
- peripheral device using the wireless transmission apparatus and a MIDI interface
- a computer system including a computer, and a sound card operatively attached to the computer and having a MIDI connector and at least one
- the computer is also operative to receive digital signals by means of the MIDI connector and to receive analog signals by means of the at least one analog connector.
- Figs. 1 - 32C illustrate a toy system for use in conjunction with a
- Fig. 1A is a partly pictorial, partly block diagram illustration of a
- Fig. 1B is a partly pictorial, partly block diagram illustration of a preferred
- FIG. 1C is a partly pictorial, partly block diagram illustration of a
- FIGS. 2A - 2C are simplified pictorial illustrations of a portion of the
- Fig. 3 is a simplified block diagram of a preferred implementation of the
- Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3;
- Figs. 5A - 5D taken together comprise a schematic diagram of the apparatus of Fig. 4;
- Fig. 5E is a schematic diagram of an alternative implementation of the
- Fig. 6 is a simplified block diagram of a preferred implementation of the
- Figs. 7A - 7F taken together with either Fig. 5D or Fig. 5E, comprise a
- Fig. 8A is a simplified flowchart illustration of a preferred method for
- Figs. 8B - 8T, taken together, comprise a simplified flowchart illustration
- Fig. 9A is a simplified flowchart illustration of a preferred method for
- Figs. 9B - 9N taken together with Figs. 8D - 8M, comprise a simplified
- Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted
- Fig. 11 is a simplified flowchart illustration of a preferred method for
- Figs. 12A - 12C are pictorial illustrations of a preferred implementation of
- Fig. 13 is a block diagram of a first sub-unit of a multi-port multi-channel
- Fig. 14 is a block diagram of a second sub-unit of a multi-port multi-channel
- Fig. 16 is a simplified flowchart illustration of a preferred method by
- FIG. 17 is a simplified flowchart illustration of a preferred method for implementing the "select control channel pair" step of Fig. 16;
- Fig. 18A is a simplified flowchart illustration of a preferred method for
- Fig. 18B is a simplified flowchart illustration of a preferred method for
- Fig. 19 is a simplified flowchart illustration of a preferred method of
- Fig. 20 is a simplified illustration of a remote game server in association
- a wireless computer controlled toy system which may include a network computer
- Fig. 21 is a simplified flowchart illustration of the operation of the computer or of the network computer of Fig. 20, when operating in conjunction with the
- Fig. 22 is a simplified flowchart illustration of the operation of the remote
- Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless
- Figs. 24A - 24E taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
- a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer
- FIGS. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to
- Figs. 27A - 27J are preferred flowchart illustrations of a preferred radio
- Figs. 29A - 29I, taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14;
- Fig. 30 is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and operative in accordance with a further
- Fig. 31 is a simplified block diagram illustrating the
- Figs. 32A, 32B and 32C taken together form a simplified block diagram
- Figs. 33 - 43 illustrate embodiments of the toy system of Figs. 1 - 32C
- FIG. 33 is a simplified pictorial illustration of a display-based fanciful figure interaction system constructed and operative in accordance with a preferred embodiment of the present invention
- Figs. 34A and 34B taken together, are simplified pictorial illustrations of a toy-based fanciful figure interaction system constructed and operative in accordance
- Fig. 34C is a simplified pictorial illustration of the toy-based fanciful figure of Figs. 34A and 34B;
- Fig. 35 is a simplified block diagram of a fanciful figure interaction system
- Fig. 36 is a simplified operational flow chart of a fanciful figure
- Fig. 37 is a simplified operational flow chart of a preferred implementation of step 3440 of Fig. 36;
- Fig. 38 is a simplified operational flow chart of a preferred embodiment
- Fig. 39 is a simplified operational flow chart of a preferred embodiment
- Fig. 40 is a simplified operational flow chart of a preferred implementation of step 3490 of Fig. 36;
- Fig. 41 is a simplified block diagram of a preferred logical implementation
- Figs. 42 and 43 taken together, are simplified block diagrams of possible
- Fig. 1A is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and
- the system of Fig. 1A comprises a computer 100, which may be any suitable computer such
- the computer 100 is equipped
- the computer 100 is preferably equipped with a sound card such as,
- the computer 100 is equipped with a computer radio interface 110
- the computer 100 and, in a preferred embodiment of the present invention, also to receive signals transmitted elsewhere via wireless transmission and to deliver the signals
- computer radio interface 110 are transmitted via both analog signals and digital signals,
- the transmitted signal may be an analog signal or a digital signal.
- the received signal may also be an analog signal or a digital signal.
- Each signal typically comprises a message.
- a preferred implementation of the computer radio interface 110 is
- the system of Fig. 1A also comprises one or more toys 120.
- Fig. 1A comprises a plurality of toys, namely three toys 122, 124, and 126 but it is
- Fig. 1B is a partly pictorial, partly block diagram illustration of the toy 122 of Fig. 1A.
- Each toy 120 comprises a power source 125, such as a battery or a
- Each toy 120 also comprises a toy control device 130,
- the received signal may be, as explained above, an analog signal or a digital signal.
- Each toy 120 preferably comprises a plurality of input devices 140 and
- the input devices 140 may comprise, for example
- a microphone 141; a microswitch sensor 142; a touch sensor (not shown in Fig. 1B); a light sensor (not shown in Fig. 1B); a movement sensor 143, which may be, for example, a tilt sensor or an acceleration sensor.
- the output devices 150 may comprise, for example, one or more of the following: a speaker 151; a light 152; a solenoid 153 which may be
- a motor such as a stepping motor, operative to move a portion of the toy;
- DC motors available from Alkatel (dunkermotoren), Postfach 1240, D-7823 Bonndorf/Schwarzwald, Germany;
- stepping motors and miniature motors available from Haydon Switch and Instruments, Inc. (HSI), 1500 Meriden Road, Waterbury, CT, USA; and DC solenoids available from Communications Instruments, Inc., P.O. Box 520, Fairview, North Carolina 28730, USA.
- Examples of actions which the toy may perform include the following:
- a recorded sound a synthesized sound
- music including recorded music or synthesized music
- speech including recorded speech or synthesized speech.
- the received signal may comprise a condition governing the action as, for
- the duration of the action or the number of repetitions of the action.
- the portion of the received signal comprising a message comprising a sound typically comprises an analog signal.
- alternatively, the portion of the received signal comprising a sound, including music, may comprise a digital signal, typically a signal
- the action the toy may perform also includes reacting to signals transmitted by another toy, such as, for example, playing sound that the other toy is monitoring and transmitting.
- in a preferred embodiment of the present invention, the toy control device 130 is controlled by the computer radio interface 110.
- the computer radio interface 110 is preferably also operative to poll the toy control device 130, that is, transmit a
- the signal transmitted by the toy control device 130 may comprise one or more of the following: sound, typically sound captured by a microphone input device
- sensor input devices 140 as, for example, light sensors or microswitches; an indication of low power in the power source 125; or information identifying the toy.
- a sound signal transmitted by the device 130 may
- the computer system is operative to perform a speech recognition
- the signal from the radio control interface 110 may also comprise, for example, one or more of the following: a request to ignore input from one or more input
- control device 130 to transmit a signal comprising the stored data received from the one
- radio interface 110 and the toy control device 130 include information identifying the
- Fig. 1C is a partly pictorial, partly block diagram illustration of a computer control system including a toy, constructed and
- the system of Fig. 1C comprises two computers 100. It is appreciated that, in
- toy control device 130 typically include information identifying the computer.
- the computer 100 runs software comprising a computer game, typically a
- the software may comprise
- animated object includes any object which may be
- An animated object may be any object depicted on the screen such as, for example: a doll; an action figure; a toy, such as, for example, an activity toy, a vehicle, or a ride-on vehicle; a drawing board or sketch board;
- a household object such as, for example, a clock, a lamp, a chamber pot, or an item of
- Fig. 2A depicts a portion of the system of Fig. 1A in use.
- the apparatus of Fig. 2A comprises the computer screen 105 of Fig. 1A.
- animated objects 160 and 165 are depicted on the computer screen.
- Fig. 2B depicts the situation after the toy 122 has been brought into range
- the toy 122 corresponds to the animated object 160.
- for example, in Fig. 2B the toy 122 and the animated object 160, shown in Fig. 2A, are both a teddy bear.
- apparatus of Fig. 2B comprises the computer screen 105, on which is depicted the
- the apparatus of Fig. 2B also comprises the toy 122.
- Fig. 2C depicts the situation after the toy 126 has also been brought into
- the toy 126 corresponds to the animated object 165.
- the toy 126 and the animated object 165, shown in Figs. 2A and 2B, are both a clock.
- the apparatus of Fig. 2C comprises the computer screen 105, on which no
- the apparatus of Fig. 2C also comprises the toy 126.
- the computer 100 is the means for calculating the total power required to produce the data.
- in Fig. 2A the user interacts with the animated objects 160 and 165 on the computer screen.
- the user may interact with the toys 122 and 126 by moving the toys or parts of the toys; by
- FIG. 3 is a simplified block diagram of a
- the apparatus of Fig. 3 comprises the computer radio interface 110.
- the apparatus of Fig. 3 also comprises a sound card 190, as described above with reference to Fig. 1A.
- connections between the computer radio interface 110 and the sound card 190 are
- the computer radio interface 110 comprises a DC unit 200 which is fed
- a MIDI interface 210 which connects to the sound card MIDI
- an audio interface 220 which connects to an audio interface 192 of the sound card 190; and a secondary audio interface 230 which preferably connects to a
- the apparatus of Fig. 3 also comprises an antenna 240, which is operative
- Fig. 4 is a more detailed block diagram of the computer radio interface 110 of Fig. 3.
- the apparatus of Fig. 4 comprises the DC unit 200, the MIDI interface
- the apparatus of Fig. 4 also comprises a multiplexer 240, a micro controller 250, a radio transceiver 260,
- connection unit 270 connecting the radio transceiver 260 to the micro controller 250
- Figs. 5A - 5D, taken together, comprise a schematic diagram of the apparatus of Fig. 4.
- Transistors 2N2222 and MPSA14, Motorola, Phoenix, AZ, USA.
- U1 of Fig. 5D may be replaced by:
- U2 of Fig. 5D may be replaced by:
- Fig. 5E is a schematic
- in Fig. 5E, U1 is a BIM-418-F low power UHF data transceiver module, Ginsburg Electronic GmbH, Am Moosfeld 85, D-81829 München, Germany.
- U1 may be replaced by:
- circuit boards for alternate embodiments of the apparatus.
- the apparatus of Fig. 5E has similar functionality to the apparatus of Fig.
- MIDI data is transmitted and received.
- Figs. 5A - 5E are self-explanatory with regard to the above parts lists.
- FIG. 6 is a simplified block diagram of a
- the apparatus of Fig. 6 also comprises a microcontroller 250 similar to the microcontroller
- the apparatus of Fig. 6 also comprises a digital input/output interface (digital I/O interface) 290, which is operative to provide an interface between the microcontroller 250 and a plurality of input and output devices which may be connected
- the apparatus of Fig. 6 also comprises an analog input/output interface
- the apparatus of Fig. 6 also comprises a multiplexer 305 which is
- the apparatus of Fig. 6 also comprises input devices 140 and output
- the input devices 140 comprise, by way of example, a tilt switch
- the output devices 150 comprise, by way of example, a DC
- the apparatus of Fig. 6 also comprises a DC control 310, a preferred implementation of which is described in more detail below with reference to Figs. 7A -
- the apparatus of Fig. 6 also comprises a comparator 280, similar to the
- the apparatus of Fig. 6 also comprises a power source 125, shown in Fig. 6 by way of example as batteries, operative to provide electrical power to the apparatus
- Figs. 7A - 7F, taken together with either Fig. 5D or Fig. 5E, comprise a schematic diagram of the toy control device of Fig. 6. If the schematic of Fig. 5E is employed to implement the computer radio interface of Fig. 4,
- Figs. 7A - 7F are self-explanatory with reference to the above parts list.
- the signals may be analog signals or digital signals. In the case of digital signals, the digital signals preferably
- device 130 comprises an indication of the intended recipient of the message.
- messages also comprise
- each message sent by the computer radio interface 110 to the toy control device 130 comprises an indication of the sender of the message; and each message sent by the toy control device 130 to the computer radio interface 110 comprises an indication of the intended recipient of the message.
- a preferred set of predefined messages is as follows:
- Set Toy control device output pin to a digital level D.
- example message: 01000005000203050000 (set I/O pin 3 to "1" for 5 seconds)
- the audio is sent to the toy control device via the computer sound card and the computer radio interface.
- message fields: Computer address (00-03); cmd 1,2: Received CRI command MSB, ack (00-FF); cmd 1,4: Received CRI command LSB, ack (00-FF)
- Fig. 8A is a simplified flowchart illustration of a preferred method for receiving radio signals, executing commands comprised therein, and sending radio signals, within the toy control device 130 of Fig.
- each message as described above comprises a command, which may
- Fig. 8A preferably comprises the following steps:
- a synchronization signal or preamble is detected (step 400).
- a header is
- a command contained in the signal is received (step 405).
- the command contained in the signal is executed (step 410). Executing the command may be as described above with reference to Fig. 1 A.
- a signal comprising a command intended for the computer radio interface 110 is sent (step 420).
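The Fig. 8A loop (detect preamble, receive command, execute it, optionally reply) can be sketched as follows. The radio and executor interfaces are assumptions, not the patent's actual hardware API; a fake radio stands in for the transceiver so the loop terminates.

```python
class FakeRadio:
    """Stand-in for the transceiver 260; a real device would block on RF."""
    def __init__(self, frames):
        self.frames = list(frames)
        self.sent = []

    def detect_preamble(self):
        # step 400: a real radio detects the synchronization preamble
        return bool(self.frames)

    def receive_command(self):
        return self.frames.pop(0)

    def send(self, reply):
        self.sent.append(reply)

def control_loop(radio, executor, make_reply):
    # Runs while the fake radio has traffic; a real device loops forever.
    while radio.detect_preamble():          # step 400
        command = radio.receive_command()   # step 405
        result = executor(command)          # step 410
        reply = make_reply(command, result)
        if reply is not None:
            radio.send(reply)               # step 420

radio = FakeRadio(["set_io 3 1"])
control_loop(radio, executor=lambda c: "ok",
             make_reply=lambda c, r: ("ack", c, r))
assert radio.sent == [("ack", "set_io 3 1", "ok")]
```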
- Fig. 9A is a simplified flowchart
- Fig. 9A also preferably comprises the following steps:
- a MIDI command is received from the computer 100 (step 430).
- MIDI command may comprise a command intended to be transmitted to the toy control device 130, may comprise an audio in or audio out command, or may comprise a general
- a MIDI command is sent to the computer 100 (step 440).
- the MIDI command may comprise a signal received from the toy control device 130, may comprise a response to a MIDI command previously received by the computer radio interface 110 from the computer 100, or may comprise a general command.
- the command contained in the MIDI command or in the received signal is executed (step 450).
- Executing the command may comprise, in the case of a received signal, reporting the command to the computer 100, whereupon the computer 100 may typically carry out any appropriate action under program control as, for example, changing a screen display or taking any other appropriate action in response to the received command.
- executing the command may comprise transmitting the command to the toy control device 130.
- Executing a MIDI command may also comprise switching audio output of the computer radio interface 110 between the secondary audio interface 230 and the radio transceiver 260. Normally the secondary audio interface 230 is directly connected to the audio interface 220, preserving the connection between the computer sound board and peripheral audio devices such as speakers, a microphone and a stereo system.
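The audio-output switching described above amounts to a two-way routing state. The class and command names in this sketch are illustrative assumptions; the patent only states that a MIDI command can divert output from the secondary audio interface 230 to the radio transceiver 260.

```python
class AudioRouter:
    """Hypothetical model of the CRI audio-output switch."""
    def __init__(self):
        # default pass-through preserves the sound board connection
        self.destination = "secondary_audio_interface"

    def handle_midi(self, command):
        # command names are assumed, not defined by the patent
        if command == "route_to_radio":
            self.destination = "radio_transceiver"
        elif command == "route_to_audio":
            self.destination = "secondary_audio_interface"

router = AudioRouter()
assert router.destination == "secondary_audio_interface"
router.handle_midi("route_to_radio")
assert router.destination == "radio_transceiver"
```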
- Reference is now made to Figs. 9B - 9N, all of which, taken together, comprise a simplified flowchart illustration of a preferred implementation of the method of Fig. 9A.
- Figs. 10A - 10C are simplified pictorial illustrations of a signal transmitted between the computer radio interface 110 and the toy control device 130 of Fig. 1A.
- Fig. 10A comprises a synchronization preamble.
- the duration T_SYNC of the synchronization preamble is preferably 0.500 milliseconds, being preferably substantially equally divided into on and off
- Fig. 10B comprises a signal representing a bit with value 0, while Fig.
- 10C comprises a signal representing a bit with value 1.
- Figs. 10B and 10C refer to the case where the
- each bit is assigned a predetermined duration T, which is the same for every bit.
- a frequency modulated carrier is transmitted, using the method of frequency modulation keying as is well known in the art.
- an "off" signal is typically less
- Receipt of an on signal as shown in Fig. 10B of duration between 0.01 * T and 0.40 * T is preferably taken to be a bit with value 0.
- Receipt of an on signal as shown in Fig. 10C of duration greater than 0.40 * T is preferably taken to be a bit with
- T has a value of 1.0 millisecond.
- the duration of the subsequent off signal is measured. The sum of the durations of the on signal and the off signal must be between 0.90 T and 1.10 T for the bit to be considered valid. Otherwise, the bit is considered invalid.
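The timing rules above (with T = 1.0 millisecond) translate directly into a small decoder: an on-pulse between 0.01 T and 0.40 T reads as a 0 bit, an on-pulse longer than 0.40 T reads as a 1 bit, and the bit is valid only when the on-plus-off duration falls between 0.90 T and 1.10 T. This is a numeric sketch of those rules only; the function name and None-for-invalid convention are assumptions.

```python
T = 1.0  # bit period in milliseconds, as stated above

def decode_bit(on_ms: float, off_ms: float):
    """Return 0 or 1 for a valid bit, or None for an invalid one."""
    total = on_ms + off_ms
    if not (0.90 * T <= total <= 1.10 * T):
        return None            # timing out of tolerance: bit invalid
    if 0.01 * T <= on_ms <= 0.40 * T:
        return 0               # short on-pulse: bit value 0
    if on_ms > 0.40 * T:
        return 1               # long on-pulse: bit value 1
    return None                # on-pulse too short to be a bit at all

assert decode_bit(0.25, 0.75) == 0
assert decode_bit(0.60, 0.40) == 1
assert decode_bit(0.25, 0.25) is None   # sum is 0.5 T: out of tolerance
```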
- Fig. 11 is a simplified flowchart illustration of a method for generating control instructions for the apparatus of Fig. 1A.
- the method of Fig. 11 preferably includes the following steps:
- a toy is selected (step 550). At least one command is selected, preferably from a plurality of commands associated with the selected toy (steps 560 - 580).
- a command may be entered by selecting, modifying, and creating a new binary command (step 585).
- selecting a command in steps 560 - 580 may include choosing a
- a control parameter may include, for example, a condition depending on a result of a previous command, the previous command being associated either with the selected toy
- a control parameter may also include an execution condition
- governing execution of a command such as, for example: a condition stating that a
- specified output is to occur based on a status of the toy, that is, if and only if a specified input is received; a condition stating that the command is to be performed at a specified time; a condition stating that performance of the command is to cease at a specified time;
- condition comprising a command modifier modifying execution of the command, such as, for example, to terminate execution of the command in a case where execution of the
- the command may comprise a command to cancel a previous command.
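The control parameters above (run only if a specified input is received, start at a specified time, cease at a specified time, cancel a previous command) can be sketched as a command record plus a guard function. All field and function names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToyCommand:
    # Illustrative record of a command and its execution conditions.
    action: str
    requires_input: Optional[str] = None   # run iff this input was seen
    start_at: Optional[float] = None       # earliest execution time
    stop_at: Optional[float] = None        # latest execution time
    cancels: Optional[str] = None          # action of a command to cancel

def runnable(cmd: ToyCommand, now: float, inputs: set) -> bool:
    """Evaluate the command's execution conditions at time `now`."""
    if cmd.requires_input is not None and cmd.requires_input not in inputs:
        return False
    if cmd.start_at is not None and now < cmd.start_at:
        return False
    if cmd.stop_at is not None and now > cmd.stop_at:
        return False
    return True

cmd = ToyCommand("wag_tail", requires_input="tilt_switch", start_at=10.0)
assert not runnable(cmd, now=5.0, inputs={"tilt_switch"})   # too early
assert runnable(cmd, now=12.0, inputs={"tilt_switch"})
assert not runnable(cmd, now=12.0, inputs=set())            # no input seen
```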
- the output of the method of Fig. 11 typically comprises one or more control instructions implementing the specified command, generated in step 590.
- the one or more control instructions are comprised in a command file.
- the command file is called from a driver program which typically determines which command is to be executed at a given point in time and then calls the command
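The driver-program arrangement above (control instructions stored in command files, invoked by a driver that decides which command runs at a given moment) can be sketched as follows. The JSON file format and time-based scheduling rule are assumptions made for illustration; the patent does not specify the command-file format.

```python
import json
import os
import tempfile

def run_command_file(path):
    # Assumed format: here a command file is simply a JSON document.
    with open(path) as f:
        return json.load(f)

def driver(schedule, now):
    """Pick the command file whose scheduled time has most recently
    passed and invoke it; scheduling rule is an assumption."""
    due = [(t, p) for t, p in schedule if t <= now]
    if not due:
        return None
    _, path = max(due)
    return run_command_file(path)

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "wave.cmd")
    with open(p, "w") as f:
        json.dump({"action": "wave"}, f)
    assert driver([(0.0, p)], now=1.0) == {"action": "wave"}
```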
- a user of the method of Fig. 11 performs steps 550 and 560 using a computer having a graphical user interface.
- Figs. 12A - 12C illustrate such a graphical user interface.
- Fig. 12A comprises a toy selection area 600, comprising a plurality of toy
- selection icons 610 each depicting a toy.
- Fig. 12A also typically comprises action buttons 620, typically comprising
- Fig. 12B depicts a command generator screen typically displayed after the
- Fig. 12B comprises an animation area 630, preferably comprising a depiction of the selected toy selection icon
- a text area 635 comprising text describing the selected toy.
- Fig. 12B also comprises a plurality of command category buttons 640, each of which allow the user to select a category of commands such as, for example:
- Fig. 12B also comprises a cancel button 645 to cancel command selection
- Fig. 12C comprises a command selection area 650, allowing the user to specify a specific command.
- a wide variety of commands may be specified, and the
- Fig. 12C also comprises a file name area 655, in which the user may
- FIG. 12C also comprises a cancel button 645, similar to the cancel button 645 of Fig. 12B.
- Fig. 12C also comprises a make button 660. When the user actuates the make button 660, the control instruction generator of Fig. 11 generates control instructions
- Fig. 12C also comprises a parameter selection area 665, in which the user may specify a parameter associated with the chosen command.
- the steps for programming the microcontrollers of the present invention include the use of a universal programmer, such as the Universal Programmer, type EXPRO 60/80, manufactured by Sunshine Electronics Co. Ltd., Taipei, Taiwan.
- Fig. 1C includes a description of a preferred set of predefined messages including a category termed "General commands".
- Other General Commands are defined by the following description:
- a computer transmits this command to verify that the radio channel is vacant. If another computer is already using this channel, it will respond with the Availability Response Command. If no response is received within 250 msec, the channel is deemed vacant.
- a computer transmits this command in response to an Availability Interrogation Command to announce that the radio channel is in use.
- a Toy transmits this command to declare its existence and receive in response a Channel Pair Selection Command designating the computer that will control it and the radio channels to use.
- a computer transmits this command in response to a Toy Availability Command to inform the toy of the radio channels to be used.
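The Toy Availability / Channel Pair Selection exchange above can be sketched as a small handler on the computer side: when a toy announces itself, the computer answers with a selection command naming the controlling computer and the channel pair to use. The message shapes (dictionaries with a "type" key) are assumptions for illustration only.

```python
def computer_on_message(msg, computer_addr, free_pairs):
    """Reply to a Toy Availability Command with a Channel Pair
    Selection Command, assigning the next free channel pair."""
    if msg["type"] == "toy_availability":
        if not free_pairs:
            return None                    # no information channel free
        pair = free_pairs.pop(0)
        return {"type": "channel_pair_selection",
                "computer": computer_addr,
                "channel_pair": pair}
    return None                            # other message types ignored

reply = computer_on_message({"type": "toy_availability"}, 0x01, [7])
assert reply == {"type": "channel_pair_selection",
                 "computer": 0x01,
                 "channel_pair": 7}
```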
- Figs. 13 and 14 illustrate block diagrams of a multiport multichannel implementation of the computer radio interface 110 of Fig. 1A.
- Fig. 13 illustrates the processing sub-unit of the computer interface that is implemented as an add-in board
- Fig. 14 is the RF transceiver which is a device external to the
- both sound and control commands may be transmitted via the MIDI connector 210 rather than
- the functions of the interfaces 210 and 220 between the computer radio interface 110 and the sound card 190 may, alternatively, be implemented as connections between the computer radio interface 110 to the serial and/or parallel ports of the computer 100, as shown in Figs. 25A - 25E and Figs 26A -26D, respectively.
- if it is desired to provide full duplex communication, each transceiver 260
- transceiver 260 (Fig. 4) which forms part of the toy control device 130 of Fig. 1A
- Figs. 15A - 15E showing a Multi-Channel Computer Radio Interface
- Figs. 24A - 24E showing a
- Fig. 16 is a simplified flowchart
- the CRI includes a conventional radio transceiver (260 of Fig. 4) which
- RY3 GB021 having 40 channels which are divided into
- channel pairs, 4 of which are designated as control channel pairs.
- one of the 4 control channel pairs is selected by
- the selected control channel pair i is monitored by a first transceiver (step 820) to detect the appearance of a
- step 830 selected from among the 16 such channel pairs provided over which game
- the channel pair selection command also indicates the identity of the selected information communication channel pair.
- a channel pair selection command is sent over the control channel pair to the new toy (step 840).
- a game program is then begun (step 850), using the selected information communication channel pair.
- the control channel pair is then free to receive
- transceiver availability table (step 852).
- transceiver which is not marked as busy. This transceiver is then assigned to the control channel i (step 858).
- Fig. 17 is a simplified flowchart illustration of a preferred method for implementing "select control channel pair" step 810 of Fig. 16. In Fig. 17, the four
- control channels are scanned. For each channel pair in which the noise level falls below a
- the computer sends an availability interrogation command (step 895) and waits (step 910) for a predetermined time period, such as 250 ms, for a response (steps 930 and 940). If no other computer responds, i.e. sends back an "availability
- the channel pair is deemed vacant. If the channel pair is found
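The vacancy test just described reduces to: transmit the interrogation, wait up to 250 ms for an Availability Response Command, and treat silence as vacancy. In this sketch the transmit and receive operations are injected as callables, which is an assumption made so the logic can be shown without radio hardware.

```python
def channel_is_vacant(send_interrogation, wait_for_response,
                      timeout_ms=250):
    """Send an Availability Interrogation Command on a candidate
    control channel pair; silence within the timeout means vacant."""
    send_interrogation()
    response = wait_for_response(timeout_ms)   # None on timeout
    return response is None

# silence within 250 ms: the channel pair is deemed vacant
assert channel_is_vacant(lambda: None, lambda t: None) is True
# another computer answers: the channel pair is in use
assert channel_is_vacant(lambda: None,
                         lambda t: "availability_response") is False
```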
- Fig. 19 is a self-explanatory flowchart illustration of a preferred method
- the toy control device sends a "toy availability command" (step 1160) which is a message advertising the toy's availability, on each control channel i in turn (steps
- in step 1190, the toy control device may begin receiving and executing game commands which the computer transmits over the
- the computer system is provided, in communication with a remote game server, as shown in Fig. 20.
- the remote game server 1250 is operative to serve to the computer 100 at least a portion of at least one toy-operating game, which operates one or more toys 1260.
- an entire game may be downloaded from the remote game server 1250.
- a new toy action script or new text files may be downloaded
- a first portion of the game may be received off-line whereas an
- computer 100 may be based on any suitable technology such as but not limited to ISDN; X.25; Frame-Relay; and Internet.
- the computerized device may be provided locally, i.e. adjacent to the toy, because all "intelligence" may be provided from a remote source.
- the computerized device may be less sophisticated than a personal computer, may lack a display monitor of its own, and may, for example, comprise a network computer 1270.
- Fig. 21 is a simplified flowchart illustration of the operation of the computer 100 or of the network computer 1270 of Fig. 20, when operating in
- Fig. 22 is a simplified flowchart illustration of the operation of the remote game server 1250 of Fig. 20.
- Fig. 23 is a semi-pictorial semi-block diagram illustration of a wireless
- the proximity detection subsystem may for example include a pair of ultrasound transducers 1520 and 1530 associated with the toy and computer respectively.
- the toy's ultrasound transducer 1520 typically broadcasts
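The ultrasound proximity arrangement of Fig. 23 lends itself to a simple time-of-flight calculation: with the toy's transducer 1520 broadcasting and the computer's transducer 1530 receiving, the one-way travel time of the pulse gives the distance. The speed of sound (about 343 m/s at room temperature) is a physical constant; the 2-metre proximity threshold is purely an assumed example value, not taken from the patent.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # approximate, at room temperature

def distance_m(travel_time_s: float) -> float:
    """One-way time of flight of the ultrasound pulse, in metres."""
    return SPEED_OF_SOUND_M_PER_S * travel_time_s

def in_proximity(travel_time_s: float, threshold_m: float = 2.0) -> bool:
    # threshold_m is an illustrative assumption
    return distance_m(travel_time_s) <= threshold_m

assert abs(distance_m(0.01) - 3.43) < 1e-9
assert in_proximity(0.005)          # about 1.7 m: within threshold
assert not in_proximity(0.02)       # about 6.9 m: out of range
```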
- Figs. 24A - 24E taken together, form a detailed electronic schematic diagram of a multi-channel implementation of the computer radio interface 110 of Fig. 3
- Figs. 25A - 25E taken together, form a detailed schematic illustration of a computer radio interface which connects to a serial port of a computer rather than to the sound board of the computer.
- Figs. 26A - 26D taken together, form a detailed schematic illustration of a computer radio interface which connects to a parallel port of a computer rather than to
- Figs. 27A - 27J are preferred self-explanatory flowchart illustrations of a
- Figs. 29A - 29I, taken together, form a detailed electronic schematic diagram of the multi-port multi-channel computer radio interface sub-unit of Fig. 14.
- Fig. 30 illustrates a further embodiment of the present invention which includes a combination of a Computer Radio Interface (CRI) and a Toy Control Device
- the combined unit 1610 controls a toy 1620 which is connected to the
- the toy means such as radio communication, using the computer radio interface 110.
- Fig 31 illustrates a simplified block diagram of the combined unit 1610.
- Figs. 32A, 32B and 32C taken together form a simplified schematic diagram of the EP900 EPLD chip (U9) of Fig. 28H.
- Figs. 33 - 43 illustrate embodiments of the toy system of Figs. 1 - 32C in which a computer-controlled toy system has a capacity for
- Fig. 33 is a simplified pictorial
- Computer 2200 on which a fanciful figure 2210 is displayed.
- Computer 2200 is
- an audio input device 2220 typically a microphone, through which computer 2200 may receive audio input
- an audio output device 2230 typically a speaker, through which computer 2200 may provide audio output
- FIGS. 34A and 34B are simplified pictorial illustrations of a toy-based fanciful figure interaction system
- Fig. 34C is a simplified pictorial illustration of a toy-based fanciful figure constructed
- shown in Figs. 34A and 34B is computer 2200, preferably configured with audio input
- in Fig. 34A, a toy 2240 is shown in wired communication with computer 2200 along wired connection 2250, while in Fig. 34B toy
- Audio input device 2220 is in communication with computer 2200 at any given time. Audio input device 2220
- audio output device 2230 may be replaced with or augmented by audio input
- toy 2240 is preferably configured with a control unit 2262, a power unit 2264, and one or more articulating appendages 2266.
- a user 2280 is also shown interacting with toy 2240. It is appreciated that any or all of the functionality of computer 2200 may be assembled with or otherwise incorporated in toy 2240.
- Fig. 35 is a simplified block diagram of a
- the system of Fig. 35 preferably comprises a control unit 2300, a speech input and recognition unit 2310 capable of receiving a speech input and identifying the words comprising the speech input, an action interface 2320 capable of receiving action
- a speech synthesis unit 2330 capable of producing audio speech
- unit 2310 may receive input from audio input device 2220 (Fig. 33).
- action interface 2320 may receive action inputs.
- Speech synthesis unit 2330
- Action control unit 2340 may control an action associated with fanciful figure 2210 (Fig. 33) or toy 2240 (Figs.
- the system of Fig. 35 also preferably comprises one or more sets
- phonemes 2350, one or more language sets 2360, each typically comprising one or more words in a known language such as English or fanciful words, a set 2370 of
- the system of Fig. 35 preferably comprises a clock 2400.
- a logical implementation of the various sets shown in Fig. 35 is described in greater detail hereinbelow with reference to Fig. 41.
- FIG. 36 is a simplified operational flow chart of a fanciful figure interaction system useful in describing the systems of Figs. 33,
- Typical operation begins (step 3430) with the
- a preferred method of performing step 3430 is
- speech recognition is then performed (step 3450). A preferred method of performing step 3450 is described in greater
- operation then continues with step 3440 (step 3460).
- step 3470 is described in greater detail hereinbelow with
- successfully identified speech is then checked for an
- Fig. 37 is a simplified operational flow
- Typical operation begins (step 3500) with selecting a term or action from action set 2390 (Fig. 35) in accordance with selection criteria (step 3510).
- the selection may be random or in accordance with a level of complexity or history of usage associated with an action.
- Clock 2400 (Fig. 35) may be used to advance the level of complexity over time.
- Association set 2390 (Fig. 35) is then searched for an
- the associated action is then performed (step 3540) with or without verbalizing the associated language, and operation continues with step 3450 (Fig. 36) (step 3550).
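The Fig. 37 selection logic (pick an action at random or by complexity level, look up its associated language in the association set, then perform it) can be sketched as below. The data shapes, the level-based filter, and the function names are all illustrative assumptions.

```python
import random

def select_action(actions, level=None, rng=random):
    """Step 3510: choose an action, optionally limited by a level of
    complexity; selection may be random, as stated above."""
    pool = [a for a in actions if level is None or a["level"] <= level]
    return rng.choice(pool)

def perform(action, associations):
    """Steps 3530-3540: look up the associated language (may be absent)
    and perform the action, with or without verbalizing."""
    phrase = associations.get(action["name"])   # may be None
    return action["name"], phrase

actions = [{"name": "wave", "level": 1}, {"name": "dance", "level": 3}]
associations = {"wave": "hello!"}
act = select_action(actions, level=1)
assert act["name"] == "wave"                    # only level-1 action fits
assert perform(act, associations) == ("wave", "hello!")
```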
- Fig. 38 is a simplified operational flow
- Typical operation begins (step 3560) with recording audio input typically comprising
- the audio input is typically received via audio input device 2220
- a data file in a volatile or non-volatile storage medium is
- Fig. 39 is a simplified operational flow chart of a preferred implementation of step 3470 of Fig. 36 in greater detail, constructed
- Typical operation begins (step 3610) with analyzing the file constructed in step 3590 of Fig. 38 for a first pause between speech elements, yielding a first speech element (step 3620). Speech recognition is then performed on the first speech element (step 3630). If
- the first speech element is a language identifier (step 3640) then the current language is
- if the first speech element is not a language identifier, speech recognition is
- the speech is then identified for known words in the current language (step 3660).
- if no known words are found (step 3670), another language is set as the current language (step 3680) and speech recognition is again performed on the rest of the file (step 3690). The speech is then identified for known words in the current language (step 3700). If no
- step 3710 If the word is identified in a known, learned, generated, or
- operation continues with step 3480 (Fig. 36) (step 3720).
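The Fig. 39 language-fallback idea (check the first speech element for a language identifier, otherwise try recognizing against each known language in turn until known words are found) can be sketched as follows. The recognizer is reduced to simple lexicon lookup, and all data shapes are assumptions; a real system would call a speech-recognition engine at steps 3630, 3650, and 3690.

```python
def recognize_utterance(first_element, rest, languages, lexicons,
                        current_language):
    """Return (language, known_words) or (None, []) if unrecognized."""
    # step 3640: an explicit language identifier switches language
    if first_element in languages:
        current_language = first_element
        words = rest
    else:
        words = [first_element] + rest
    # steps 3660-3700: try the current language first, then the others
    order = [current_language] + [l for l in languages
                                  if l != current_language]
    for lang in order:
        known = [w for w in words if w in lexicons[lang]]
        if known:
            return lang, known
    return None, []

lexicons = {"english": {"hello", "toy"}, "fanciful": {"blorp"}}
langs = ["english", "fanciful"]
assert recognize_utterance("blorp", [], langs, lexicons,
                           "english") == ("fanciful", ["blorp"])
assert recognize_utterance("english", ["hello"], langs, lexicons,
                           "fanciful") == ("english", ["hello"])
```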
- Fig. 40 is a simplified operational flow
- Typical operation begins (step 3730) with selecting a language which becomes the current language, at random or in accordance with selection criteria (step 3740).
- Association set 2390 (Fig. 35) is then searched for an association between language in language set 2360 (Fig. 35) and the term or action (step 3750).
- the associated action is then performed (step 3760) and operation continues with step 3450 (Fig. 36) (step
- Fig. 41 is a simplified block diagram of a
Abstract
Description
Claims
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL12164297 | 1997-08-27 | ||
IL12164297A IL121642A0 (en) | 1997-08-27 | 1997-08-27 | Interactive talking toy |
US09/062,499 US6290566B1 (en) | 1997-08-27 | 1998-04-17 | Interactive talking toy |
US62499 | 1998-04-17 | ||
PCT/IL1998/000406 WO1999010065A2 (en) | 1997-08-27 | 1998-08-25 | Interactive talking toy |
Publications (2)
Publication Number | Publication Date |
---|---|
EP0935492A2 true EP0935492A2 (en) | 1999-08-18 |
EP0935492A4 EP0935492A4 (en) | 1999-12-01 |
Family
ID=26323495
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP98940531A Withdrawn EP0935492A4 (en) | 1997-08-27 | 1998-08-25 | Interactive talking toy |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0935492A4 (en) |
AU (1) | AU8883498A (en) |
WO (1) | WO1999010065A2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2156557B1 (en) * | 1999-06-30 | 2002-02-16 | Onilco Innovacion Sa | Crawling doll with search and direction-change device. |
US6773344B1 (en) | 2000-03-16 | 2004-08-10 | Creator Ltd. | Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems |
FR2819906B1 (en) * | 2001-01-25 | 2003-12-05 | Berchet Groupe Soc | INTERACTION DEVICE WITH A MICROCOMPUTER |
FR2819907B1 (en) * | 2001-01-25 | 2003-03-28 | Berchet Groupe Soc | INTERACTION DEVICE WITH A MICROCOMPUTER |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4679789A (en) * | 1983-12-26 | 1987-07-14 | Kabushiki Kaisha Universal | Video game apparatus with automatic skill level adjustment |
EP0265438A1 (en) * | 1986-05-02 | 1988-05-04 | SIROTA, Vladimir | Toy |
US4857030A (en) * | 1987-02-06 | 1989-08-15 | Coleco Industries, Inc. | Conversing dolls |
US4923428A (en) * | 1988-05-05 | 1990-05-08 | Cal R & D, Inc. | Interactive talking toy |
US5021878A (en) * | 1989-09-20 | 1991-06-04 | Semborg-Recrob, Corp. | Animated character system with real-time control |
EP0527527B1 (en) * | 1991-08-09 | 1999-01-20 | Koninklijke Philips Electronics N.V. | Method and apparatus for manipulating pitch and duration of a physical audio signal |
US5281143A (en) * | 1992-05-08 | 1994-01-25 | Toy Biz, Inc. | Learning doll |
US5752880A (en) * | 1995-11-20 | 1998-05-19 | Creator Ltd. | Interactive doll |
-
1998
- 1998-08-25 EP EP98940531A patent/EP0935492A4/en not_active Withdrawn
- 1998-08-25 AU AU88834/98A patent/AU8883498A/en not_active Abandoned
- 1998-08-25 WO PCT/IL1998/000406 patent/WO1999010065A2/en not_active Application Discontinuation
Non-Patent Citations (4)
Title |
---|
FUJITA M ET AL: "An open architecture for robot entertainment" PROCEEDINGS OF THE FIRST INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS, PROCEEDINGS OF 1ST INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS, MARINA DEL REY, CA, USA, 5-8 FEB. 1997, pages 435-442, XP002117732 1997, New York, NY, USA, ACM, USA ISBN: 0-89791-877-0 * |
See also references of WO9910065A2 * |
SEKIGUCHI M ET AL: "BEHAVIOR CONTROL FOR A MOBILE ROBOT BY MULTI-HIERARCHICAL NEURAL NETWORK" PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, SCOTTSDALE, MAY 15 - 19, 1989, vol. 3, 15 May 1989 (1989-05-15), pages 1578-1583, XP000044339 INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS ISBN: 0-8186-1938-4 * |
TAYLOR D: "THREE WAYS TO GET A-LIFE" IEEE EXPERT, vol. 12, no. 4, 1 July 1997 (1997-07-01), pages 25-30, XP000720766 ISSN: 0885-9000 * |
Also Published As
Publication number | Publication date |
---|---|
WO1999010065A3 (en) | 1999-05-20 |
WO1999010065A2 (en) | 1999-03-04 |
EP0935492A4 (en) | 1999-12-01 |
AU8883498A (en) | 1999-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6290566B1 (en) | Interactive talking toy | |
US6206745B1 (en) | Programmable assembly toy | |
US6773322B2 (en) | Programmable assembly toy | |
US6022273A (en) | Interactive doll | |
US20020005787A1 (en) | Apparatus and methods for controlling household appliances | |
US20020107591A1 (en) | "controllable toy system operative in conjunction with a household audio entertainment player" | |
US6352478B1 (en) | Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites | |
WO1998053456A1 (en) | Apparatus and methods for controlling household appliances | |
JP5349860B2 (en) | PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE | |
EP0934105A1 (en) | Techniques and apparatus for entertainment sites, amusement parks and other information and/or entertainment dispensing sites | |
US20050154594A1 (en) | Method and apparatus of simulating and stimulating human speech and teaching humans how to talk | |
US20010021669A1 (en) | I*doll | |
JPH11511859A (en) | Educational and entertainment device with dynamic configuration and operation | |
KR20100044779A (en) | An audio animation system | |
JP2002169590A (en) | System and method for simulated conversation and information storage medium | |
US20090141905A1 (en) | Navigable audio-based virtual environment | |
EP1080352A1 (en) | Intelligent doll | |
WO1999010065A2 (en) | Interactive talking toy | |
CA2611635A1 (en) | Remote game device for dvd gaming systems | |
WO1998053567A1 (en) | Controllable toy operative with audio player | |
US20230306666A1 (en) | Sound Based Modification Of A Virtual Environment | |
Roden et al. | Toward mobile entertainment: A paradigm for narrative-based audio only games | |
WO2005038776A1 (en) | Voice controlled toy | |
RU2209651C2 (en) | Playing system | |
Spöhrer | A History of Disability and Voice-Enabled Gaming from the 1970s to Intelligent Personal Assistants |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 19990527 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 19991020 |
|
AK | Designated contracting states |
Kind code of ref document: A4 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE |
|
RIC1 | Information provided on ipc code assigned before grant |
Free format text: 6A 63H 1/00 A, 6G 06F 19/00 B, 6G 06F 15/18 B |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20020301 |